How to get the $n$th derivative of $e^{x^2/2}$
I want to calculate the $n$th derivative of $e^{x^2/2}$. It goes as follows:
$$
\frac{d}{dx} e^{x^2/2} = x e^{x^2/2} = P_1(x) e^{x^2/2}
$$
$$
\frac{d^n}{dx^n} e^{x^2/2} = \frac{d}{dx}\left(P_{n-1}(x) e^{x^2/2}\right) = \left(x P_{n-1}(x) + \frac{dP_{n-1}}{dx}\right)e^{x^2/2} = P_n(x) e^{x^2/2}
$$
So we get a recursive relation for $P_n$:
$$ P_n(x) = xP_{n-1} + \frac{dP_{n-1}}{dx}, \qquad P_0(x) = 1 \tag{1}
$$
My question is how to solve a recursive relation that involves both the function and its derivative. I know how to use a generating function for a recursion like $a_{n+1}=a_{n}+a_{n-1}$, but I am not sure how to solve $(1)$.
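For concreteness, here is a small sketch (assuming SymPy is available) of what I can currently do: just iterate $(1)$ symbolically to list the first few $P_n$, which of course does not give a closed form.

```python
# Iterate the recurrence (1): P_n = x*P_{n-1} + P_{n-1}', starting from P_0 = 1.
import sympy as sp

x = sp.symbols('x')

def P(n):
    p = sp.Integer(1)                       # P_0(x) = 1
    for _ in range(n):
        p = sp.expand(x*p + sp.diff(p, x))  # one step of the recurrence (1)
    return p

for n in range(5):
    print(n, P(n))
# 0 1
# 1 x
# 2 x**2 + 1
# 3 x**3 + 3*x
# 4 x**4 + 6*x**2 + 3
```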
functions derivatives polynomials recursion
I think we are looking at the Hermite polynomials (coincidence?) with imaginary argument, cf. Eqs. (30) in the link with $x \rightarrow i x/\sqrt{2}$.
– hbp
Oct 19 '15 at 23:03
5 Answers
As noted in the comments, this is a variant of the Hermite polynomials. Owing to subtle differences in convention, we adapt the standard derivation from Arfken.
Generating function
First
\begin{align}
\frac{d^n}{dx^n} e^{x^2/2}
&= \lim_{t\rightarrow 0} \frac{d^n}{dx^n} e^{(x+t)^2/2} \tag{1} \\
&= \lim_{t\rightarrow 0} \frac{d^n}{dt^n} e^{(x+t)^2/2} \\
&= e^{x^2/2} \lim_{t\rightarrow 0} \frac{d^n}{dt^n} e^{xt + t^2/2}.
\end{align}
This means that the polynomials $P_n(x)$ we are looking for are just the $n$th Taylor coefficients (in $t$) of $e^{xt+t^2/2}$, up to a factor of $n!$. In other words,
\begin{align}
e^{xt + t^2/2} = \sum_{n = 0}^\infty \frac{P_n(x)}{n!}\, t^n.
\tag{2}
\end{align}
The left-hand side is the exponential generating function of $P_n(x)$.
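As a sanity check, here is a minimal SymPy sketch (illustrative only, assuming SymPy is installed) showing that the $t$-Taylor coefficients of $e^{xt+t^2/2}$ reproduce $P_n(x)/n!$, with $P_n$ generated by the recurrence from the question.

```python
# Compare the Taylor coefficients of exp(x*t + t**2/2) in t with P_n(x)/n!,
# where P_n is generated by the recurrence P_n = x*P_{n-1} + P_{n-1}'.
import sympy as sp

x, t = sp.symbols('x t')

def P(n):
    p = sp.Integer(1)
    for _ in range(n):
        p = sp.expand(x*p + sp.diff(p, x))
    return p

N = 6
series = sp.series(sp.exp(x*t + t**2/2), t, 0, N).removeO()
for n in range(N):
    assert sp.simplify(series.coeff(t, n) - P(n)/sp.factorial(n)) == 0
print("generating function (2) matches the recurrence up to order", N - 1)
```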
Recurrence relations
If we differentiate (2) with respect to $t$,
$$
(x + t)\, e^{xt + t^2/2} = \sum_{n = 1}^\infty \frac{P_n(x)}{(n-1)!}\, t^{n-1}
= \sum_{n = 0}^\infty \frac{P_{n+1}(x)}{n!}\, t^n,
\tag{3}
$$
where the $n = 0$ term has dropped out upon differentiation, allowing the shift of index in the second equality.
Expanding the left-hand side,
\begin{align}
(x + t)\, e^{xt + t^2/2}
&= (x + t) \sum_{n = 0}^\infty \frac{ P_n(x) }{n!}\, t^n \\
&= \sum_{n = 0}^\infty \frac{ x\, P_n(x) }{n!}\, t^n
 + \sum_{n = 0}^\infty \frac{ P_n(x) }{n!}\, t^{n+1} \\
&= \sum_{n = 0}^\infty \frac{ x\, P_n(x) }{n!}\, t^n
 + \sum_{n = 1}^\infty \frac{ n\, P_{n-1}(x) }{n!}\, t^{n}.
\tag{4}
\end{align}
Comparing the coefficients of $t^n/n!$ in (3) and (4) yields
$$
P_{n+1}(x) = x\, P_n(x) + n\, P_{n-1}(x).
\tag{5}
$$
Similarly, by differentiating (2) with respect to $x$, we get
$$
P'_n(x) = n\, P_{n-1}(x),
\tag{6}
$$
which is the relation noted by Barry Cipra. Combining the two recovers the relation given by hermes.
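For the skeptical reader, here is a short SymPy sketch (illustrative only) that checks (5) and (6) for small $n$, with $P_n$ computed directly from the $n$th derivative.

```python
# Verify the recurrences (5) and (6) for small n, taking
# P_n(x) = exp(-x**2/2) * d^n/dx^n exp(x**2/2) directly.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x**2/2)
P = [sp.simplify(sp.diff(f, x, n) / f) for n in range(8)]

for n in range(1, 7):
    assert sp.expand(P[n+1] - (x*P[n] + n*P[n-1])) == 0   # relation (5)
    assert sp.expand(sp.diff(P[n], x) - n*P[n-1]) == 0    # relation (6)
print("relations (5) and (6) hold for n = 1..6")
```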
Explicit formula
An explicit formula is more readily derived from the generating function than from the recurrence relations:
\begin{align}
e^{xt+t^2/2}
&= e^{xt} \, e^{t^2/2} \\
&= \sum_{s = 0}^\infty \frac{(xt)^s}{s!} \sum_{m = 0}^\infty \frac{t^{2m}}{2^m \, m!} \\
&= \sum_{n = 0}^\infty
\left(
\sum_{m = 0}^{[n/2]} \frac{ n! \, x^{n-2m} }{ 2^m \, m! \, (n-2m)! }
\right) \frac{t^n}{n!},
\end{align}
where $[n/2]$ denotes the largest integer not exceeding $n/2$.
Comparing this to (2), we get
\begin{align}
P_n(x) = \sum_{m = 0}^{[n/2]} \frac{ n! \, x^{n-2m} }{ 2^m \, m! \, (n-2m)! }.
\end{align}
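A quick SymPy check of the closed form against the derivatives themselves (a sketch, assuming SymPy):

```python
# Compare the explicit sum for P_n with the n-th derivative of exp(x**2/2).
import sympy as sp

x = sp.symbols('x')

def P_explicit(n):
    return sum(sp.factorial(n) * x**(n - 2*m)
               / (2**m * sp.factorial(m) * sp.factorial(n - 2*m))
               for m in range(n // 2 + 1))

f = sp.exp(x**2/2)
for n in range(8):
    direct = sp.simplify(sp.diff(f, x, n) / f)
    assert sp.expand(direct - P_explicit(n)) == 0
print("explicit formula agrees with the derivatives for n = 0..7")
```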
Relations to the standard definitions
For reference, $P_n(x)$ is related to the standard Hermite polynomials as
\begin{align}
P_n(x)
&= (-i)^n \, \mathrm{He}_n(ix) \\
&= \frac{1}{(\sqrt{2} \, i)^n} \, H_n\left(\frac{ix}{\sqrt{2}}\right).
\end{align}
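And a check of the stated relation to the physicists' Hermite polynomials $H_n$, using SymPy's built-in `hermite` (again only a sketch):

```python
# Check P_n(x) = H_n(i*x/sqrt(2)) / (sqrt(2)*i)**n for small n.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x**2/2)

for n in range(8):
    direct = sp.simplify(sp.diff(f, x, n) / f)
    via_Hn = sp.hermite(n, sp.I*x/sp.sqrt(2)) / (sp.sqrt(2)*sp.I)**n
    assert sp.simplify(direct - sp.expand(via_Hn)) == 0
print("relation to the physicists' H_n verified for n = 0..7")
```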
Notes
I recently discovered that the problem is related to the question "Find an expression for the $n$-th derivative of $f(x)=e^{x^2}$", and the above solution is essentially the same as the one given there.
This is a good and clear answer! You deserve some points for it so you shouldn't have made it community wiki:)
– Winther
Oct 20 '15 at 0:22
@Winther. Thank you, sir. They have the option, and I like options :-)
– hbp
Oct 20 '15 at 0:38
Try proving (by induction) that
$$P_{n+1}=xP_n+nP_{n-1}$$
(Note, this is equivalent to proving that $P_n'=nP_{n-1}$.)
Here is another method which will work. You have shown by a simple argument that
$${P_i}(x) = {P'_{i - 1}}(x) + x\,{P_{i - 1}}(x), \qquad {P_0}(x) = 1, \qquad i = 1,2,\ldots,n \tag{1}$$
Now let's take a look at the derivatives of $f(x)$ directly
\begin{align}
f^{(0)}(x) &= e^{x^2/2} \\
f^{(1)}(x) &= x\, e^{x^2/2} \\
f^{(2)}(x) &= \left( 1 + x^2 \right) e^{x^2/2} \\
f^{(3)}(x) &= \left( 3x + x^3 \right) e^{x^2/2} \\
&\;\;\vdots \\
f^{(n)}(x) &= P_n(x)\, e^{x^2/2}
\tag{2}
\end{align}
The first few terms suggest that the formula $P'_i(x) = i\,P_{i - 1}(x)$ holds. We may prove this easily by using $(1)$ and induction. According to $(2)$, the formula is clearly true for $i=1$. Now suppose it is true for $i=k$, that is $P'_k(x) = k\,P_{k - 1}(x)$; we shall prove it is also true for $i=k+1$. For this purpose, consider the following:
\begin{align}
P_{k + 1}(x) &= P'_k(x) + x\,P_k(x) = k\,P_{k - 1}(x) + x\,P_k(x) \\
P'_{k + 1}(x) &= k\,P'_{k - 1}(x) + \left( x\,P_k(x) \right)' \\
&= k\,P'_{k - 1}(x) + P_k(x) + x\,P'_k(x) \\
&= k\,P'_{k - 1}(x) + P_k(x) + k\,x\,P_{k - 1}(x) \\
&= k\left( P'_{k - 1}(x) + x\,P_{k - 1}(x) \right) + P_k(x) \\
&= k\,P_k(x) + P_k(x) \\
&= \left( k + 1 \right)P_k(x)
\tag{3}
\end{align}
Now combining this new result with $(1)$ we can conclude
$${P_i}(x) = x\,{P_{i - 1}}(x) + (i-1)\,{P_{i - 2}}(x), \qquad {P_0}(x) = 1,\ \ {P_1}(x)=x, \qquad i = 2,3,\ldots,n \tag{4}$$
Finally, you can use $(4)$ as a recursive relation to derive $P_n(x)$ by the usual systematic procedure.
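For instance, a minimal sketch (plain Python, names are purely illustrative) that iterates $(4)$ on coefficient lists:

```python
# Build P_n from P_i = x*P_{i-1} + (i-1)*P_{i-2}; a polynomial is stored as a
# list of coefficients c[k] of x**k.
def P_coeffs(n):
    prev, cur = [1], [0, 1]                              # P_0 = 1, P_1 = x
    if n == 0:
        return prev
    for i in range(2, n + 1):
        shifted = [0] + cur                              # x * P_{i-1}
        scaled = [(i - 1) * c for c in prev] + [0, 0]    # (i-1) * P_{i-2}
        cur, prev = [a + b for a, b in zip(shifted, scaled)], cur
    return cur

print(P_coeffs(4))   # [3, 0, 6, 0, 1]  i.e.  P_4(x) = 3 + 6x^2 + x^4
```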
The induction is nice! Thank you.
– hbp
Oct 20 '15 at 23:59
@hbp: You're welcome. Thank you too: looking at my answer again after your comment, I noticed some typos in Eq. (4) and have just fixed them! :)
– H. R.
Dec 23 '15 at 17:02
You could try $P_n(x)=\sum^n_{i=0} a_{n,i}x^i$.
Then your equation $(1)$ becomes $\sum^n_{i=0} a_{n,i}x^i=x\sum^{n-1}_{i=0} a_{n-1,i}x^i+\sum^{n-1}_{i=0} i\,a_{n-1,i}x^{i-1}$.
Comparing coefficients of $x^i$ gives $a_{n,i}=a_{n-1,i-1}+(i+1)\,a_{n-1,i+1}$, with $a_{n-1,j}=0$ for $j<0$ or $j>n-1$.
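A brief sketch of that recurrence in plain Python (illustrative only), filling the triangle of coefficients $a_{n,i}$ row by row:

```python
# a_{n,i} = a_{n-1,i-1} + (i+1)*a_{n-1,i+1}, with out-of-range entries taken
# as 0; row n holds the coefficients of P_n in increasing powers of x.
def coefficient_rows(n_max):
    rows = [[1]]                                        # P_0 = 1
    for n in range(1, n_max + 1):
        prev = rows[-1]
        get = lambda j: prev[j] if 0 <= j < len(prev) else 0
        rows.append([get(i - 1) + (i + 1) * get(i + 1) for i in range(n + 1)])
    return rows

for n, row in enumerate(coefficient_rows(4)):
    print(n, row)
# last row: [3, 0, 6, 0, 1]   i.e.  P_4(x) = 3 + 6x^2 + x^4
```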
Does that help?
We have the multiplicative rule for differentiation: $$\frac{\partial (fg)}{\partial x} = \frac{\partial f}{\partial x} g + f \frac{\partial g }{\partial x}$$
Also the "chain rule", the rule for function composition: $$\frac{\partial (g(h))}{\partial x} = \frac{\partial g}{\partial h}\frac{\partial h}{\partial x}$$
So we let $g = \exp(h)$, $h = x^2/2$. We see that $\frac{\partial h}{\partial x} = x$ and $\frac{\partial g}{\partial h} = g$.
Now we let $f$ be a polynomial. Differentiating a polynomial and expressing the result is a simple linear operation: each exponent is reduced by one and each coefficient is multiplied by the old exponent, so we can write this as a matrix with the numbers $1,2,3,\dots$ on one of the superdiagonals. Multiplication by $x$ is also a matrix operation on the coefficients of a polynomial. So we can rewrite each differentiation step as a matrix-vector product and then investigate its properties in terms of matrix properties and linear algebra.
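Here is a small numerical sketch of that idea (NumPy assumed; the size $N$ is just illustrative). The matrix `X + D` below performs one step $P \mapsto xP + P'$ on a coefficient vector, i.e. one application of the question's recurrence $(1)$:

```python
# Represent one step P -> x*P + P' as a matrix acting on the coefficient
# vector c (c[k] = coefficient of x**k), truncated at degree N-1.
import numpy as np

N = 8
D = np.diag(np.arange(1.0, N), k=1)   # differentiation: c[k] <- (k+1)*c[k+1]
X = np.diag(np.ones(N - 1), k=-1)     # multiplication by x: shift coefficients up
step = X + D                          # one application of the recurrence

c = np.zeros(N)
c[0] = 1.0                            # start from P_0 = 1
for _ in range(4):
    c = step @ c
print(c)                              # [3. 0. 6. 0. 1. 0. 0. 0.]  ->  P_4
```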