Linear independent eigenvectors and eigenvalues
Let $T$ be a linear transformation from $V$ to $V$ over a field $F$, where $V$ has dimension $n$. Suppose $T$ has the maximum possible number of distinct eigenvalues, namely $n$. Show that there exists a basis of $V$ consisting of eigenvectors of $T$.
I know that if I let $v_1, \dots, v_r$ be eigenvectors belonging to distinct eigenvalues, then those vectors are linearly independent. Can I make a basis from these linearly independent vectors and prove that it spans $V$?
Also, what will the matrix of $T$ be with respect to this basis?
Thank you for any input!
linear-algebra matrices transformation eigenvalues-eigenvectors
If there are $n$ linearly independent vectors in $n$-dimensional space, then they must form a basis. To see what $T$ looks like, consider what $T x_k$ looks like in the basis of eigenvectors.
– copper.hat
Nov 20 '13 at 3:19
@Akaichan Do you still need help with this, or is copper.hat's comment enough?
– Git Gud
Nov 25 '13 at 21:41
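copper.hat's hint can be checked numerically. The sketch below uses a made-up $2 \times 2$ matrix $T$ (not from the question) with two distinct eigenvalues, and verifies that the coordinate vector of $T x_k$ in the eigenbasis is $\lambda_k$ times the $k$-th standard basis vector:

```python
import numpy as np

# Hypothetical illustration of the hint: the matrix T below is invented
# for the example; it has two distinct eigenvalues, 4 and 7.
T = np.array([[4.0, 1.0],
              [0.0, 7.0]])

eigenvalues, P = np.linalg.eig(T)   # columns of P are eigenvectors x_k

for k, lam in enumerate(eigenvalues):
    # Coordinates of T x_k relative to the eigenbasis: solve P c = T x_k.
    c = np.linalg.solve(P, T @ P[:, k])
    # Since T x_k = lam * x_k, its coordinate vector is lam * e_k.
    assert np.allclose(c, lam * np.eye(2)[:, k])
```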
edited Nov 20 '13 at 4:44
Mhenni Benghorbal
asked Nov 20 '13 at 3:10
Akaichan
1 Answer
By definition, a basis for $V$ is a linearly independent set of vectors in $V$ that spans $V$, and the dimension of a finite-dimensional vector space is the number of elements in a basis, so any basis for $V$ must contain exactly $n$ linearly independent vectors. We also know (see Theorem 5 on page 45 of Hoffman and Kunze's Linear Algebra) that every linearly independent subset of $V$ is part of a basis for $V$. You already know that the $n$ eigenvectors are linearly independent, so it follows that they form a basis for $V$.
To find the matrix of $T$ with respect to your ordered basis $\mathscr{B}$ of eigenvectors, we use the fact that the $i$th column of that matrix is $[Tv_i]_{\mathscr{B}}$, where $[\,\cdot\,]_{\mathscr{B}}$ denotes the coordinate matrix with respect to $\mathscr{B}$. We therefore compute
$$[Tv_i]_{\mathscr{B}} = [\lambda_i v_i]_{\mathscr{B}} = \begin{bmatrix} 0\\ \vdots\\ 0\\ \lambda_i\\ 0\\ \vdots\\ 0\end{bmatrix},$$
where $\lambda_i$ is the eigenvalue associated with the eigenvector $v_i$. Thus, the matrix of $T$ is a diagonal matrix with the eigenvalues on the diagonal, ordered to match the corresponding eigenvectors:
$$\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0\\
0 & \lambda_2 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots\\
0 & 0 & \cdots & \lambda_n\end{bmatrix}.$$
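To see this diagonalization concretely, here is a small numerical sketch; the $3 \times 3$ matrix $A$ is invented for illustration and is not from the question. With $n$ distinct eigenvalues, the matrix $P$ whose columns are the eigenvectors is invertible, and changing to the eigenbasis via $P^{-1} A P$ yields exactly the diagonal matrix of eigenvalues described above:

```python
import numpy as np

# Illustrative example (A is made up): a 3x3 matrix with three distinct
# eigenvalues (2, 3, 5), so its eigenvectors form a basis of R^3.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors

# Distinct eigenvalues => the eigenvectors are linearly independent,
# i.e. P is invertible.
assert abs(np.linalg.det(P)) > 1e-9

# In the eigenbasis, the transformation is represented by the diagonal
# matrix whose entries are the eigenvalues.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigenvalues))
```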
edited Nov 21 at 17:19
answered Nov 20 at 17:34
Maurice P