Linearly independent eigenvectors and eigenvalues











I have $T$, a linear transformation from $V$ to $V$ over a field $F$, where $V$ has dimension $n$. Suppose $T$ has the maximum number of distinct eigenvalues, namely $n$. Show that there exists a basis of $V$ consisting of eigenvectors of $T$.




I know that if I let $v_1, \dots, v_r$ be eigenvectors belonging to distinct eigenvalues, then those vectors are linearly independent. Can I make a basis from these linearly independent vectors and prove that it spans $V$?



Also, what will the matrix of $T$ be with respect to this basis?



Thank you for any input!










linear-algebra matrices transformation eigenvalues-eigenvectors






edited Nov 20 '13 at 4:44 by Mhenni Benghorbal
asked Nov 20 '13 at 3:10 by Akaichan








  • If there are $n$ linearly independent vectors in $n$-dimensional space, then they must form a basis. To see what $T$ looks like, consider what $T x_k$ looks like in the basis of eigenvectors. – copper.hat, Nov 20 '13 at 3:19












  • @Akaichan Do you still need help with this, or is Copper.Hat's comment enough? – Git Gud, Nov 25 '13 at 21:41














1 Answer






























By definition, a basis for $V$ is a linearly independent set of vectors in $V$ that spans $V$, and the dimension of a finite-dimensional vector space is the number of elements in any basis, so every basis for $V$ contains exactly $n$ vectors. We also know (see Theorem 5 on page 45 of Hoffman and Kunze's Linear Algebra) that every linearly independent subset of $V$ is part of a basis for $V.$ You already know that your $n$ eigenvectors are linearly independent, so they can be extended to a basis; but that basis has exactly $n$ elements, so the extension adds nothing, and the $n$ eigenvectors already form a basis for $V.$



To find the matrix of $T$ with respect to your ordered basis $\mathscr B$ of eigenvectors, we use the fact that the $i$th column of that matrix is $[Tv_i]_{\mathscr B}$, where $[\,\cdot\,]_{\mathscr B}$ denotes the coordinate matrix with respect to $\mathscr B.$ We therefore compute
$$[Tv_i]_{\mathscr B} = [\lambda_i v_i]_{\mathscr B} = \begin{bmatrix} 0\\ \vdots\\ 0\\ \lambda_i\\ 0\\ \vdots\\ 0\end{bmatrix},$$
where $\lambda_i$ is the eigenvalue associated with the eigenvector $v_i.$ Thus, the matrix of $T$ is diagonal, with the eigenvalues on the diagonal ordered to match the corresponding eigenvectors:
$$\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0\\
0 & \lambda_2 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots\\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}.$$
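
For a quick numerical sanity check of this diagonal form, here is a minimal NumPy sketch (the $3\times 3$ matrix `A` below is a hypothetical example chosen so that its eigenvalues $2, 3, 5$ are distinct); it verifies that $P^{-1}AP$ is diagonal when the columns of $P$ are eigenvectors of $A$:

    # Minimal sketch: a matrix with n distinct eigenvalues is diagonalized
    # by the basis formed from its eigenvectors.
    import numpy as np

    # Hypothetical example: upper triangular, so the eigenvalues 2, 3, 5
    # can be read off the diagonal and are distinct.
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 1.0],
                  [0.0, 0.0, 5.0]])

    eigenvalues, P = np.linalg.eig(A)  # column i of P is an eigenvector for eigenvalues[i]

    # Distinct eigenvalues => the eigenvectors are linearly independent,
    # so P has full rank and its columns form a basis.
    assert np.linalg.matrix_rank(P) == A.shape[0]

    # The matrix of T in the eigenvector basis is P^{-1} A P,
    # which should equal diag(eigenvalues).
    D = np.linalg.inv(P) @ A @ P
    assert np.allclose(D, np.diag(eigenvalues))
    print(np.round(D, 10))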





























edited Nov 21 at 17:19
answered Nov 20 at 17:34 by Maurice P





























