What conditions do we need on $A$ in order for the inverse of $AA^\top$ to exist?
I have seen proofs showing why the inverse of $AA^\top$ exists, but what are the conditions on $A$ for $(AA^\top)^{-1}$ to exist?



Additionally, how can we prove that the columns of $A^\top A$ are linearly independent if and only if the columns of $A$ are linearly independent?
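One way to build intuition before proving anything is to experiment numerically. The sketch below (using NumPy; the two matrices are made-up examples, not from the question) compares a wide matrix whose rows are linearly independent with one whose rows are dependent:

```python
import numpy as np

# A wide matrix with linearly independent rows.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])          # 2x3, rank 2
G = A @ A.T                               # 2x2 Gram matrix of the rows
print(np.linalg.matrix_rank(G))           # 2 -> full rank, invertible

# Second row is a multiple of the first: rows dependent.
B = np.array([[1.0, 0.0, 2.0],
              [2.0, 0.0, 4.0]])          # 2x3, rank 1
H = B @ B.T
print(np.linalg.matrix_rank(H))           # 1 -> singular, no inverse
```

Experiments like this suggest that invertibility of $AA^\top$ tracks the rank of $A$, which is exactly what the answer below establishes.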
  • Hi, welcome to MSE! In order to make your question conform to the standards of the site, I've edited your question to include MathJax formatting. If this is not what you intended, you can roll back the edit. For future questions, please look at the MathJax Tutorial. You can also press "edit" to play around with the changes I've made.
    – Theo Bendit, Dec 12 '18 at 23:52
  • Also, your first question is confusing. You've seen proofs showing why $(AA^\top)^{-1}$ exists, but you want to know what conditions we need on $A$ in order for this inverse to exist? Have you tried referring to the statement of the theorem you've seen proven?
    – Theo Bendit, Dec 12 '18 at 23:54
  • math.stackexchange.com/questions/1151491/…
    – T. Fo, Dec 13 '18 at 0:49
linear-algebra proof-explanation
edited Dec 12 '18 at 23:49 – Theo Bendit
asked Dec 12 '18 at 23:46 – Eva Eliakostas
1 Answer
There are different ways to prove the statement; here I use the singular value decomposition.



Note on notation: I take the matrix to be $A_{n \times m}$ with $n \geq m$, to be consistent with most of the literature. Note that this convention makes $A^{\intercal} A$ the candidate for invertibility (not $A A^{\intercal}$), which is the transpose of your question; you may take the matrix $A$ in your question to be the $A^{\intercal}$ used here.



Any $n \times m$ matrix $A$ with $n \geq m$ admits a reduced singular value decomposition,

\begin{equation}
A_{n \times m} = U_{n \times m} \Sigma_{m \times m} V_{m \times m}^{\intercal}
\end{equation}

where $\Sigma$ is a diagonal matrix whose diagonal entries (the singular values) are non-negative, $U$ and $V$ have orthonormal columns, that is $U^{\intercal} U = I_{m \times m}$ and $V^{\intercal} V = I_{m \times m}$, and $I$ denotes the identity matrix.

The singular values are all non-zero exactly when $A$ has full column rank, that is, when the columns of $A$ are linearly independent; if $A$ is rank-deficient, some diagonals of $\Sigma$ are zero.
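These properties of the reduced SVD can be checked numerically (a sketch with NumPy; the example matrix is an arbitrary full-column-rank choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])               # n=3, m=2, full column rank

# full_matrices=False gives the reduced SVD: U is 3x2, s has m entries.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# U and V have orthonormal columns.
print(np.allclose(U.T @ U, np.eye(2)))   # True
print(np.allclose(Vt @ Vt.T, np.eye(2))) # True

# All singular values non-zero <=> columns of A linearly independent.
print(np.all(s > 1e-12))                 # True
```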



Construct

\begin{equation}
G_{m \times m} = A^{\intercal} A = V_{m \times m} \Sigma_{m \times m}^{2} V_{m \times m}^{\intercal}
\end{equation}

This is an eigenvalue decomposition of the symmetric positive semi-definite matrix $G$, with eigenvalue matrix
\begin{equation}
\Lambda_{m \times m} = \Sigma_{m \times m}^{2}.
\end{equation}

If the columns of $A$ are linearly independent, the eigenvalues of $G$ (the diagonals of $\Lambda$) are all positive, so $G$ is full rank, hence invertible. Conversely, if the columns of $A$ are linearly dependent, some singular value is zero, so $\Lambda$ has a zero eigenvalue and $G$ is singular. This settles your second question: the columns of $A^{\intercal} A$ are linearly independent if and only if the columns of $A$ are.
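The identity $\Lambda = \Sigma^2$ can be verified directly (a sketch with NumPy; same arbitrary example matrix as above):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

G = A.T @ A                               # m x m Gram matrix
eigvals = np.sort(np.linalg.eigvalsh(G))  # eigenvalues, ascending
s = np.sort(np.linalg.svd(A, compute_uv=False))

# Eigenvalues of A^T A are the squared singular values of A.
print(np.allclose(eigvals, s**2))         # True
```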



Now, extending $U$ to a full orthogonal $n \times n$ matrix (the full SVD), construct
\begin{equation}
H_{n \times n} = A A^{\intercal} = U_{n \times n}
\begin{bmatrix}
\Sigma_{m \times m}^{2} & O_{m \times (n-m)} \\
O_{(n-m) \times m} & O_{(n-m) \times (n-m)}
\end{bmatrix}
U_{n \times n}^{\intercal}
\end{equation}

This is an eigenvalue decomposition of the symmetric positive semi-definite matrix $H$, with eigenvalue matrix

\begin{equation}
\Lambda_{n \times n} =
\begin{bmatrix}
\Sigma_{m \times m}^{2} & O_{m \times (n-m)} \\
O_{(n-m) \times m} & O_{(n-m) \times (n-m)}
\end{bmatrix}.
\end{equation}

If $n > m$, then $n - m$ of the eigenvalues of $H$ (the diagonals of $\Lambda_{n \times n}$) are zero, so $H$ is singular and not invertible. Conversely, if $H$ is invertible, then $\Lambda_{n \times n}$ has no zero eigenvalues, which forces $n = m$ with all diagonals of $\Sigma$ non-zero, i.e. $A$ square and full rank. Translated back to the notation of your question: $(A A^{\intercal})^{-1}$ exists exactly when the rows of $A$ are linearly independent (full row rank).
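The contrast between the two Gram matrices shows up immediately in a numerical check (a sketch with NumPy; the example matrix is an arbitrary tall, full-column-rank choice):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])               # n=3 > m=2

H = A @ A.T                               # 3x3, but rank at most m=2
print(np.linalg.matrix_rank(H))           # 2 -> singular, no inverse

# A^T A, by contrast, is 2x2 and invertible here,
# because the columns of A are linearly independent.
G = A.T @ A
print(np.linalg.matrix_rank(G))           # 2 -> full rank, invertible
```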
  • It seems a bit silly to say that it's "inconsistent with the literature" to ever have a matrix with more columns than rows.
    – Misha Lavrov, Dec 13 '18 at 1:27
answered Dec 13 '18 at 1:22 – Sia