Covariance matrices
Consider two matrices of random variables, $X$ and $W$, of the same dimensions. Now consider the product $X'W$.
I've been told that this matrix gives the covariances between the elements of $X$ and $W$, but this is not immediately apparent to me: you do get sums of products, yet these are not exactly covariances.
statistics
asked Dec 5 '18 at 2:44 by Student
Usually, the covariance matrix refers to the expectation of the outer product of a vector of random variables with itself, i.e. the $n \times n$ real-valued matrix $\mathbb{E}[\mathbf{X}\mathbf{X}^t]$ (for zero-mean $\mathbf{X}$), where $\mathbf{X} = [X_1,\dots,X_n]^t$ is an $n \times 1$ vector of random variables.
– zoidberg
Dec 5 '18 at 2:56
What you are referring to sounds like a $\textit{random}$ sample covariance matrix. See the Wishart matrix.
– zoidberg
Dec 5 '18 at 2:59
Thanks, but does the matrix I mentioned give some sort of indication about covariances? Intuitively I think it does, but I am not convinced.
– Student
Dec 5 '18 at 3:06
It does if the columns of $X$ and $W$ consist of i.i.d. random variables. Then the $(a,b)$ entry of $X^t W$ is $\sum_i X_{ia} W_{ib}$, which can be thought of as an (unnormalized, uncentered) sample covariance between random variables $X_a$ and $W_b$, where $X_a$ is distributed like the entries in column $a$ of $X$ and $W_b$ like the entries in column $b$ of $W$.
– zoidberg
Dec 5 '18 at 3:38
Not exactly the sample covariances though, right? One still needs to subtract the sample averages.
– Student
Dec 5 '18 at 8:57
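To make the last comment concrete, here is the standard identity relating $X^t W$ to the sample cross-covariance (added for illustration; it is not part of the original exchange). Suppose $X \in \mathbb{R}^{n \times p}$ and $W \in \mathbb{R}^{n \times q}$ hold $n$ observations in their rows, let $\bar{x}$, $\bar{w}$ be the vectors of column means, and let $\mathbf{1}$ be the all-ones vector in $\mathbb{R}^n$. Then
$$
\frac{1}{n}\,(X - \mathbf{1}\bar{x}^{\,t})^{t}(W - \mathbf{1}\bar{w}^{\,t})
  \;=\; \frac{1}{n}\,X^{t}W \;-\; \bar{x}\,\bar{w}^{\,t},
$$
so $\tfrac{1}{n}X^{t}W$ differs from the (biased) sample cross-covariance matrix exactly by the rank-one term $\bar{x}\,\bar{w}^{\,t}$; the two coincide when the columns of $X$ and $W$ are centered (or have mean zero), which is the point of the last two comments.
A minimal numerical check of this identity, sketched in numpy with made-up data (the dimensions and variable names are illustrative; nothing here comes from the thread):
```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 1000, 3, 4
X = rng.normal(size=(n, p))   # rows = observations, columns = variables X_1..X_p
W = rng.normal(size=(n, q))   # same rows, columns = variables W_1..W_q

xbar = X.mean(axis=0)         # sample averages of the columns of X
wbar = W.mean(axis=0)         # sample averages of the columns of W

raw      = X.T @ W / n                     # (1/n) X'W: plain sums of products
centered = (X - xbar).T @ (W - wbar) / n   # biased sample cross-covariance

# raw and centered differ exactly by the outer product of the column means
assert np.allclose(raw - np.outer(xbar, wbar), centered)
```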