Find the covariance of two jointly distributed RVs when you don't know the joint pmf
I have two correlated random variables, $X$ and $Y$, and I am trying to find their covariance. I know their individual expected values $\langle X \rangle$ and $\langle Y \rangle$ and their individual variances $\sigma^2_X$ and $\sigma^2_Y$. If the covariance is given as
$\operatorname{cov}(X,Y) = \langle XY \rangle - \langle X \rangle \langle Y \rangle$,
then I already know the second term, but I have no idea how to compute the first term. I don't know the joint pmf. Do I need the joint pmf to calculate $\langle XY \rangle$?
probability probability-theory covariance
asked Dec 18 '18 at 2:12 by SabrinaChoice
1 Answer
Consider two normal random variables $X$ and $Y$, each having mean zero and unit variance.
If they are independent, then their covariance is $$\mathbb{E}[XY]-\mathbb{E}[X]\mathbb{E}[Y]=\mathbb{E}[X]\mathbb{E}[Y]-\mathbb{E}[X]\mathbb{E}[Y]=0.$$
However, if $X=Y$ (i.e., the two variables are perfectly correlated), then $$\mathbb{E}[XY]-\mathbb{E}[X]\mathbb{E}[Y]=\mathbb{E}[X^{2}]-\mathbb{E}[X]\mathbb{E}[Y]=1.$$
What does this suggest?

answered Dec 18 '18 at 2:17 by parsiad
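One way to see this concretely is by simulation. Below is a minimal Python sketch, assuming numpy is available: both constructions have exactly the same standard-normal marginals, yet their sample covariances come out near $0$ and near $1$ respectively.

```python
# Minimal simulation sketch (assumes numpy): same marginals, different covariances.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Case 1: X and Y independent, each N(0, 1) -> covariance close to 0.
x = rng.standard_normal(n)
y = rng.standard_normal(n)
print(np.cov(x, y)[0, 1])

# Case 2: Y = X. Each marginal is still N(0, 1), but the pair is now
# perfectly correlated -> covariance close to 1.
print(np.cov(x, x)[0, 1])
```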
This suggests that $\mathbb{E}[X^2] = 1$, but I can't see much beyond that, sorry! – SabrinaChoice, Dec 18 '18 at 2:21

But what does it suggest about the covariance? Is it uniquely determined by mean and variance? – parsiad, Dec 18 '18 at 2:30

I'm not sure, because the bottom expression you wrote is the expression for the variance. So it states that the variance is equal to the variance. I'm not seeing how it can add any other constraint. – SabrinaChoice, Dec 18 '18 at 2:34

I was trying to get you to come to the conclusion that you have insufficient information to determine the covariance. – parsiad, Dec 18 '18 at 2:37

Put differently: the marginals do not characterize the joint distribution, because $p_X \times p_Y$ has, by definition, the same marginals as $p_{XY}$. – Clement C., Dec 18 '18 at 3:08
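Picking up on the last comment, the same phenomenon can be written down with explicit joint pmfs. The sketch below is again an illustration in Python assuming numpy; the two pmfs and the helper cov_from_joint are hypothetical examples, not taken from the thread. It builds two joint pmfs on $\{0,1\}\times\{0,1\}$ with identical uniform marginals whose covariances nevertheless differ, so the marginals alone cannot determine $\langle XY \rangle$.

```python
# Two joint pmfs on {0,1} x {0,1} with identical marginals but different covariances.
import numpy as np

def cov_from_joint(pmf):
    """Covariance computed directly from a joint pmf, given as a 2-D array
    whose (i, j) entry is P(X = i, Y = j) for i, j in {0, 1}."""
    vals = np.array([0, 1])
    px = pmf.sum(axis=1)      # marginal pmf of X
    py = pmf.sum(axis=0)      # marginal pmf of Y
    ex = vals @ px            # E[X]
    ey = vals @ py            # E[Y]
    exy = vals @ pmf @ vals   # E[XY] = sum over x, y of x * y * p(x, y)
    return exy - ex * ey

independent = np.full((2, 2), 0.25)   # X and Y independent fair coins
comonotone = np.array([[0.5, 0.0],
                       [0.0, 0.5]])   # Y = X, yet the marginals are unchanged

print(cov_from_joint(independent))    # 0.0
print(cov_from_joint(comonotone))     # 0.25
```

Both pmfs have $P(X=0)=P(X=1)=P(Y=0)=P(Y=1)=\tfrac12$, yet the covariances are $0$ and $\tfrac14$.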