Expectation, variance and conditional probability of combined discrete and continuous random variables
Category: Introductory Probability
I have seen many of the other questions with similar titles (there are quite a few!), but unfortunately I am struggling to apply the concepts and knowledge I have learned to different problems/examples.
Problem Info: A black and white screen has pixels which take the value $1$ with probability $p$ or the value $0$ with probability $1-p$, where $p$ is the value of a random variable $P$ that is uniform on $[0,1]$. $X_j$ is the value of pixel $j$, but we observe $Y_j = X_j + \text{noise}$ for each pixel. The noise is normal with mean $2$ and unit variance. The value $p$ and the noise are the same for every pixel, and conditioned on $P$, the $X_j$ are independent. The noise is independent of $P$ and of every $X_j$. $A$ is the event that the actual values of $X_1$ and $X_2$ (the first and second pixels) are both $0$.
I want to find $E[Y_j]$, $Var[Y_j]$, and $f_{P|A}(p)$ for $0 \leq p \leq 1$.
My attempt:
$$ E[Y_j] = E[X_j + \text{noise}]$$
$$ E[Y_j] = E[X_j]+E[\text{noise}] \quad \text{(by linearity)}$$
Now, from the problem info, it seems that, conditioned on $P = p$, $X_j$ is a Bernoulli random variable with mean $p$, so:
$$E[Y_j]= p + 2$$
Since $p$ is the realized value of $P$, which is uniformly distributed, it has an expected value of $\frac{1}{2}(a+b) = \frac{1}{2}$.
So, $E[Y_j] = 2.5$
Similarly, $$Var[Y_j]=Var[X_j] + Var[noise] $$
$$Var[Y_j] = p(1-p) + 1$$
$$Var[Y_j] = 0.25 + 1 = 1.25 $$
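These two numbers can be sanity-checked by simulation. The sketch below is my own check, not part of the original question: it draws one $P$ per simulated pixel, then $X_j \mid P = p \sim \text{Bernoulli}(p)$ and noise $\sim N(2, 1)$ (the seed and sample size are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# One realization of P per simulated pixel, uniform on [0, 1]
p = rng.uniform(0.0, 1.0, n)
# X_j | P = p is Bernoulli(p)
x = (rng.uniform(0.0, 1.0, n) < p).astype(float)
# Additive noise ~ Normal(mean = 2, variance = 1)
y = x + rng.normal(2.0, 1.0, n)

print(y.mean())  # should be close to 2.5
print(y.var())   # should be close to 1.25
```

Both sample statistics land near the claimed values, which at least confirms the final numbers even if the justification of the variance step needs care.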
The last part is where I am having the most trouble. I know that the probability of the first pixel being $0$ is $(1-p)$, and the same for the second. I am not quite sure if I have to use Bayes' rule, or even how to use it in this case. I know the answer will be a function of $p$.
$$f_{P|A}(p)= \frac{f_P(p)\,P(A \mid P=p)}{P(A)}$$
$$f_{P|A}(p)= \frac{1\cdot(1-p)(1-p)}{P(A)}$$
$$P(A) = \int_{0}^{1} f_P(p)\,P(A \mid P=p)\,dp$$
$$f_{P|A}(p)= \frac{(1-p)(1-p)}{\int_{0}^{1} 1 \cdot (1-p)^2\,dp}$$
$$f_{P|A}(p)= \frac{(1-p)^2}{\frac{1}{3}} = 3(1-p)^2$$
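The candidate posterior $3(1-p)^2$ can be checked empirically. The density integrates $E[P \mid A] = \int_0^1 p \cdot 3(1-p)^2\,dp = \frac{1}{4}$, so a simulation that conditions on $A$ should produce a conditional mean near $0.25$. This is my own sketch (numpy, arbitrary seed), not part of the question:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

p = rng.uniform(0.0, 1.0, n)
# Two conditionally independent Bernoulli(p) pixels per draw of P
x1 = rng.uniform(0.0, 1.0, n) < p
x2 = rng.uniform(0.0, 1.0, n) < p

# Keep only the draws where A = {X1 = 0 and X2 = 0} occurred
p_given_a = p[~x1 & ~x2]

print(p_given_a.mean())  # should be close to 0.25
```

About a third of the draws survive the conditioning, and their mean matches the $1/4$ implied by $f_{P|A}(p) = 3(1-p)^2$.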
Edits:
$$E[X_j] = E[E[X_j \mid P]] = E[P] = \frac{1}{2}$$
$$E[Y_j] = 0.5 + 2 = 2.5$$
$$Var(X_j) = E[Var(X_j|P)] + Var(E[X_j|P])$$
$$Var(Y_j) = Var(X_j) + Var(\text{noise})$$
$$Var(Y_j) = E[Var(X_j|P)] + Var(E[X_j|P]) + 1$$
$$Var(Y_j) = E[P(1-P)] + Var(P) + 1$$
$$Var(Y_j) = E[P]-E[P^2] + Var(P) + 1$$
$$Var(Y_j) = E[P]-(Var(P)+(E[P])^2) + Var(P) + 1$$
$$Var(Y_j) = E[P]-Var(P)-(E[P])^2 + Var(P) + 1$$
$$Var(Y_j) = 0.5-\frac{1}{12}-0.25+\frac{1}{12} + 1$$
$$Var(Y_j) = 1.25$$
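The fraction arithmetic in the law-of-total-variance derivation above can be verified exactly. This is a small check of my own using Python's `fractions` module; the variable names are mine:

```python
from fractions import Fraction

E_P = Fraction(1, 2)       # E[P] for P ~ Uniform[0, 1]
Var_P = Fraction(1, 12)    # Var(P)
E_P2 = Var_P + E_P**2      # E[P^2] = Var(P) + E[P]^2 = 1/3

E_var = E_P - E_P2         # E[Var(X_j | P)] = E[P(1 - P)] = 1/6
var_E = Var_P              # Var(E[X_j | P]) = Var(P) = 1/12

var_Y = E_var + var_E + 1  # add Var(noise) = 1
print(var_Y)               # 5/4
```

So $Var(Y_j) = \frac{1}{6} + \frac{1}{12} + 1 = \frac{5}{4}$, exactly the $1.25$ obtained above.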
Same answers as before, but with correct notation, I believe.
Still having a hard time trying to get an answer for $f_{P|A}(p)$, though.
Tags: probability, conditional-probability, variance, expected-value
$Y_j$ is the observed value of pixel $j$. $X_j$ is the actual value of pixel $j$. What does $f_{P|A}(p)$ mean?
– rrv
Nov 15 at 7:15
$f_{P|A}(p)$ is the conditional density of $P$ given the event $A$, where "the actual values of $X_1$ and $X_2$ (the first and second pixels) are $=0$."
– ChocolateChip
Nov 15 at 7:40
It is not clear what event you are talking about.
– rrv
Nov 15 at 13:13
I said in the question and above that $A$ is the event in which the values of the first and second pixels are each zero. Sorry, but I am not sure how else I can say that.
– ChocolateChip
Nov 16 at 5:21
Okay. One event is A = {first pixel = 0 and second pixel = 0}. What then is the other event?
– rrv
yesterday
asked Nov 15 at 6:50 by ChocolateChip · edited Nov 16 at 5:48