Random Variable Transformation normal/binomial
I have the following problem I cannot solve:
We have two independent random variables given by
$$
X \sim N_{(0,1)}
$$
and
$$
Y_p \sim B_{(1,p)}.
$$
Now I want to show that $Z_p \sim N_{(0,1)}$ for all $p \in (0,1)$, where
$$
Z_p = (-1)^{Y_p}\cdot X.
$$
Additionally there is a hint which says that
$$
\mathbb{P}(A) = \mathbb{P}(A \cap \{ Y_p = 1\}) + \mathbb{P}(A \cap \{ Y_p = 0\}).
$$
I know that a transformation is given by
$$
F_Z(z) = \mathbb{P}(Z_p \leq z) = \mathbb{P}((-1)^{Y_p}\cdot X \leq z),
$$
and from this I somehow get my boundaries for the integration. But I cannot figure out how this is done.
random-variables
You should add to your question that $X$ and $Y_p$ are independent.
– drhab
Dec 16 '18 at 16:10
asked Dec 16 '18 at 15:47, edited Dec 16 '18 at 17:44
RedCrayon
2 Answers
I presume that $X$ and $Y_p$ are independent.
For every Borel measurable $A$:
$$\begin{aligned}P\left(Z_{p}\in A\right) & =P\left(Z_{p}\in A\mid Y_{p}=0\right)P\left(Y_{p}=0\right)+P\left(Z_{p}\in A\mid Y_{p}=1\right)P\left(Y_{p}=1\right)\\
 & =P\left(X\in A\mid Y_{p}=0\right)\left(1-p\right)+P\left(-X\in A\mid Y_{p}=1\right)p\\
 & =P\left(X\in A\right)\left(1-p\right)+P\left(-X\in A\right)p\\
 & =P\left(X\in A\right)\left(1-p\right)+P\left(X\in A\right)p\\
 & =P\left(X\in A\right)
\end{aligned}$$
The third equality rests on independence, and the fourth on the fact that $X$ and $-X$ have the same distribution.
– drhab, answered Dec 16 '18 at 16:08
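As a quick empirical sanity check of this conditioning argument (a sketch, not part of the original answer; it uses only Python's standard-library `random` and `statistics` modules), one can simulate independent draws of $X$ and $Y_p$ and confirm that the sample mean and variance of $Z_p = (-1)^{Y_p} X$ match those of $N_{(0,1)}$ regardless of $p$:

```python
import random
import statistics

# Sketch: simulate Z_p = (-1)^{Y_p} * X with X ~ N(0,1) and
# Y_p ~ Bernoulli(p) drawn independently, then compare the sample
# moments of Z_p with the N(0,1) values (mean 0, variance 1).
random.seed(0)
p = 0.3          # any p in (0,1) should give the same result
n = 200_000
z = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)            # X ~ N(0,1)
    y = 1 if random.random() < p else 0   # Y_p ~ Bernoulli(p)
    z.append((-1) ** y * x)               # Z_p = (-1)^{Y_p} X

mean = statistics.fmean(z)
var = statistics.pvariance(z)
print(round(mean, 2), round(var, 2))  # both close to 0 and 1
```

Changing `p` leaves the empirical distribution of `z` unchanged, which is exactly the claim the proof establishes.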
Welcome to MSE! :-)
First note that if $X \sim \mathrm{N}(0,1)$, then also $-X \sim \mathrm{N}(0,1)$: the probability density function of $\mathrm{N}(0,1)$ is symmetric around $0$. To me this is an intuitive explanation of why the statement is true. However, it only works if $X$ and $Y_p$ are independent, so I will assume that in the following.
Now let's dig into the details. Let $A \subseteq \mathbb{R}$ be measurable. Then
\begin{align*}\mathbb{P}(Z_p \in A) &= \mathbb{P}(\{Z_p \in A\} \cap \{Y_p = 1\}) + \mathbb{P}(\{Z_p \in A\} \cap \{Y_p = 0\}) \\ &= \mathbb{P}(\{-X \in A\} \cap \{Y_p = 1\}) + \mathbb{P}(\{X \in A\} \cap \{Y_p = 0\}).\end{align*}
The last step holds because, once we fix $Y_p$, we know whether $Z_p = -X$ or $Z_p = X$. Since $X$ and $Y_p$ are independent, we obtain
\begin{align*}\mathbb{P}(\{-X \in A\} \cap \{Y_p = 1\}) &= p\,\mathbb{P}(-X \in A), \\ \mathbb{P}(\{X \in A\} \cap \{Y_p = 0\}) &= (1-p)\,\mathbb{P}(X \in A).\end{align*}
The symmetry of $\mathrm{N}(0,1)$ implies that $\mathbb{P}(X \in A) = \mathbb{P}(-X \in A)$. Applying this to the first displayed formula, we obtain
$$\mathbb{P}(Z_p \in A) = (p + 1 - p)\,\mathbb{P}(X \in A) = \mathbb{P}(X \in A) = \mathrm{N}(0,1)(A).$$
Hence $Z_p \sim \mathrm{N}(0,1)$.
– Jonas, answered Dec 16 '18 at 16:08
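The CDF version of this mixture argument, which is what the question's $F_Z(z)$ approach amounts to, can also be checked numerically. Below is a sketch (not part of the original answer) that evaluates $(1-p)\,\Phi(z) + p\,(1-\Phi(-z))$ and confirms it equals $\Phi(z)$ for every $p$, using the standard-library `math.erf` to compute $\Phi$:

```python
import math

def phi(z: float) -> float:
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Decomposing on Y_p gives
#   P(Z_p <= z) = (1-p)*P(X <= z) + p*P(-X <= z)
#               = (1-p)*phi(z)   + p*(1 - phi(-z)),
# and symmetry (1 - phi(-z) = phi(z)) collapses the mixture to
# phi(z) for every p, matching the answer's conclusion.
for p in (0.1, 0.5, 0.9):
    for z in (-2.0, -0.5, 0.0, 1.3):
        mixture = (1 - p) * phi(z) + p * (1.0 - phi(-z))
        assert abs(mixture - phi(z)) < 1e-12
print("mixture equals phi(z) for all tested p and z")
```

This also shows where the question's "boundaries for the integration" come from: the $P(-X \le z)$ term is an integral of the density from $-z$ to $\infty$, which symmetry converts back into $\Phi(z)$.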