Calculating the expectation of a sum of dependent random variables














Let $(X_i)_{i=1}^m$ be a sequence of i.i.d. Bernoulli random variables such that $\Pr(X_i=1)=p<0.5$ and $\Pr(X_i=0)=1-p$. Let $(Y_i)_{i=1}^m$ be defined as follows: $Y_1=X_1$, and for $2\leq i\leq m$
$$
Y_i = \begin{cases}
1, & \text{if } p\left(1-\frac{1}{i-1}\sum_{j=1}^{i-1}Y_j\right)+(1-p)\frac{1}{i-1}\sum_{j=1}^{i-1}Y_j<\frac{1}{i}\sum_{j=1}^iX_j\\
0, & \text{otherwise}
\end{cases}
$$

Finally, define
$$
Z_i=\begin{cases}
Y_i, & \text{w.p. } 1-p\\
1-Y_i, & \text{w.p. } p
\end{cases}
$$

for $i=1,2,\ldots,m$. I want to calculate, or get an upper bound on, the expectation $\mathbb{E}\left(\sum_{i=1}^mZ_i\right)$ and the variance $\mathrm{Var}\left(\sum_{i=1}^mZ_i\right)$.



For any $i$, we have $\mathbb{E}Z_i = p+(1-2p)\mathbb{E}Y_i$, so a trivial upper bound is
$$
\mathbb{E}\left(\sum_{i=1}^mZ_i\right) = mp+(1-2p)\sum_{i=1}^m\mathbb{E}Y_i \leq mp+m(1-2p) = m(1-p),
$$

because $\mathbb{E}Y_i\leq 1$. Unfortunately, trying to calculate $\mathbb{E}Y_i$ explicitly quickly becomes quite involved. Is there any way to get better bounds? Numerical calculation of the means of the random variables $Y_i$ (as well as analytical calculation of the first few) suggests that $\mathbb{E}Y_i\leq \mathbb{E}Y_1 = p$ for every $i$, which would give
$$
\mathbb{E}\left(\sum_{i=1}^mZ_i\right)\leq 2mp(1-p).
$$

However, it is not clear whether $\mathbb{E}Y_i\leq \mathbb{E}Y_1$ is indeed true.
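For concreteness, here is a minimal Monte Carlo sketch of the process (pure Python; the values of `m`, `p`, and `n_trials` are illustrative choices, and the coin defining $Z_i$ is assumed independent of everything else, which the definition above leaves implicit):

```python
import random

def z_sum_once(m, p, rng):
    """One realization of sum_{i=1}^m Z_i for the process defined above."""
    x_sum = 0  # X_1 + ... + X_i
    y_sum = 0  # Y_1 + ... + Y_{i-1}
    z_sum = 0
    for i in range(1, m + 1):
        x = 1 if rng.random() < p else 0          # X_i ~ Bernoulli(p)
        x_sum += x
        if i == 1:
            yi = x                                # Y_1 = X_1
        else:
            ybar = y_sum / (i - 1)                # fraction of 1's among Y_1..Y_{i-1}
            lhs = p * (1 - ybar) + (1 - p) * ybar # binary convolution of p and ybar
            yi = 1 if lhs < x_sum / i else 0
        y_sum += yi
        # Z_i = Y_i w.p. 1-p, and 1-Y_i w.p. p (flip assumed independent)
        z_sum += (1 - yi) if rng.random() < p else yi
    return z_sum

rng = random.Random(0)
m, p, n_trials = 100, 0.15, 20_000
est = sum(z_sum_once(m, p, rng) for _ in range(n_trials)) / n_trials
print(f"estimate of E[sum Z_i]: {est:.2f}")
print(f"trivial bound m(1-p):   {m * (1 - p):.2f}")
print(f"conjectured 2mp(1-p):   {2 * m * p * (1 - p):.2f}")
```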










Tags: probability, probability-theory






asked Dec 10 '18 at 14:42, edited Dec 11 '18 at 12:21
Mike_D

  • So the actual question here is to compute (or bound) $E[Y_i]$; what follows is pretty irrelevant, right?
    – leonbloy
    Dec 10 '18 at 16:30










  • @leonbloy Yes, for calculating the mean of $\sum_iZ_i$ (I'm also curious about the variance) it suffices to compute or bound $E[Y_i]$ for every $i$.
    – Mike_D
    Dec 10 '18 at 16:33












  • This looks interesting. Could you elaborate on the interpretation of the success criterion for $Y_i$, the $p(1-\frac{Y_{i-1}}{i-1})+(1-p)\frac{Y_{i-1}}{i-1}<\frac{1}{i}\sum_{j=1}^iX_j$? Does it come from a geometric or combinatorial scheme?
    – Lee David Chung Lin
    Dec 10 '18 at 16:56










  • @LeeDavidChungLin Yes. First note that I've fixed a typo in this condition. The meaning of the condition is: if the relative number of 1's up to time $i$ in the sequence $(X_1,\ldots,X_i)$ is greater than the binary convolution of the parameter $p$ with the relative number of 1's up to time $i-1$ in the sequence $(Y_1,\ldots,Y_{i-1})$, then $Y_i=1$. Note that $\sum_iZ_i$ is in fact just $\mathrm{Binomial}(\sum_{i=1}^mY_i,1-p)+\mathrm{Binomial}(m-\sum_{i=1}^mY_i,p)$, whose expectation, given $\sum_{i=1}^mY_i$, is exactly $m$ times the left-hand side of the success criterion for $Y_m$ (spelled out below the comments).
    – Mike_D
    Dec 10 '18 at 17:11












  • Oh, binary convolution, I see. Thanks for clarifying.
    – Lee David Chung Lin
    Dec 10 '18 at 17:13
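To spell out the Binomial decomposition claimed in the comment above (a restatement; it assumes, as the comment's claim requires, that the coins defining the $Z_i$ are independent of the $Y_i$'s): writing $k=\sum_{i=1}^m Y_i$ and conditioning on $k$,
$$
\mathbb{E}\left[\sum_{i=1}^m Z_i \,\middle|\, \sum_{i=1}^m Y_i = k\right]
= k(1-p)+(m-k)p
= m\left[p\left(1-\frac{k}{m}\right)+(1-p)\frac{k}{m}\right],
$$
i.e. $m$ times the binary convolution of $p$ with the empirical mean of the $Y_i$'s.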


















1 Answer

Not an answer. From simulations, it seems that $E[Y_i]$ has a surprisingly wild (chaotic?) behaviour; we should not expect to get a closed formula for it, nor even attempt to prove it is monotone. Here's a graph for $p=0.15$ and $p=0.19$.



Now, in case you're thinking "statistical noise"... it doesn't seem to be. If you look closely, I have drawn two graphs (orange and blue lines) for $p=0.15$, from different runs ($10^6$ trials each); they are almost indistinguishable (look around $i=85$).



What seems to be true (as already conjectured) is that $E[Y_i]\le p$ for $p < \frac12$, and $E[Y_i]\ge p$ for $p > \frac12$.



[Figure: empirical $E[Y_i]$ as a function of $i$, for $p=0.15$ (two independent runs) and $p=0.19$.]
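For reference, here is a sketch of the kind of simulation that could produce such a curve (a reconstruction, not the actual code behind the figure; `m`, `p`, and `n_runs` are illustrative and far smaller than the $10^6$ runs mentioned above):

```python
import random

def run_Y(m, p, rng):
    """One realization of (Y_1, ..., Y_m) as defined in the question."""
    ys = []
    x_sum = 0  # X_1 + ... + X_i
    y_sum = 0  # Y_1 + ... + Y_{i-1}
    for i in range(1, m + 1):
        x = 1 if rng.random() < p else 0
        x_sum += x
        if i == 1:
            yi = x                                 # Y_1 = X_1
        else:
            ybar = y_sum / (i - 1)
            yi = 1 if p * (1 - ybar) + (1 - p) * ybar < x_sum / i else 0
        ys.append(yi)
        y_sum += yi
    return ys

rng = random.Random(1)
m, p, n_runs = 100, 0.15, 20_000
counts = [0] * m
for _ in range(n_runs):
    for i, yi in enumerate(run_Y(m, p, rng)):
        counts[i] += yi
means = [c / n_runs for c in counts]  # empirical E[Y_i], i = 1..m
print(max(means) <= p)                # sanity check of the conjecture E[Y_i] <= p
# import matplotlib.pyplot as plt; plt.plot(range(1, m + 1), means); plt.show()
```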






answered Dec 11 '18 at 16:36, edited Dec 11 '18 at 20:47
leonbloy

  • Thanks! Indeed this is exactly the behavior that I noticed in my simulations too. The question is then whether it can be shown that $E(Y_i)\leq p$ for $p<0.5$.
    – Mike_D
    Dec 11 '18 at 19:04












