If $E[X|Y]=Y$ almost surely and $E[Y|X]=X$ almost surely then $X=Y$ almost surely












Assume that $X$ and $Y$ are two random variables such that $Y=E[X|Y]$ almost surely and $X= E[Y|X]$ almost surely. Prove that $X=Y$ almost surely.




The hint I was given is to evaluate:
$$E[X-Y;X>a,Y\leq a] + E[X-Y;X\leq a,Y\leq a]$$



which I can write as: $$\int_A(X-Y)\,dP +\int_B(X-Y)\,dP$$ where $A=\{X>a,\ Y\leq a\}$ and $B=\{X\leq a,\ Y\leq a\}$.



But I need some more hints.
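(Editorial orientation note: the two events $\{X>a,\ Y\leq a\}$ and $\{X\leq a,\ Y\leq a\}$ are disjoint and their union is $\{Y\leq a\}$, so the quantity to evaluate is just $E[X-Y;\,Y\leq a]$; the first answer below starts from exactly this observation.)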










Tags: probability-theory, conditional-expectation






asked Feb 7 '14 at 1:45 by Peter, edited Nov 9 '15 at 7:10 by Did






















          3 Answers












Simply follow the hint... First note that, since $E(X\mid Y)=Y$ almost surely, for every $c$, $$E(X-Y;Y\leqslant c)=E(E(X\mid Y)-Y;Y\leqslant c)=0,$$ and that, decomposing the event $[Y\leqslant c]$ into the disjoint union of the events $[X>c,Y\leqslant c]$ and $[X\leqslant c,Y\leqslant c]$, one has $$E(X-Y;Y\leqslant c)=U_c+E(X-Y;X\leqslant c,Y\leqslant c),$$ with $$U_c=E(X-Y;X>c,Y\leqslant c).$$ Since $U_c\geqslant0$ (the integrand $X-Y$ is positive on the event $[X>c\geqslant Y]$), this shows that $$E(X-Y;X\leqslant c,Y\leqslant c)\leqslant 0.$$ Exchanging $X$ and $Y$ and following the same steps, using the hypothesis that $E(Y\mid X)=X$ almost surely instead of $E(X\mid Y)=Y$ almost surely, one gets $$E(Y-X;X\leqslant c,Y\leqslant c)\leqslant0,$$ which, combined with the previous inequality, gives $$E(Y-X;X\leqslant c,Y\leqslant c)=0,$$ and this, coming back to the first decomposition above, yields $U_c=0$, that is, $$E(X-Y;X>c\geqslant Y)=0.$$ This is the expectation of a nonnegative random variable, hence $(X-Y)\mathbf 1_{X>c\geqslant Y}=0$ almost surely, which can only happen if the event $[X>c\geqslant Y]$ has probability zero. Now, $$[X>Y]=\bigcup_{c\in\mathbb Q}[X>c\geqslant Y],$$ hence all this proves that $P(X>Y)=0$. By symmetry, $P(Y>X)=0$ and we are done.






answered community wiki (2 revs: Did), last edited Nov 9 '15 at 14:15; score 17
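(A note on the first equality above, which is also asked about in the comments below; this expansion is an editorial sketch, not part of the original answer. Since the event $[Y\leqslant c]$ belongs to $\sigma(Y)$, the defining property of conditional expectation gives $$E[X;Y\leqslant c]=E[E(X\mid Y);Y\leqslant c],$$ and subtracting $E[Y;Y\leqslant c]$ from both sides yields $$E(X-Y;Y\leqslant c)=E(E(X\mid Y)-Y;Y\leqslant c)=0,$$ the last equality because $E(X\mid Y)=Y$ almost surely. All expectations here are finite, since $X$ and $Y$ must be integrable for the conditional expectations in the hypotheses to be defined.)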
– Did (Nov 9 '15 at 7:58): If one assumes furthermore that $X$ and $Y$ are square integrable, the $L^2$ proof in the other answer (already on several other pages of the site) is simpler.












– Nate Eldredge (Nov 9 '15 at 14:09): Is this question a duplicate, then? We should mark it as such.










– Did (Nov 9 '15 at 14:10): @NateEldredge If the question were assuming square integrability, it would be a duplicate. The general version (assuming only integrability) might also be a duplicate, but I am not sure (and I said nothing about that).












– Nate Eldredge (Nov 9 '15 at 14:14): Can you give a link to the previous instance(s) of the $L^2$ case?






– bunny (Dec 8 '17 at 19:55): Can you explain why $E[X-Y;Y\le c]=E[E[X|Y]-Y;Y\le c]$?




















If $X,Y$ are square-integrable we can give a quick proof.

Consider the random variable $(X-Y)^2 = X^2 - 2XY + Y^2$. Let's compute its expectation by conditioning on $X$. We have
$$\begin{align*} E[(X-Y)^2] &= E[E[(X-Y)^2 \mid X]] \\
&= E[E[X^2 - 2XY + Y^2 \mid X]] \\
&= E[X^2 - 2 X E[Y \mid X] + E[Y^2 \mid X]] \\
&= E[X^2 - 2 X^2 + E[Y^2 \mid X]] \\
&= E[-X^2 + Y^2].\end{align*}$$
If we condition on $Y$ instead we get $E[(X-Y)^2] = E[X^2 - Y^2]$. Comparing these, we see that $E[(X-Y)^2] = -E[(X-Y)^2]$, i.e. $E[(X-Y)^2]=0$. This means $X=Y$ almost surely.

Unfortunately I don't quite see a way to handle the case where $X,Y$ are merely integrable, since in that case some of the expectations used above may be undefined.

Acknowledgement of priority. After writing this I found (thanks to Did) that essentially the same proof was given by Michael Hardy in Conditional expectation and almost sure equality.






answered Nov 9 '15 at 7:30 by Nate Eldredge, edited Apr 13 '17 at 12:21; score 6
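(Editorial aside: here is a quick Monte Carlo sanity check of the intermediate identity $E[(X-Y)^2]=E[Y^2-X^2]$, which uses only the hypothesis $E[Y\mid X]=X$. The concrete choice $Y=X+\varepsilon$, with centered noise $\varepsilon$ independent of $X$, is a made-up example satisfying that single hypothesis; this is a sketch for illustration, not part of the answer above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Hypothetical example with E[Y | X] = X: take Y = X + eps,
# where eps is centered noise independent of X.
X = rng.normal(size=n)
eps = rng.normal(scale=0.5, size=n)
Y = X + eps

# Identity obtained above by conditioning on X: E[(X-Y)^2] = E[Y^2 - X^2].
lhs = ((X - Y) ** 2).mean()     # equals E[eps^2] = 0.25 in this example
rhs = (Y ** 2 - X ** 2).mean()  # equals E[2*X*eps + eps^2] = 0.25 as well
print(lhs, rhs)
```

No simulation can exhibit both hypotheses with $X\neq Y$, of course: the theorem says such a pair does not exist, so the check only illustrates the one-sided computation.)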
– user428487 (Mar 21 '18 at 3:50): Sorry if this is a really stupid question, but why is $E[(X-Y)^2] = E[E[(X-Y)^2 \mid X]]$?






– Nate Eldredge (Mar 21 '18 at 4:32): @user428487: Basic property of conditional expectation: $E[E[Z \mid \mathcal{F}]] = E[Z]$. You should be able to see it as an immediate consequence of whatever you take as the definition.










– user428487 (Mar 21 '18 at 10:55): I can see that $E[(X-Y)^2] = 0 \implies X = Y$ a.s. is a fact used in lots of places; would you mind pointing me to a proof?






– Nate Eldredge (Mar 21 '18 at 13:38): @user428487: It's the fact that if $Z \ge 0$ and $E[Z] = 0$ then $Z=0$ a.s. It's very basic measure theory and probably left as an exercise in most books. One way to see it: for any $n$ you have $P(Z \ge 1/n) = 0$ by Markov's inequality, so by countable additivity $P(Z > 0) = P\bigl(\bigcup_n \{Z \ge 1/n\}\bigr) = 0$.




















Let $h:\mathbb R\to\mathbb R$ be bounded and strictly increasing. (For example, $h(x)=1/(1+e^{-x})$.) Since $X$ and $Y$ are integrable and $h$ is bounded, the random variable $Z:=(X-Y)(h(X)-h(Y))$ is integrable, with expectation
$$
E[Xh(X)-Yh(X)-Xh(Y)+Yh(Y)].\tag1
$$
But $E[Yh(X)]=E\left[E(Yh(X)\mid X)\right]=E[h(X)E(Y\mid X)]=E[Xh(X)]$ and similarly $E[Xh(Y)]=E[Yh(Y)]$. Plugging into (1), we find the expectation of $Z$ is zero. But $Z$ is non-negative, since $h$ is increasing. It follows that $Z$ is zero almost surely. To finish the proof, the fact that $h$ is one-to-one implies the set inclusion
$$\left\{(X-Y)(h(X)-h(Y))=0\right\}\subset\left\{X=Y\right\}.$$






answered Dec 19 '18 at 18:04 by grand_chat; score 1
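(A small expansion of the last two steps, immediate from strict monotonicity and not spelled out in the answer: for real $x\neq y$, the factors $x-y$ and $h(x)-h(y)$ are nonzero and of the same sign, so $$(x-y)\bigl(h(x)-h(y)\bigr)>0\quad\text{whenever }x\neq y.$$ Hence $Z\geqslant 0$ everywhere, and $Z(\omega)=0$ forces $X(\omega)=Y(\omega)$, which gives the displayed set inclusion, in fact an equality.)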
– Did (Dec 19 '18 at 18:15): Yep, this is the other way.










