The characteristic function of a multivariate normally distributed random variable














The characteristic function of a random variable $X$ is defined as $\hat{X}(\theta)=\mathbb{E}(e^{i\theta X})$. If $X$ is a normally distributed random variable with mean $\mu$ and standard deviation $\sigma\ge 0$, then its characteristic function can be found as follows:



$$\hat{X}(\theta)=\mathbb{E}(e^{i\theta X})
=\int_{-\infty}^{\infty}\frac{e^{i\theta x-\frac{(x-\mu)^2}{2\sigma^2}}}{\sigma\sqrt{2\pi}}\,dx=\ldots=e^{i\mu\theta-\frac{\sigma^2\theta^2}{2}}$$



(to be honest, I have no idea what to put instead of the "$\ldots$"; I've looked here, but that's only for the standard case. Anyway, this is not really my question, even if it is interesting and might be relevant.)
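Update: substituting $z=(x-\mu)/\sigma$ seems to reduce the integral to the standard case, which would fill in the "$\ldots$":

$$\int_{-\infty}^{\infty}\frac{e^{i\theta x-\frac{(x-\mu)^2}{2\sigma^2}}}{\sigma\sqrt{2\pi}}\,dx
= e^{i\mu\theta}\int_{-\infty}^{\infty}\frac{e^{i(\sigma\theta)z-\frac{z^2}{2}}}{\sqrt{2\pi}}\,dz
= e^{i\mu\theta}\,e^{-\frac{(\sigma\theta)^2}{2}},$$

using the standard-case result evaluated at $\sigma\theta$.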



Now, if I got it right, a Gaussian random vector $X$ (of dimension $n$) is a vector of the form $X=AY+M$, where $A$ is any real $n\times n$ matrix, $Y$ is a vector of size $n$ whose every coordinate is a standard normally distributed random variable, and $M$ is some (constant) vector of size $n$.



I am trying to find the characteristic function of such an $X$. The generalization of the formula for characteristic functions to higher dimensions is straightforward:



$$\hat{X}(\theta)=\mathbb{E}(e^{i\langle\theta,X\rangle}),$$ where $\langle\cdot,\cdot\rangle$ here is an inner product. So I can start with the following:



$$\hat{X}(\theta) = \mathbb{E}(e^{i\langle\theta,X\rangle})
= \mathbb{E}(e^{i\langle\theta,AY\rangle}\cdot e^{i\langle\theta,M\rangle})\\
= e^{i\langle\theta,M\rangle}\cdot \mathbb{E}(e^{i\langle\theta,AY\rangle})$$



And I'm left with the expectation of a complex exponential of random variables. That probably means that the covariance matrix of some random variables should be involved, but that touches the boundary of my knowledge of probability.



























      probability-distributions normal-distribution














      edited Apr 13 '17 at 12:21









      Community


      asked Mar 13 '14 at 14:29









Bach



2 Answers


















You wouldn't want to use the bracket notation for the inner product when you're essentially dealing with matrices. Instead, write $\mathbb{E}\left[e^{i\theta^{T}X}\right] = \mathbb{E}\left[e^{i\theta^{T}\left(AY+M\right)}\right] = e^{i\theta^{T}M}\mathbb{E}\left[e^{i\theta^{T}AY}\right]$. You're only left with computing the characteristic function of a multivariate Gaussian distribution.
$$
\begin{align*}X &\sim \mathcal{N}\left(\mu, \Sigma\right)\\ \mathbb{E}\left[e^{is^{T}X}\right] &= \exp\left\{i\mu^{T}s - \frac{1}{2}s^{T}\Sigma s\right\} \end{align*}
$$
Just find the mean vector and covariance matrix of $AY$, since Gaussian variables have the affine property: they remain Gaussian under linear transformations, still completely determined by a mean vector and a covariance matrix. If $Y \sim \mathcal{N}\left(\mu_{Y}, \Sigma_{Y}\right)$, then
$$
\begin{align*} \mathbb{E}\left[AY\right] &= A\mu_{Y} \\ \operatorname{Var}\left[AY\right] &= A\Sigma_{Y} A^{T}. \end{align*}
$$

Using the relationship between $X$ and $Y$,
$$
\begin{align*} AY &= X-M \\ \mathbb{E}\left[AY\right] &= \mu_{X} - M \\ \operatorname{Var}\left[AY\right] &= \Sigma_{X}\\ \mathbb{E}\left[e^{i\theta^{T}AY}\right] &= \exp\left\{i\left(\mu_{X}-M\right)^{T}\theta - \frac{1}{2}\theta^{T}\Sigma_{X}\theta\right\}. \end{align*}
$$
This is as far as I can get with the information you gave.
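The affine property is easy to sanity-check numerically. Here is a minimal sketch (assuming NumPy; the particular $A$, $\mu_Y$ and $\Sigma_Y$ are just illustrative values) comparing the sample mean and covariance of $AY$ with $A\mu_{Y}$ and $A\Sigma_{Y}A^{T}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 200_000

# Illustrative mean vector and covariance matrix for Y.
mu_Y = np.array([1.0, -2.0, 0.5])
L = rng.standard_normal((n, n))
Sigma_Y = L @ L.T  # symmetric positive semi-definite by construction

A = rng.standard_normal((n, n))  # any real n x n matrix

# Draw samples of Y ~ N(mu_Y, Sigma_Y) and transform them by A.
Y = rng.multivariate_normal(mu_Y, Sigma_Y, size=N)
AY = Y @ A.T

# Both deviations should be near 0 (up to Monte Carlo error).
print(np.abs(AY.mean(axis=0) - A @ mu_Y).max())
print(np.abs(np.cov(AY.T) - A @ Sigma_Y @ A.T).max())
```

The sampling error shrinks like $1/\sqrt{N}$, so with $N=200{,}000$ both printed deviations are small.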






                  edited Jan 29 '16 at 4:33

























                  answered Jan 29 '16 at 4:22









Daeyoung Lim




You are basically finished! See, you obtained
$$
\Psi_X(\theta) = e^{i\langle\theta,M\rangle}\,\mathbb{E}(e^{i\langle\theta,AY\rangle})
$$

What is left is noticing that $A$ can move to the other side of the inner product:
$$
= e^{i\langle\theta,M\rangle}\,\mathbb{E}(e^{i\langle A'\theta,Y\rangle})
= e^{i\langle\theta,M\rangle}\,\Psi_Y(A'\theta)
$$

All you have left is plugging in the characteristic function of the multivariate normal distribution.
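For standard normal $Y$, $\Psi_Y(s)=e^{-\frac{1}{2}s^{T}s}$, so the result is $\Psi_X(\theta)=e^{i\langle\theta,M\rangle-\frac{1}{2}\theta^{T}AA^{T}\theta}$. A quick Monte Carlo sketch (assuming NumPy; the $A$, $M$ and $\theta$ below are just illustrative values) compares this closed form with the empirical characteristic function:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 2, 500_000

A = np.array([[1.0, 0.5], [0.0, 2.0]])  # illustrative real n x n matrix
M = np.array([1.0, -1.0])               # illustrative shift vector
theta = np.array([0.3, -0.2])           # test point for the CF

# X = A Y + M with Y standard normal.
Y = rng.standard_normal((N, n))
X = Y @ A.T + M

# Empirical characteristic function E[exp(i <theta, X>)].
emp = np.exp(1j * X @ theta).mean()

# Closed form: exp(i <theta, M> - theta^T A A^T theta / 2).
closed = np.exp(1j * theta @ M - 0.5 * theta @ (A @ A.T) @ theta)

print(abs(emp - closed))  # small: Monte Carlo error ~ 1/sqrt(N)
```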








                          answered Dec 22 '18 at 17:27









Carlos Llosa
