Lipschitz function of independent sub-Gaussian random variables














If $X\sim \mathcal{N}(0,I)$ is a Gaussian random vector, then Lipschitz functions of $X$ are sub-Gaussian with variance parameter 1, by the Tsirelson-Ibragimov-Sudakov inequality (e.g., Theorem 8 here).



Suppose instead that $X = (X_1, X_2, \ldots, X_n)$ consists of independent sub-Gaussian random variables that are not normally distributed. Does the above property still hold?
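For reference, a minimal numerical sketch of the Gaussian case; the test function $f(x)=\|x\|_2$ (which is 1-Lipschitz) and all names below are illustrative choices, not part of the question:

    import numpy as np

    # Empirical check of the Gaussian case: for X ~ N(0, I_n) and 1-Lipschitz f,
    # the Tsirelson-Ibragimov-Sudakov inequality gives
    #     P(f(X) - E f(X) > t) <= exp(-t^2 / 2).
    rng = np.random.default_rng(0)
    n, trials = 50, 200_000
    X = rng.standard_normal((trials, n))
    f = np.linalg.norm(X, axis=1)   # f(x) = ||x||_2 is 1-Lipschitz
    dev = f - f.mean()              # deviations around the empirical mean

    for t in (0.5, 1.0, 1.5, 2.0):
        print(f"t={t:.1f}  empirical={np.mean(dev > t):.5f}  "
              f"bound={np.exp(-t**2 / 2):.5f}")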










concentration-of-measure






asked Jan 22 '16 at 15:06









Hedonist

  • Hi, any news on your question? I'm also interested. – nullgeppetto, Jan 22 '17 at 20:02












  • No, I have not been able to resolve this yet. – Hedonist, Jan 28 '17 at 16:43


















2 Answers

Try the following extension of McDiarmid’s inequality for metric spaces
with unbounded diameter:
https://arxiv.org/pdf/1309.1007.pdf
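For what it's worth, a crude Monte Carlo sketch of the quantity driving that result, under the assumption (to be checked against the paper) that the sub-Gaussian diameter $\Delta_{SG}$ is the sub-Gaussian norm of the symmetrized distance $\varepsilon\, d(X,X')$, with $X, X'$ i.i.d. and $\varepsilon$ an independent Rademacher sign:

    import numpy as np

    # Assumed definition (see the paper for the authoritative one): Delta_SG is
    # the smallest sigma with E exp(lam * xi) <= exp(sigma^2 lam^2 / 2) for all
    # lam, where xi = eps * |X - X'|, X, X' iid, eps a Rademacher sign.
    rng = np.random.default_rng(1)
    m = 500_000
    X, Xp = rng.standard_normal(m), rng.standard_normal(m)  # d(x,y) = |x - y|
    eps = rng.choice([-1.0, 1.0], size=m)
    xi = eps * np.abs(X - Xp)                               # symmetrized distance

    lams = np.linspace(0.1, 2.0, 40)
    mgf = np.array([np.exp(l * xi).mean() for l in lams])
    sigma2 = np.max(2.0 * np.log(mgf) / lams**2)
    print("estimated Delta_SG:", np.sqrt(sigma2))  # xi ~ N(0,2) here, so ~ sqrt(2)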






answered Mar 23 '17 at 21:41 – Uri













  • If $X$ is a $d$-dimensional sub-Gaussian vector with $\mathrm{Cov}(X) = \sigma^2 I$ and $f$ is $L$-Lipschitz w.r.t. the Euclidean norm, can we use the result of this paper? If so, what would be the sub-Gaussian diameter as defined in the paper? – ie86, May 25 '17 at 6:30




















Here are three options that may suit your needs.




  1. Concentration inequality for convex functions of bounded random variables.
     If $X_1,\ldots,X_n$ are independent, take values in $[0,1]$, and $f$ is quasi-convex and 1-Lipschitz, then
     \[ P(f(X) > m+t) \le 2e^{-t^2/4},
     \qquad
     P(f(X) < m - t) \le 2e^{-t^2/4},
     \]
     where $m$ is the median of $f(X)$. See Theorem 7.12 in the book Concentration Inequalities: A Nonasymptotic Theory of Independence by Stéphane Boucheron, Gábor Lugosi, and Pascal Massart. It follows from the convex distance inequality due to Talagrand. (A numerical check of this option is sketched after the list.)


  2. View $X_i$ as a function of a standard normal. If $X_i$ can be written as $\Phi(Z_i)$ where $Z_i$ is standard normal, then $f(X) = f\circ \Phi(Z)$ where $Z_1,\ldots,Z_n$ are i.i.d. standard normal. Here, the multivariate function $\Phi:\mathbb{R}^n\to \mathbb{R}^n$ applies $\Phi$ to every coordinate.

     Then the Tsirelson-Ibragimov-Sudakov inequality applies to $f\circ \Phi$, and the Lipschitz norm of $f\circ \Phi$ is at most $\|f\|_{\mathrm{Lip}} \|\Phi\|_{\mathrm{Lip}}$. The question is then whether $\|\Phi\|_{\mathrm{Lip}}$ is bounded by an absolute constant (and, in particular, whether $\Phi$ is Lipschitz at all; otherwise $\|\Phi\|_{\mathrm{Lip}}=+\infty$ and we gain nothing). The bound $\|\Phi\|_{\mathrm{Lip}}<+\infty$ holds, for instance, if $X_i$ is uniformly distributed on $[0,1]$; see Theorem 5.2.10 in the book High-Dimensional Probability by Roman Vershynin, where this approach is described. (A sketch of this transport map is given after the list.)


  3. If $X$ has density $e^{-U(x)}$ for a strongly convex $U:\mathbb{R}^n\to \mathbb{R}$.
     If $U$ is twice continuously differentiable and strongly convex in the sense that its Hessian $H$ (i.e., $H_{ij} = (\partial/\partial x_i)(\partial/\partial x_j)\, U$) satisfies, for all $x\in \mathbb{R}^n$, that $H(x) - \kappa I_{n\times n}$ is positive semi-definite, then for any 1-Lipschitz function $f$ of $X$,
     \[ P( |f(X) - E[f(X)]| > t) \le 2 \exp(-\kappa c t^2) \]
     for some absolute constant $c>0$. This is Theorem 5.2.15 in the book High-Dimensional Probability by Roman Vershynin. In particular, this covers the independent setting of the question whenever each $X_i$ has a density $e^{-u_i(x_i)}$ with $u_i'' \ge \kappa$, since then $U(x)=\sum_i u_i(x_i)$ has Hessian $\mathrm{diag}(u_1''(x_1),\ldots,u_n''(x_n))$, which dominates $\kappa I_{n\times n}$.
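A minimal numerical sketch of option 1; the choice $f(x)=\|x\|_2$ (convex, hence quasi-convex, and 1-Lipschitz) and the uniform coordinates are illustrative assumptions:

    import numpy as np

    # Option 1: X_i iid uniform on [0,1]; f(x) = ||x||_2 is convex and
    # 1-Lipschitz, so both tails around the median m should satisfy
    #     P(f(X) > m + t), P(f(X) < m - t) <= 2 exp(-t^2 / 4).
    rng = np.random.default_rng(2)
    n, trials = 50, 200_000
    X = rng.random((trials, n))        # bounded coordinates in [0,1]
    f = np.linalg.norm(X, axis=1)
    m = np.median(f)

    for t in (0.5, 1.0, 1.5):
        print(f"t={t:.1f}  upper={np.mean(f > m + t):.5f}  "
              f"lower={np.mean(f < m - t):.5f}  "
              f"bound={2 * np.exp(-t**2 / 4):.5f}")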


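And a sketch of the transport map in option 2 for uniform $X_i$: the map is the standard normal CDF (so that $\Phi(Z_i)$ is uniform on $[0,1]$), whose Lipschitz constant is $\varphi(0)=1/\sqrt{2\pi}$; the use of SciPy for the CDF is an assumption of this sketch:

    import numpy as np
    from scipy.stats import norm  # assumption: SciPy is available for the CDF

    # Option 2 for uniform coordinates: Phi = standard normal CDF transports
    # Z ~ N(0,1) to Phi(Z) ~ Uniform[0,1].  Since sup |Phi'| = phi(0)
    # = 1/sqrt(2 pi), we get ||f o Phi||_Lip <= ||f||_Lip / sqrt(2 pi), and
    # the Gaussian (TIS) inequality applies to f o Phi.
    grid = np.linspace(-8.0, 8.0, 100_001)
    slope = np.abs(np.gradient(norm.cdf(grid), grid)).max()
    print("numerical max |Phi'| =", slope)           # ~ 0.39894
    print("1/sqrt(2*pi)         =", 1 / np.sqrt(2 * np.pi))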





answered Nov 30 '18 at 23:37, edited Dec 1 '18 at 0:39 – jlewk












