What does convergence in distribution “in the Gromov–Hausdorff” sense mean?





I am trying to understand this survey article by Le Gall on Brownian geometry, especially the statement of Theorem 1.



The basic statement of the theorem is
$$(m_n, d_n) \to (m_\infty, d_\infty)$$
"in the Gromov–Hausdorff sense" as $n \to \infty$, where the convergence is in distribution.



Here $(m_n,d_n)$ and $(m_\infty,d_\infty)$ are both random compact metric spaces. So how do we interpret this? We might hope for a statement along the lines of the following.



For every compact metric space $(X,d)$ and $R > 0$, we have
$$(*) \quad \mathbb{P}\left[ d_{GH}\big( (m_n,d_n), (X,d) \big) < R \right] \to \mathbb{P}\left[ d_{GH}\big( (m_\infty,d_\infty), (X,d) \big) < R \right]$$
as $n \to \infty$.



But even when we talk about convergence in distribution for real random variables (instead of compact-metric-space-valued random variables), we have to be careful to restrict our attention to points where the cumulative distribution function is continuous. So I wonder if (*) is too strong?
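
(A real-valued illustration of the issue: if $Y_n = 1 - 1/n$ deterministically and $Y_\infty = 1$, then $Y_n \to Y_\infty$ in distribution, yet $\mathbb{P}[Y_n < 1] = 1$ for every $n$ while $\mathbb{P}[Y_\infty < 1] = 0$; the convergence fails precisely at the discontinuity point $R = 1$ of the limiting distribution function.)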










pr.probability mg.metric-geometry

asked 2 days ago by Matthew Kahle




















1 Answer



















Following the notation of the paper, let $\mathbb{K}$ be the metric space of all compact metric spaces, equipped with the Gromov–Hausdorff metric $\mathrm{d}_{GH}$. Then we can express convergence in distribution in the usual way: for every bounded continuous $F : \mathbb{K} \to \mathbb{R}$, we have $\mathbb{E}[F((m_n, d_n))] \to \mathbb{E}[F((m_\infty, d_\infty))]$. The portmanteau theorem gives you several other equivalent statements.
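
For instance, two of the standard equivalent statements provided by the portmanteau theorem read: for every closed set $C \subseteq \mathbb{K}$ and every open set $U \subseteq \mathbb{K}$,
$$\limsup_{n\to\infty} \mathbb{P}\left[(m_n,d_n) \in C\right] \le \mathbb{P}\left[(m_\infty,d_\infty) \in C\right]
\qquad \text{and} \qquad
\liminf_{n\to\infty} \mathbb{P}\left[(m_n,d_n) \in U\right] \ge \mathbb{P}\left[(m_\infty,d_\infty) \in U\right].$$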



In other words, this is just the usual notion of convergence in distribution for random variables taking their values in a metric space $S$, where that metric space happens to be $S = (\mathbb{K}, \mathrm{d}_{GH})$, the metric space of all compact metric spaces.



In particular, if $(X,d)$ is a fixed compact metric space, the function $\mathrm{d}_{GH}(\cdot, (X,d)) : \mathbb{K} \to \mathbb{R}$ is a continuous function. So if we let $Y_n = \mathrm{d}_{GH}((m_n, d_n), (X, d))$ and $Y_\infty = \mathrm{d}_{GH}((m_\infty, d_\infty), (X, d))$, then the scalar-valued random variables $Y_n$ converge in distribution to $Y_\infty$. So your formula (*) holds, but, as you say, only for values of $R$ at which the function $R \mapsto \mathbb{P}[\mathrm{d}_{GH}((m_\infty, d_\infty), (X,d)) < R]$ is continuous.
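
Indeed, the triangle inequality for $\mathrm{d}_{GH}$ gives the $1$-Lipschitz bound
$$\big|\, \mathrm{d}_{GH}(A,(X,d)) - \mathrm{d}_{GH}(B,(X,d)) \,\big| \le \mathrm{d}_{GH}(A,B) \qquad \text{for all } A, B \in \mathbb{K},$$
so the continuous mapping theorem passes the convergence $(m_n,d_n) \to (m_\infty,d_\infty)$ in distribution directly to $Y_n \to Y_\infty$ in distribution.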






answered 2 days ago (edited 2 days ago) by Nate Eldredge