Understanding an iterated function system via a Markov chain [closed]

The Iterated Function System at node $i$ is a discrete-time Markov chain on the state space ${\cal S}_i=\mathbb{R}^d$.



The chain is specified by an integer $m$, a collection of maps
$f_j^{(i)}: S_i \rightarrow S_i,\ j=1,\dots,m,$
and probability functions

$\{p_j^{(i)}: S_i \rightarrow [0,1]\}$, with $\sum_{j=1}^{m} p_j^{(i)}(x) = 1 \ \forall x \in S_i$.



Could anyone explain the above statements to me with an example? I don't understand them at all. Thanks.










closed as off-topic by Did, Leucippus, Shailesh, KReiser, Cesareo Nov 29 '18 at 1:10


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – Did, Leucippus, Shailesh, KReiser, Cesareo

If this question can be reworded to fit the rules in the help center, please edit the question.


































Tags: markov-chains, network, iterated-function-system






edited Nov 28 '18 at 21:20 by Bernard
asked Nov 28 '18 at 20:40 by Markov
          1 Answer
































Given all this data you can define a Markov chain with state space $\{1,2,\dots,m\}\times\mathbb{R}^d$, as follows. If the current state is $(i,x)$, you move to state $(j,f_j^{(i)}(x))$ with probability $p_j^{(i)}(x)$.



          But there is some fishiness in the statement's notation and terminology: the notation $S_i$ seems a little odd, so I am not 100% sure my construction is what your author had in mind. Where did this come up?
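A concrete example may help. The following Python sketch (my own illustration, not from your source) takes the simplest case of the definition, where the probability functions $p_j$ are constant in $x$: the classic three-map Sierpinski-triangle IFS on $\mathbb{R}^2$, iterated as a Markov chain in the way described above.

```python
import random

# A minimal sketch (illustrative, not from the original post): an IFS on R^2
# with m = 3 affine maps f_j and constant probability functions p_j(x) = 1/3.
# The general definition allows each p_j to depend on the current point x;
# here they do not, which is the classic "chaos game" for the Sierpinski triangle.

maps = [
    lambda x, y: (0.5 * x,        0.5 * y),        # f_1: contract toward (0, 0)
    lambda x, y: (0.5 * x + 0.5,  0.5 * y),        # f_2: contract toward (1, 0)
    lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5),  # f_3: contract toward (0.5, 1)
]

def run_chain(n_steps, seed=0):
    """Simulate the Markov chain: from state x, apply a randomly chosen map f_j."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    points = []
    for _ in range(n_steps):
        f = rng.choice(maps)   # each map chosen with probability 1/3
        x, y = f(x, y)
        points.append((x, y))
    return points

points = run_chain(10_000)
# Every map sends [0, 1] x [0, 1] into itself, so all iterates stay in that box.
assert all(0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 for x, y in points)
```

Plotting `points` reveals the Sierpinski triangle, the attractor of this IFS; making the $p_j$ depend on $x$ changes the chain's transition law but not the general construction.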


































answered Nov 28 '18 at 23:04 by kimchi lover




