Understanding iterated function systems via Markov chains [closed]
The Iterated Function System at node $i$ is a discrete-time Markov chain on the state space $\mathcal{S}_i=\mathbb{R}^d$.
The chain is specified by an integer $m$ and a collection of maps
$f_j^{(i)}: \mathcal{S}_i \rightarrow \mathcal{S}_i,\quad j=1,\dots,m$
and probability functions
$\{p_j^{(i)}: \mathcal{S}_i \rightarrow [0,1]\}$, $\sum_{j=1}^{m} p_j^{(i)}(x) = 1\ \forall x \in \mathcal{S}_i$
Could anyone explain the above statements to me with an example? I didn't understand them at all. Thanks.
markov-chains network iterated-function-system
closed as off-topic by Did, Leucippus, Shailesh, KReiser, Cesareo Nov 29 '18 at 1:10
This question appears to be off-topic. The users who voted to close gave this specific reason:
- "This question is missing context or other details: Please improve the question by providing additional context, which ideally includes your thoughts on the problem and any attempts you have made to solve it. This information helps others identify where you have difficulties and helps them write answers appropriate to your experience level." – Did, Leucippus, Shailesh, KReiser, Cesareo
If this question can be reworded to fit the rules in the help center, please edit the question.
edited Nov 28 '18 at 21:20
Bernard
asked Nov 28 '18 at 20:40
Markov
1 Answer
Given all this data you can define a Markov chain with state space $\{1,2,\dots,m\}\times\mathbb{R}^d$, as follows. If the current state is $(i,x)$, you move to state $(j,f_j^{(i)}(x))$ with probability $p_j^{(i)}(x)$.
But there is some fishiness in the statement's notation and terminology: the notation $S_i$ seems a little odd, so I am not 100% sure my construction is what your author had in mind. Where did this come up?
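To make the construction concrete, here is a minimal simulation sketch of such a chain on $\mathbb{R}$ (so $d=1$) with $m=2$ affine contractions. The specific maps and the place-dependent probability function below are my own illustrative choices, not taken from your text; any measurable $p_j(x)$ summing to $1$ would do.

```python
import random

# Two affine contractions on R (m = 2, d = 1); with constant
# probabilities 1/2 each, iterating these draws points of the
# middle-thirds Cantor set. These maps are illustrative choices.
f = [lambda x: x / 3.0,             # f_1(x) = x/3
     lambda x: x / 3.0 + 2.0 / 3.0]  # f_2(x) = x/3 + 2/3

def p(x):
    """Place-dependent probabilities (p_1(x), p_2(x)), summing to 1."""
    q = 0.5 + 0.25 * max(-1.0, min(1.0, x))  # arbitrary measurable choice
    return [q, 1.0 - q]

def step(x, rng=random):
    """One Markov transition: choose index j with probability p_j(x),
    then move to f_j(x)."""
    j = rng.choices([0, 1], weights=p(x))[0]
    return f[j](x)

x = 0.5
for _ in range(1000):
    x = step(x)
# Starting in [0, 1], both maps keep the state in [0, 1], so the
# chain never leaves that interval.
assert 0.0 <= x <= 1.0
```

Note that the randomness is only in which map index $j$ is chosen at each step; given $j$, the move $x \mapsto f_j(x)$ is deterministic, which is exactly why the pair $(j, x)$ (or just $x$) evolves as a Markov chain.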
answered Nov 28 '18 at 23:04
kimchi lover