Radius of convergence of power series of sub-stochastic matrix
Suppose I have a connected graph $G = (V,E)$ and a subset of vertices $U \subset V$, and let $W$ be the subgraph of $G$ induced by the vertices $V \setminus U$; assume that $W$ is also connected.
Now let $X_n$ be a random walk on $G$ and let $P$ be its transition probability matrix, and let $Q$ be the submatrix of $P$ corresponding to the vertices of $W$. So $Q$ is a sub-stochastic matrix (every row sums to at most $1$, and at least one row sums to strictly less than $1$).
I have shown that, for any $i,j \in W$, $\sum_{n=0}^\infty (Q^n)_{ij} = \mathscr{G}_U(i,j) < \infty$, and I have also shown that the power series $\sum_{n=0}^\infty x^n(Q^n)_{ij}$ converges for some $x > 1$, by showing that all eigenvalues of $Q$ are strictly less than $1$ in absolute value.
Now I want to show that the radius of convergence, $R$, of $\sum_{n=0}^\infty x^n(Q^n)_{ij}$ does not depend on $i$ or $j$.
I am stuck here and not sure how to proceed.
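For concreteness, here is a small numerical sketch of the setup (the particular graph and the choice of $U$ below are arbitrary illustrations, not part of the problem): it builds $P$ for a simple random walk, extracts $Q$, checks that its spectral radius $\rho(Q)$ is strictly below $1$, and evaluates $\sum_{n=0}^\infty x^n (Q^n)_{ij} = ((I - xQ)^{-1})_{ij}$ for an $x > 1$ with $x < 1/\rho(Q)$.

```python
# Small numerical sketch (illustrative graph and U, not part of the problem statement).
import numpy as np

# Adjacency matrix of a connected graph on 5 vertices (arbitrary example).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

P = A / A.sum(axis=1, keepdims=True)     # simple random walk: P[i, j] = 1/deg(i) for j ~ i

U = [4]                                  # vertices removed (arbitrary choice; W stays connected)
keep = [v for v in range(len(A)) if v not in U]
Q = P[np.ix_(keep, keep)]                # sub-stochastic block indexed by W = V \ U

rho = max(abs(np.linalg.eigvals(Q)))
print("spectral radius of Q:", rho)      # strictly less than 1

# Green's function: G_U(i, j) = sum_n (Q^n)_{ij} = (I - Q)^{-1}_{ij}
G = np.linalg.inv(np.eye(len(keep)) - Q)
print(G)

# The weighted series sum_n x^n (Q^n)_{ij} = (I - x Q)^{-1}_{ij} still converges
# for some x > 1, namely any x < 1/rho.
x = 0.5 * (1.0 + 1.0 / rho)
print(x, np.linalg.inv(np.eye(len(keep)) - x * Q))
```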
probability probability-theory graph-theory markov-chains random-walk
asked Dec 2 '18 at 18:23 by jackson5, edited Dec 2 '18 at 18:45
1 Answer
I'm pretty sure you're summing over $n$, not $i$.
Assuming that, from diagonalization you have that $(Q^n)_{ij}$ is some linear combination of $\lambda_k^n$, where $\lambda_k$ are all the eigenvalues of $Q$ (say sorted in decreasing order of magnitude). So the radius of convergence is presumably $1/|\lambda_1|$, except for the possibility that somehow $(Q^n)_{ij}$ had no contribution from $\lambda_1$. But the Perron-Frobenius theorem (for general nonnegative matrices, not just stochastic matrices) forbids that, because the eigenvector $v_1$ will have everywhere nonzero entries.
answered Dec 2 '18 at 18:42 by Ian
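A quick numerical illustration of this claim (a sketch with a generic entrywise-positive sub-stochastic $Q$, not the one from the question): every entry of $Q^n$ decays at the same exponential rate $|\lambda_1|^n$, so $\limsup_n |(Q^n)_{ij}|^{1/n} = |\lambda_1|$ for all $i,j$, and every entrywise power series has the same radius of convergence $1/|\lambda_1|$.

```python
# Sketch: for an entrywise-positive sub-stochastic Q, the entrywise n-th roots
# of Q^n all approach the spectral radius rho = |lambda_1|, so each entry's
# power series has radius of convergence 1/rho.
import numpy as np

rng = np.random.default_rng(0)
m = 4
Q = rng.random((m, m))
Q = 0.9 * Q / Q.sum(axis=1, keepdims=True)   # rows sum to 0.9 < 1, all entries > 0

rho = max(abs(np.linalg.eigvals(Q)))

n = 200
Qn = np.linalg.matrix_power(Q, n)
print(rho)                 # spectral radius |lambda_1|
print(Qn ** (1.0 / n))     # entrywise n-th roots: all close to rho for large n
```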
Yes, summing over $n$, sorry! Also, why are we guaranteed that $Q$ is diagonalizable?
– jackson5
Dec 2 '18 at 18:49
@jackson5 If it isn't, you can still write down the Jordan normal form version of this statement, but there might be some $n^\ell$'s floating around for the generalized eigenvectors. But $v_1$ will still have everywhere positive entries, so that contribution should determine the radius of convergence. However, I see a snag in the periodic case: there you need to worry that the radius of convergence might go up due to cancellations between terms whose eigenvalues have the same magnitude as $\lambda_1$. In the aperiodic case what I said goes through.
– Ian
Dec 2 '18 at 18:58
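To make the periodic worry concrete, here is a minimal worked example (with an arbitrary two-state $Q$, not the one from the question): take $Q_{12} = Q_{21} = a$ and $Q_{11} = Q_{22} = 0$ for some $0 < a < 1$, whose eigenvalues are $\pm a$. Then $(Q^n)_{11} = a^n$ for even $n$ and $0$ for odd $n$, so $\sum_n x^n (Q^n)_{11} = 1/(1-(ax)^2)$, while $(Q^n)_{12} = a^n$ for odd $n$ and $0$ for even $n$, giving $ax/(1-(ax)^2)$. The cancellation between the $a^n$ and $(-a)^n$ terms kills half of the coefficients in each entry, but the surviving coefficients are still of size $a^n$, so in this example the radius of convergence remains $1/a = 1/|\lambda_1|$ for every entry.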