Expected number of rounds, adding $g$ rounds with probability $r$


























This is closely related to the question linked below, but I cannot quite map it to mine.



I play a game; in each round, with probability $r$ the game adds $g$ more rounds.
What is the expected number of rounds played?



So, I was trying to think of it like the game in the link, considering that to win a player must win by a margin of $g$, but I cannot find a way to set it up.



$h=\text{number of rounds played}$



$E(h) = 1 + (1-r)\cdot b + r\cdot c$



where $b$ is $E$(number of rounds played if on the last round the player didn't get $g$ extra rounds) and

$c$ is $E$(number of rounds played if on the last round the player did get $g$ more rounds).



But I have no clue if this makes sense because now I cannot model $b$ and $c$.



I have the feeling that this somehow gets recursive, something like $b = g + (1-r)\cdot b + r\cdot c$,



but I am quite stuck :S



Related question: Finding the winning probability of the game










Tags: probability, combinations, mathematical-modeling






asked Dec 18 '18 at 16:23 – myradio
1 Answer





































          I am assuming




          • there is a counter keeping track of the number of rounds left,

          • this counter starts at some number $n$,

          • for each step, with probability $(1-r),$ the counter goes down by one, and with probability $r$ it goes up by $g-1$,

          • you want the expected number of steps for the counter to reach zero.


          Correct me if I am wrong.





          I will also assume $rg<1$. If $rg>1$, there is a chance this game will never end, so the expected time is infinite.



          Letting $E_n$ be the expected number of rounds when you have $n$ rounds to go, then
          $$
          E_n=1+(1-r)E_{n-1}+rE_{n+g-1}.
          $$



Furthermore, $$E_n=nE_1,$$ because in order to start at $n$ rounds and end at $0$, you need to drop from $k$ to $k-1$ a total of $n$ times, for $k=n,n-1,\dots,1$, and dropping from $k$ to $k-1$ is the same as dropping from $1$ to $0$.



          You can combine these two equations to solve for $E_n$. The result is
          $$
          E_n = nE_1=n/(1-rg).
          $$
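Spelling out the combination step (this only substitutes the two equations above into each other): plugging $E_{n-1}=(n-1)E_1$ and $E_{n+g-1}=(n+g-1)E_1$ into the recurrence gives
$$
nE_1 = 1 + (1-r)(n-1)E_1 + r(n+g-1)E_1 = 1 + (n-1+rg)E_1,
$$
so $(1-rg)E_1 = 1$, i.e. $E_1 = 1/(1-rg)$.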

          When $rg=1$, the game will certainly end, but the expected end time is infinite.





          Here is another explanation why $E_n=nE_1$. Assume the counter starts at $n$.
\begin{align}
\text{number of rounds until counter hits $0$}
&=\quad \text{number of rounds until counter hits $n-1$}\\
&\quad+ \text{number of rounds after that until counter hits $n-2$}\\
&\quad+ \text{number of rounds after that until counter hits $n-3$}\\
&\quad\;\vdots\\
&\quad+ \text{number of rounds after that until counter hits $1$}\\
&\quad+ \text{number of rounds after that until counter hits $0$}
\end{align}

          The left hand side of the above equation is $E_n$ on average, while each summand on the right is $E_1$ on average; if the counter is at $7$, then the time spent waiting for the counter to reach $6$ for the first time is exactly like starting at $1$ and waiting for the counter to reach $0$, which is $E_1$.
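Not part of the original answer, but as a quick sanity check of the formula: a minimal Monte Carlo sketch in Python of the counter model described above. The parameter values ($r=0.2$, $g=3$, $n=1$) are illustrative choices, not from the post.

    import random

    def rounds_played(n, r, g):
        """Simulate one game under the counter model from the answer:
        start with n rounds left; each round played uses up one round,
        and with probability r adds g more rounds."""
        left, played = n, 0
        while left > 0:
            played += 1
            left -= 1                  # the round just played is used up
            if random.random() < r:    # with probability r, g extra rounds are added
                left += g
        return played

    # Illustrative parameters (assumed here, not from the post); requires rg < 1.
    r, g, n, trials = 0.2, 3, 1, 200_000
    empirical = sum(rounds_played(n, r, g) for _ in range(trials)) / trials
    print(empirical, n / (1 - r * g))   # both should be close to 2.5

With $rg = 0.6$, the formula predicts $E_1 = 1/0.4 = 2.5$, and the empirical average should agree up to sampling noise.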






answered Dec 18 '18 at 17:23, edited Dec 20 '18 at 6:36 – Mike Earnest









• Your third statement should read: for each step, with probability $\textbf{(1-r)}$ the counter goes down by one and with probability $\textbf{r}$ it goes up by $\textbf{g-1}$ (I am changing the last $r-1$ to $g-1$). Correct? – myradio, Dec 19 '18 at 1:46












• @myradio I stand corrected, and have made the change. – Mike Earnest, Dec 19 '18 at 3:20










• Great! Now I understand the concept, but not your sentence starting with "because". – myradio, Dec 20 '18 at 1:05










• @myradio See my more detailed explanation in the edit. Note that this argument works precisely because the counter decreases in steps of size one, so it must hit every intermediate value on its path from $n$ to $0$. – Mike Earnest, Dec 20 '18 at 6:38










