# Coin toss question

Discussion in 'Probability' started by Dave Aronson, Nov 15, 2011.

1. ### Dave Aronson (Guest)

Maybe this is a stupid question. For a fair coin the probability of 10
heads in a row is (0.5)^10, right?

But can we say that as the number of tosses approaches infinity the
probability of ever getting a 10 head streak approaches 1?

Dave Aronson, Nov 15, 2011
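The claim in this post is easy to check numerically. A minimal Monte Carlo sketch (not from the thread; function names, trial counts, and the seed are illustrative choices):

```python
import random

def has_streak(num_tosses, streak_len, rng):
    """Return True if a run of `streak_len` consecutive heads
    appears somewhere in `num_tosses` fair coin tosses."""
    run = 0
    for _ in range(num_tosses):
        if rng.random() < 0.5:  # heads
            run += 1
            if run >= streak_len:
                return True
        else:
            run = 0
    return False

def estimate_streak_prob(num_tosses, streak_len=10, trials=300, seed=0):
    """Monte Carlo estimate of P(at least one streak in num_tosses)."""
    rng = random.Random(seed)
    hits = sum(has_streak(num_tosses, streak_len, rng) for _ in range(trials))
    return hits / trials

# The estimated probability of seeing a 10-head run climbs toward 1
# as the number of tosses grows.
for n in (500, 5000, 20000):
    print(n, estimate_streak_prob(n))
```

The expected waiting time for the first run of 10 heads is 2^11 - 2 = 2046 tosses, so by 20,000 tosses the streak has almost certainly appeared.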

2. ### danheyman (Guest)

Yes! Any event with positive probability will occur w.p.1 in
infinitely many independent trials. Let p > 0 be the probability of the
event. The probability that it doesn't occur in n trials is p^n, which
goes to 0 as n goes to infinity.

danheyman, Nov 15, 2011

3. ### Dave Aronson (Guest)

Did you mean to say that the probability that it doesn't occur in
n trials is (1-p)^n ?

Dave Aronson, Nov 25, 2011

4. ### danheyman (Guest)

Yes.

danheyman, Nov 26, 2011
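With the corrected expression (1-p)^n, the coin-toss version can be made concrete: treat the tosses as non-overlapping blocks of 10 (so the blocks are independent events), each all heads with probability p = (0.5)^10. A quick arithmetic check, not part of the original thread:

```python
# p: chance that a given block of 10 fair tosses is all heads
p = 0.5 ** 10  # = 1/1024

# Probability that NO block of n independent blocks is all heads: (1 - p)^n.
# This shrinks geometrically, so a 10-head block eventually appears w.p.1.
for n in (1_000, 10_000, 100_000):
    print(n, (1 - p) ** n)
```

This only lower-bounds the chance of seeing a streak within 10n tosses (a streak can also straddle two blocks), but it already forces the no-streak probability to 0.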

5. ### Dave Aronson (Guest)

Thanks. The reason I asked is that I read a paper on gambler's ruin
on MathPages. The link is here:

http://www.mathpages.com/home/kmath084/kmath084.htm

It turns out from the first result in the above link that the
probability of ruin is (1 - the probability of reaching N before
reaching 0), and as N goes to infinity, for r < 1, it equals r^h,
where h is the starting bankroll.

I find this fascinating because, if what we discussed holds for
infinitely many trials, and the probability of the event (in this case
gambler's ruin) is some r^h < 1, then as the number of trials
approaches infinity ruin becomes inevitable, even though the gambler's
bankroll is growing without bound.

Or am I missing something here?

Dave Aronson, Nov 26, 2011
6. ### cjq70 (Guest)

Yes, you're getting confused over limits. The link you cited says the
probability of successfully reaching N is (1-r^h)/(1-r^N), which is 1
when h = N. So if you set both N and h equal to any arbitrarily large
number Q, the probability of ruin is zero, and this remains true as Q
increases. The probability of ruin is (r^h - r^N)/(1 - r^N), and if
you let both N and h go to the same infinitely large number, the
numerator is zero. Also, r^h goes to zero as h increases (assuming r
is less than 1).

cjq70, Nov 26, 2011
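The formulas quoted from the MathPages link can be sanity-checked directly. A small sketch (my reading of the link: r is the ratio of the loss probability to the win probability, so r < 1 means the gambler has the edge; the function name is my own):

```python
def ruin_probability(h, N, r):
    """Probability of going broke before reaching bankroll N,
    starting from bankroll h: (r^h - r^N) / (1 - r^N), for r != 1."""
    return (r**h - r**N) / (1 - r**N)

r, h = 0.9, 10
print(ruin_probability(h, 100, r))    # finite target N = 100
print(ruin_probability(h, 1000, r))   # larger target N = 1000
print(r**h)                           # the N -> infinity limit
```

This shows both of cjq70's points: with h = N the ruin probability is exactly zero, and as N grows the ruin probability converges to r^h, not to 1.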
7. ### Dave Aronson (Guest)

Thanks. That is what I thought too, until I read the following
argument. The limit in this case is taken with respect to h and N. But
we can also take a limit over the number of trials, as with the coin
flips.

Let p be the probability of ruin, with p < 1, when N is larger than
some very large number Q. Then as Q increases, the probability of ruin
converges to r^h < 1, if r < 1. Now, given that N is already large, as
the number of trials n increases to infinity, the probability of not
getting ruined is (1 - r^h)^n -> 0, i.e., ruin is certain. As I said,
there is the limit w.r.t. N but also a limit w.r.t. n once N is large.

Dave Aronson, Nov 27, 2011
8. ### cjq70 (Guest)

You're still confused. The gambler begins with h dollars, and wants
to know the probability of reaching N before going broke, which for
sufficiently large N (and r<1) is approximately (1-r^h). You're
describing a completely different game, in which, if the gambler
reaches N, he gives back all his gains and starts over with h dollars
for a second "trial". And if he reaches N again, he gives back all
his gains and starts over again with h dollars for a third trial, and
so on, up to n "trials". The probability that he successfully reaches
N in every one of these n trials is (1-r^h)^n, which obviously goes to
zero as n increases. That's trivial and self-evident. For any game
in which our probability of winning is less than 1, it's obvious that
if we keep playing enough times, we will eventually lose a game.
That's basically the definition of the probability being less than 1.

This has nothing in particular to do with the gambler's question,
because our gambler doesn't arbitrarily give all his winnings back and
start over at h each time he reaches N. More interesting is the
opposite point, i.e., even if we set N to some arbitrarily huge number
(essentially infinite), the gambler still has a positive probability
of never being ruined, assuming r is less than 1. So he has some positive
probability of playing forever and never going broke. The inevitable
ruin occurs only with r = 1 or greater.

cjq70, Nov 27, 2011
9. ### Dave Aronson (Guest)

Has it crossed your mind that you may not understand what I am getting
at after all?
No, that is not what I am describing. That is something you are making
up, and it is not related to my point at all.

Think of it in another way. The game progresses as the number of
trials increases. At some point the bankroll is very large, but so is
the number of trials. After all, how large is infinity? When does the
gambler stop and say, now I have an infinite bankroll? Since
inf + inf = inf, the gambler may not stop soon enough, and in the
meantime his probability of ruin will become 1 and he will lose it all
in the first round.
I answered this above. This is possibly related to the paradoxes of
infinity.

I think this is not right. Ruin is inevitable in the limit of infinity
for any r.

Dave Aronson, Nov 27, 2011
10. ### cjq70 (Guest)

If we say N is "infinite", then it is basically a random walk,
beginning at h, and the probability that he will reach zero is r^h.
This means that he has prior probability 1-r^h that he will NEVER go
bankrupt. This is only possible if r<1, which implies he has greater
than 50/50 odds of winning in each round. This shouldn't surprise
you.
No, you're wrong. Suppose you start a walk at h steps away from zero,
and on each step you have a 90% chance of stepping forward and a 10%
chance of stepping backwards. Do you think you will eventually get to
zero? The answer is obviously no, but you seem to think the answer is
yes.

I suspect you're confused by the fact that if we are, say, 100 steps
away from zero, there is a finite probability of getting 100 backward
steps in a row, so if we keep walking, eventually we will get a string
of 100 backward steps, and return to zero. But that's overlooking the
fact that soon you will most likely be at 200, so you really need 200
backward steps in a row, and then a little later you need 2000, and so
on. As you continue walking, your distance from zero almost certainly
is increasing, so by the time you eventually get that 100 backward
steps in a row, you will almost certainly be a trillion steps away
from zero, so it won't bring you back to zero. Granted, you will
eventually get a trillion backward steps in a row, but by that time
you will almost certainly be a gadzillion steps from zero, and so on.
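The 90/10 walk described above is easy to simulate. A rough sketch (the step cap and seed below are my choices; truncating the walk is a practical approximation that works because walks which ever return to zero almost always do so early, since the drift is strongly away from zero). Theory predicts the chance of ever hitting zero from h steps out is (0.1/0.9)^h = (1/9)^h:

```python
import random

def walk_hits_zero(h, p_forward=0.9, max_steps=500, rng=None):
    """Simulate a biased walk starting at position h; return True
    if it reaches 0 within max_steps (a truncation approximation)."""
    rng = rng or random.Random()
    pos = h
    for _ in range(max_steps):
        pos += 1 if rng.random() < p_forward else -1
        if pos == 0:
            return True
    return False

rng = random.Random(42)
trials = 5000
est = sum(walk_hits_zero(1, rng=rng) for _ in range(trials)) / trials
print("simulated hit-zero fraction (h=1):", est)
print("theoretical (1/9)^1:", 1 / 9)
```

The simulated fraction lands near 1/9, not near 1: most walks simply drift away forever, exactly as the formula says.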

This is all taken into account by the gambler's ruin formulas with N
set to "infinity". Starting from h (with r<1), the probability that
you will NEVER reach zero is 1-r^h. You will just keep getting
further and further from zero, i.e., the gambler's winnings will keep
increasing forever.
Yes, it is exactly what you are describing. You talk about "trials",
but there are no "trials" in the gambler's game, there is only one
game that starts with h dollars and ends when he either goes broke or
reaches N. If you want to talk about a case where there is no N, and
he is going to play indefinitely, then you would just take the limit
of the probabilities as N goes to infinity. So the probability of
eventually going broke is r^h.

But you are talking about a completely different game, involving n
"trials". You haven't actually defined a "trial", but from what
you've said, your "trials" can only be repetitions of the entire
gambler's game, in which he starts with h dollars and either he
eventually goes broke or he never goes broke. So you can think of n
gamblers, each playing this game, and the probability that NONE of the
n gamblers EVER goes broke is (1-r^h)^n.
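The difference between the two games can be put in numbers (r and h below are illustrative values): a single gambler's ruin probability stays at r^h, while the chance that none of n independent gamblers ever goes broke collapses as (1-r^h)^n:

```python
r, h = 0.9, 10
survive_one = 1 - r**h  # one gambler never goes broke (about 0.65 here)

# Probability that NONE of n independent gamblers ever goes broke.
for n in (1, 10, 100, 1000):
    print(n, survive_one ** n)

# Meanwhile each individual gambler's ruin probability is still just r^h:
print("single-gambler ruin probability:", r**h)
```

So "at least one of many gamblers goes broke" tends to certainty, but that says nothing about any one gambler's game.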

No, the gambler's game consists of him starting with h dollars, and it
ends when he either goes broke or reaches N. If you want to talk
about playing with no N, then he either eventually goes broke or else
he never goes broke. Both of these are possibilities. The "trials"
you talk about are for an entirely different game that you invented
yourself, where there are n gamblers, each starting from h, and you're
asking for the probability that at least one of them will go broke as
the number of gamblers goes to infinity. (Yes, this is what you are
describing, you just don't realize it.)
No, you're completely confused. Each of your "trials" has to begin
with h dollars, because otherwise you can't say that the probability
of not going bankrupt on that trial is 1-r^h. There are no separate
"trials" in the gambler's game. If you are talking about the bankroll
continuing to increase, then you are not starting a new trial, you are
simply continuing the gambler's game, and the probabilities for that
are already known. There is no "n" in the gambler's game.

cjq70, Nov 27, 2011
11. ### cjq70 (Guest)

I meant to say you wouldn't *necessarily* get to zero. You have a
positive probability of never reaching zero.

cjq70, Nov 27, 2011