I initially reached the same conclusion of 1/2, via a variation of the same demonstration: since this is a particular case of an N-envelope problem, after you have opened N−1 envelopes, do you swap (or not) the highest so far for the last one? By reductio ad absurdum: if the two-envelope probability started anywhere other than 1/2, the probability of winning would go to 1 as the number of envelopes goes to infinity.

BUT: where exactly is Andrew wrong, then? I really can’t get it. It should be something related to the openness of R?!

Maybe the claim that “the probability that we chose a number in the interval between them is some p > 0” is not true. Why? I think the probability of ending up between two random numbers is not defined; that is not the same as 0, or as unknown-but-positive. We would need the entire distribution (the whole infinity of draws from the random distribution) just to have a well-defined P.

Oh, what if we open the envelope first, and then generate a random number via a (perhaps recurrent) f(x) which converges to a random number as x → infinity? That way we avoid an infinity of tries, and can compare the number already in the envelope with the random one after a finite number of steps in x. Well, then Anthony and Andrew are right.

However, it works only for positive numbers; allowing both signs makes all my examples of such an f(x) “diverge”… 😀
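For what it’s worth, the “compare the opened value with an independent random number” idea is easy to test numerically. Below is a quick Monte Carlo sketch (my own code, not from the thread): two fixed amounts, one envelope opened at random, and a Gaussian pivot G with full support on R; we switch exactly when the opened value falls below G. The function name and the choice of a standard normal pivot are mine.

```python
import random

def win_rate(a, b, trials=100_000, seed=0):
    """Estimate the win probability of the random-pivot strategy for two
    fixed amounts a and b: open one envelope at random, draw a Gaussian
    pivot G (full support on R), and switch iff the opened value < G."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        opened, other = (a, b) if rng.random() < 0.5 else (b, a)
        pivot = rng.gauss(0.0, 1.0)
        choice = other if opened < pivot else opened
        wins += choice == max(a, b)
    return wins / trials
```

Whenever the pivot has positive density between the two amounts, the win rate is 1/2 plus half the probability that G lands between them; for the pair (−1, 2) that is about 0.91, while for a pair far out in the tail the edge is still positive but invisible at this sample size.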

I keep 1/2, for now. c.u.

—–

other random thoughts:

If it is about finding an “absolute” point: I wouldn’t make a random choice myself; I would choose 0. Once a positive or negative number is revealed, there is omega − 1 left on that side (a bit less than an infinity). So I would switch when the number is <= 0.

However, this classification is arbitrary. We could have an infinite number of classifications (speculation, probably not). I would only need to evaluate the number of classifications depending on a distance/norm… too complicated…

What about odd/even numbers… oh, we are on R, not enough Zs; then a (mod 2) variant… even worse than positive/negative…

Consider an instance where Alice goes broke before Bob. Now swap heads with tails and switch their flip strings. This event has the same probability and has Bob going bankrupt before Alice: the numbers of heads and tails are simply exchanged relative to the first instance, and in this first case heads and tails are equally likely.

In the second case we swap the heads and tails (reflect the walk over the x-axis) and get a string of the same length which has 100 fewer heads and 100 more tails, hence is (q/p)^100 times as probable, and corresponds to Bob going broke before Alice. In particular, the probability of Alice going broke first is p^100/(p^100+q^100).

A sequence of $2k+100$ tosses ($k$ heads and $k+100$ tails) ending in bankruptcy for Alice has probability $A_k=p^k q^{k+100}$; likewise, a sequence of $2k+100$ tosses ending in bankruptcy for Bob has probability $B_k=p^{k+100} q^{k}$. The ratio is the constant $A_k/B_k=(q/p)^{100}$, exactly the a priori probability that Alice goes broke.
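That a priori ruin probability $(q/p)^{100}$ is the standard ruin formula for an upward-drifting walk, and it can be sanity-checked by simulation. The sketch below is my own: it uses a hypothetical bankroll of 5 instead of 100 so that (q/p)^5 ≈ 0.82 is large enough to estimate quickly, and it truncates each path at 5,000 steps (by then a surviving walk has drifted far above the ruin level, so the truncation bias is negligible).

```python
import random

def ruin_probability(bankroll=5, p=0.51, trials=2000, max_steps=5000, seed=1):
    """Estimate the chance that a +1/-1 walk with up-probability p
    ever falls `bankroll` below its start (truncated at max_steps)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        pos = 0
        for _ in range(max_steps):
            pos += 1 if rng.random() < p else -1
            if pos == -bankroll:
                ruined += 1
                break
    return ruined / trials

theory = (0.49 / 0.51) ** 5  # (q/p)^bankroll, about 0.82
```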

Part 2: By the argument in part 1, the conditional probability $P_B$ that Bob goes broke first (given that Alice also goes broke) is the probability that a biased random walk $S_t$ (counting heads minus tails), going up 1 with probability $q$ and down 1 with probability $p$, will reach $100$ before reaching $-100$. Since $(p/q)^{S_t}$ is a martingale,

$1=(p/q)^{100}P_B+(p/q)^{-100}(1-P_B)$, so $P_B=[1-(p/q)^{-100}]/[(p/q)^{100}-(p/q)^{-100}]$, which is slightly less than 2 percent.
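Plugging in numbers (a sketch, assuming the bias is p = 0.51, q = 0.49, so that the loser drops 0.02 coins per flip on average):

```python
p, q = 0.51, 0.49
r = p / q

# Optional stopping for the martingale (p/q)^{S_t} at the exit of (-100, 100):
#   1 = r**100 * P_B + r**-100 * (1 - P_B)
P_B = (1 - r**-100) / (r**100 - r**-100)
print(round(P_B, 4))  # 0.018, i.e. slightly under 2 percent
```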

Bob is very likely (indeed, almost surely) going to go bankrupt. On average he loses 0.02 coins per flip, so the time he takes to go bankrupt is distributed around 100/0.02 = 5000 flips.
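A direct simulation agrees with that 100/0.02 = 5000 estimate. This is my own sketch (trial count chosen purely for speed): Bob's fortune moves down 1 with probability 0.51 and up 1 with probability 0.49, and we average the hitting time of −100.

```python
import random

def mean_ruin_time(bankroll=100, p_lose=0.51, trials=200, seed=2):
    """Average number of flips until a walk that loses 1 with
    probability p_lose first falls `bankroll` below its start."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos, steps = 0, 0
        while pos > -bankroll:
            pos += -1 if rng.random() < p_lose else 1
            steps += 1
        total += steps
    return total / trials
```

Individual ruin times vary widely around the mean, so a few hundred trials land near 5000 but not exactly on it.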

Alice, on the other hand, is probably not going to go bankrupt at all. So the question is how long it takes until she goes bankrupt CONDITIONED on her going bankrupt. When she goes bankrupt it probably happens pretty early on (she’s much more likely to have 100 dollars of losses at the start than 200 dollars of losses after a typical first 5000 flips). So I’d guess that the expected time to bankruptcy given she goes bankrupt is small.

So that’s the first problem.

For the second problem call the process X = total heads – total tails. We know X is an (on average) increasing process starting at 0 that will at some point pass through both -100 and 100. Now getting up to 100 and then losing 200 (to get back to -100) involves the unlikely event of losing 200. To get to -100 from 0 involves the less unlikely event of losing 100 (and then having the coin’s bias handle the rest).
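The comparison of the two routes can be made exact with the standard gambler's-ruin formulas (my own sketch, again assuming an up-probability of 0.51 for X): the direct route down to −100, versus reaching +100 first and only later falling the 200 steps back to −100.

```python
p, q = 0.51, 0.49
r = q / p  # < 1, so high powers of r are tiny

# P(X reaches -100 before +100), gambler's ruin on the interval (-100, 100):
down_first = r**100 / (1 + r**100)

# P(X reaches +100 first) times P(it ever drops 200 from there):
up_then_down = (1 / (1 + r**100)) * r**200

# The direct route is (p/q)^100, roughly 55 times, more likely:
print(down_first / up_then_down)
```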
