kolm wrote:I'm sorry, but I don't understand this. How is the probability of coin 2 being heads higher? They're two independent coins, so the probability of each coin being heads is 0.5 (hence, the probability of you guessing them both correctly is 0.5 x 0.5 = 0.25), if my maths is right?
Not trying to attack, I just don't understand how you're working this out

It's counter-intuitive, but I assure you it's true (assuming it's presented as I outlined). Having said that, I can offer you a few different ways of looking at it; maybe one of them will resonate with you. The key is that the patter removes heads-heads from consideration.
Way-to-wrap-your-head-around-it-number-one. Bear in mind that we start to consider the "second" coin, which is identical to the "first" coin, only if we know for a fact that at least one of the coins is tails. Flipping 2 coins gives us the following likelihoods of outcomes:
1. Both are heads: 25% (.5)(.5)
2. One is heads, and the other is tails: 50% (.5)(.5) doubled, because EITHER could be heads, and the other could be tails.
3. Both are tails: 25% (.5)(.5)
The patter removes case 1 (both heads) from the possibilities; those flips are treated as instant wins, and no "second" coin is ever discussed. So, we're only discussing the "second coin" in cases 2 and 3, and when we are, we're having the spectator remove a coin that came up "tails." Notice that case 2, where the other coin is heads, is twice as likely as case 3, where the other coin is tails.
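If it helps to see that counting spelled out, here's a quick Python sketch (my own illustration, not part of the routine) that enumerates the four equally likely outcomes and applies the patter's filter:

```python
from itertools import product

# The four equally likely two-coin outcomes: HH, HT, TH, TT.
outcomes = list(product(["H", "T"], repeat=2))

# The patter removes heads-heads: we only continue when at least one coin is tails.
surviving = [o for o in outcomes if "T" in o]

# In each surviving case the spectator sets aside a tails coin;
# the "other" coin is heads in HT and TH, and tails only in TT.
other_is_heads = sum(1 for o in surviving if "H" in o)
print(f"{other_is_heads} of {len(surviving)} surviving cases leave heads")  # 2 of 3
```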
Way-to-wrap-your-head-around-it-number-two. Let's say there was a way to distinguish between the coins...for instance, the dates were different. Let's call them 2006 and 2007. You run through the patter about how they're not both heads, and you get the spectator to show you that the 2006 coin was tails. Now you're working on the 2007 coin. Whether you say the 2007 coin was tails or heads, in a sense, that's 50-50. BUT there's another independent probability to consider. If one coin were heads and the other were tails, then the spectator's actions were forced - you asked him to show you one that was tails, and he showed you the only one he could show you. But if you say the second coin is tails, you're really betting on the 50% chance that it came up tails multiplied by the 50% chance that when you asked the spectator to show you a coin that came up tails, he picked the 2006 coin (because in the tails-tails case, he could just as easily have shown you the 2007 coin).
So betting that the 2007 coin is heads was (.5)(1). The independent probabilities are: .5 that it was heads, times the probability that if it WAS heads, that would be the coin he still hadn't shown you. That's 100%, because when one of the coins is heads, that's the one he's not allowed to show you.
Betting that the 2007 coin is tails is (.5)(.5). The independent probabilities are: .5 that it was tails, times the probability that if it WAS tails, that would be the coin he still hadn't shown you.
Now, that doesn't add up to 100%. That's 50% for heads and 25% for tails. That's because we've altered the equation by eliminating "both are heads." So we're left with (as in explanation 1) the initial odds that both were tails (25%) as compared with one of each (50%). Bayes' Theorem (or something just like it), a useful tool for probability, tells us (essentially) that the actual odds can be calculated in accordance with the relative likelihood of the relevant cases. Since 50% is twice as much as 25%, that means the actual odds are 2/3 heads and 1/3 tails.
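That renormalization step is just a couple of lines of arithmetic. Here's a Python sketch of it (my own illustration, using the 50% and 25% figures from above):

```python
# Likelihoods of the two surviving cases (heads-heads is already eliminated):
p_mixed = 0.50       # one heads, one tails
p_both_tails = 0.25  # tails, tails

# Renormalize over what's left, Bayes-style:
p_heads = p_mixed / (p_mixed + p_both_tails)       # 2/3
p_tails = p_both_tails / (p_mixed + p_both_tails)  # 1/3
print(round(p_heads, 3), round(p_tails, 3))  # 0.667 0.333
```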
Way-to-wrap-your-head-around-it-number-three. Brute force. If you've got 15 or 20 minutes to kill, flip two coins about 100 times. Ignore the times when both are heads. When it's NOT the case that both are heads, set aside a coin that's tails. Keep track of what the other coin is. You'll find fairly quickly that it's heads twice as often as it's tails.
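If you'd rather let a computer do the flipping, here's a short Python simulation (mine, and the 100,000-trial count is arbitrary) of exactly that procedure:

```python
import random

random.seed(1)  # fixed seed just for a reproducible run; any seed works

heads = tails = trials = 0
while trials < 100_000:
    pair = [random.choice("HT"), random.choice("HT")]
    if pair == ["H", "H"]:
        continue  # ignore the times when both are heads
    pair.remove("T")  # set aside a coin that's tails
    trials += 1
    if pair[0] == "H":  # keep track of what the other coin is
        heads += 1
    else:
        tails += 1

print(heads / trials)  # hovers around 2/3
```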
Way-to-wrap-your-head-around-it-number-four. Analogy with larger numbers. Let's say you flip 100 fair coins behind the screen, and I say, "You didn't get more than 50 tails, did you?" You tell me that you didn't. I say, "No, I saw 50 heads very clearly. Hand me 50 of the coins that came up heads." You do so. Now I reach behind the screen and touch a random coin. 50-50? No way. The odds are overwhelming that you've got MANY more tails than heads remaining, even though the odds going in were 50-50 for each individual coin. If you have many more tails than heads remaining, any coin I touch at random is much more likely to be tails. The reason, again, is that I "cheated" by getting half of the coins removed from consideration.
In essence, the probabilities are no longer "independent," because I'm biased in the coins I'm getting handed to me. I'm only looking at heads, so I'm altering the randomness of the coins NOT handed to me. If I'd said, "Hand me the first 50 coins you flipped," and they all happened to be heads, the 51st coin would truly be 50-50. The difference between the two cases is subtle, yet significant.
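The 100-coin version simulates nicely too. Here's a Python sketch (my own illustration; the run count is arbitrary) showing how lopsided the leftover coins are:

```python
import random

random.seed(2)  # fixed seed just for a reproducible run

tails_remaining = heads_remaining = runs = 0
while runs < 5_000:
    flips = [random.choice("HT") for _ in range(100)]
    if flips.count("H") < 50:
        continue  # the patter only proceeds when 50 heads can be handed over
    for _ in range(50):
        flips.remove("H")  # hand me 50 of the coins that came up heads
    runs += 1
    tails_remaining += flips.count("T")
    heads_remaining += flips.count("H")

# Average leftovers per run: tails outnumber heads by a wide margin.
print(tails_remaining / runs, heads_remaining / runs)
```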
A related issue is called the "Monty Hall Trap" (or, as bridge players know a variation of it, the principle of restricted choice). Here's the idea... three boxes. Inside one is a thousand dollars. Inside the other two are rocks. You name a box, 1, 2, or 3. After you do, I'll open one of the OTHER boxes, and the one I open will have a rock in it. Then you'll have the chance to keep your original box, or trade for the third box.
So, for instance, you pick box #1. We don't open it yet. Instead, I say, "Gee, it's good you didn't pick box #2; that one has a rock in it." I open #2, and, indeed (as I knew beforehand) there's a rock in it. Now, is it better, worse, or indifferent for you to keep #1 or switch for #3?
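If you want to test your intuition empirically before reading on, the box game is easy to simulate. Here's a Python sketch (mine; the trial count is arbitrary) that plays it both ways - always keeping and always switching:

```python
import random

random.seed(3)  # fixed seed just for a reproducible run

wins_stay = wins_switch = 0
trials = 100_000
for _ in range(trials):
    prize = random.randrange(3)  # box with the thousand dollars
    pick = random.randrange(3)   # your initial choice
    # I open a box that is neither your pick nor the prize box:
    opened = next(b for b in range(3) if b != pick and b != prize)
    # The box you'd get by trading:
    switched = next(b for b in range(3) if b != pick and b != opened)
    wins_stay += (pick == prize)
    wins_switch += (switched == prize)

print(wins_stay / trials, wins_switch / trials)
```

Run it and compare the two fractions before deciding whether to keep or trade.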
To be continued...