Think of a coin...


Postby LobowolfXXX » Sep 28th, '07, 21:00



Along the lines of my "think of a card" post, I've found this generates surprisingly disproportionate reactions (and is impromptu, assuming you have a couple of identical coins).

You ask the spectator to flip two identical coins behind a screen, and concentrate on the results (both heads, both tails, or one of each).

Now you say, "They're not both heads, are they?" Naturally, if they say yes, you claim credit and have a mini-miracle.

If they say no, you immediately say, "No, I could clearly see at least one tail...hand me that first one, so you can concentrate on the second one."

After they do so, say, "The second one is very difficult, because the other one was so strong...I'm still feeling the tail, but I believe it's from the other coin. Is this one heads?"

Note: Probability theory dictates that the second coin is now twice as likely to be heads as tails (2/3 versus 1/3).




From the spectator's point of view, you have a 75% chance of appearing to get BOTH coins right. You have a 25% chance of getting 1 out of 2 right, but even that plays more powerfully than it sounds, because you appear to have KNOWN that the first one was a tail; the second one, as you admitted, you were mostly guessing on. So you get credit for "hitting" the first one, and the effect on the second one shouldn't be that you "missed" (thereby negating your success on the first coin), but rather that it wasn't clear; so you were essentially 1 for 1, not 1 for 2.


Probabilistically, there's a 25% chance they're both heads - you take instant credit for a double hit. There's a 50% chance there's one of each - you ultimately get credit for a double hit here, and you may even get "extra" credit by the patter that suggests that the second one was harder than usual, because the FIRST coin was interfering with your read. There's a 25% chance that you hit the first one and miss the second, but as described above, that can be played to appear better than 50-50.
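If you'd like to verify those percentages empirically rather than on paper, here's a quick Python sketch (the function name and structure are mine, just for illustration) that tallies how often the routine plays as a double hit versus a partial hit:

```python
import random

def simulate(trials=100_000, seed=1):
    """Tally how the routine plays out over many two-coin flips."""
    random.seed(seed)
    double_hit = partial_hit = 0
    for _ in range(trials):
        a, b = random.choice("HT"), random.choice("HT")
        if (a, b) == ("H", "H"):
            double_hit += 1       # "They're not both heads" lands immediately
        else:
            # Spectator hands over a tail; you call the other coin as heads.
            other = b if a == "T" else a
            if other == "H":
                double_hit += 1   # one of each: both calls hit
            else:
                partial_hit += 1  # tails-tails: first call hits, second misses
    return double_hit / trials, partial_hit / trials
```

Run it and the double-hit rate settles near 0.75 and the partial-hit rate near 0.25.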

For the daring (if you want to trust your people-reading skills to buck probability theory)...assuming they confirm that the coins aren't both heads, say, "No, I could clearly see one coin was tails. Hand me the tail." If they hand you the tail IMMEDIATELY, it's likely that it was the only tail, so proceed as above, or if you want to get more credit (or take more blame), state CONFIDENTLY that the other one is a tail. If they HESITATE before handing you "the" tail, it's probably because BOTH coins were tails. Now, even though the mathematics of the situation dictate that the other coin should be heads, you may choose to go against that, and call the other one as tails, also (not immediately, as that will suggest that the spectator's hesitation gave it away). Instead, seem to not have noticed the hesitation, and ask the spectator to concentrate on the second coin, perhaps clenching it in his/her fist and holding it up to your head.


The reason this effect plays stronger than it should is this - if the coins were different, say a penny and a nickel, it wouldn't be enough to claim that you saw a tail...if you really "saw" it, you should be able to name which coin was the tail. By making the coins identical, when you claim a tail, and ask that it be brought out and set aside, it appears that you saw that THAT "first coin" was tails, while the "second coin" was heads (assuming there's one of each - the most common result - and you go 2 for 2). Rather than simultaneously claiming "one of each," which is 50-50, you've APPARENTLY made the coins distinguishable, so unlike the penny and nickel situation, it seems that you knew WHICH coin was tails (and it certainly seems like you knew which was HEADS, as by having removed the first coin, the spectator is now focusing only on the remaining coin, while you're merely riding the initial odds that the coins would in fact land on opposite sides).


Postby kolm » Sep 28th, '07, 22:55

I'm sorry, but I don't understand this. How is the probability of coin 2 being heads higher? They're two independent coins, so the probability of each coin being heads is 0.5 (hence the probability of you guessing them both correctly is 0.5 x 0.5 = 0.25), if my maths is right?

Not trying to attack, I just don't understand how you're working this out :)

"People who hail from Manchester cannot possibly be upper class and therefore should not use silly pretentious words"

Postby LobowolfXXX » Sep 29th, '07, 08:55

kolm wrote:I'm sorry, but I don't understand this. How is the probability of coin 2 being heads higher? They're two independent coins, so the probability of each coin being heads is 0.5 (hence the probability of you guessing them both correctly is 0.5 x 0.5 = 0.25), if my maths is right?

Not trying to attack, I just don't understand how you're working this out :)



It's counter-intuitive, but I assure you it's true (assuming it's presented as I outlined). Having said that, I can offer you a few different ways of looking at it; maybe one of them will resonate with you. The key is that the patter removes heads-heads from consideration.


Way-to-wrap-your-head-around-it-number-one. Bear in mind that we start to consider the "second" coin, which is identical to the "first" coin, only if we know for a fact that at least one of the coins is tails. Flipping 2 coins gives us the following likelihoods of outcomes:
1. Both are heads: 25% (.5)(.5)
2. One is heads, and the other is tails: 50% (.5)(.5) doubled, because EITHER could be heads, and the other could be tails.
3. Both are tails: 25% (.5)(.5)

The patter removes the heads-heads case from consideration; those flips are treated as instant wins. So, we're only discussing the "second coin" in cases 2 and 3, and when we are, we're having the spectator remove a coin that came up tails. Notice that case 2, where the other coin is heads, is twice as likely as case 3, where the other coin is tails.
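Since only four equally likely ordered flips exist, this can be checked by exact enumeration rather than argument. A small Python fragment (my own illustration, not part of the routine):

```python
from fractions import Fraction
from itertools import product

# All ordered flips except heads-heads, which the patter rules out.
cases = [c for c in product("HT", repeat=2) if c != ("H", "H")]
weight = Fraction(1, 4)  # each ordered flip is equally likely
# After a tail is set aside, what's left is the other coin.
heads = sum(weight for a, b in cases if (b if a == "T" else a) == "H")
total = weight * len(cases)
print(heads / total)  # 2/3
```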




Way-to-wrap-your-head-around-it-number-two. Let's say there was a way to distinguish between the coins...for instance, the dates were different. Let's call them 2006 and 2007. You run through the patter about how they're not both heads, and you get the spectator to show you that the 2006 coin was tails. Now you're working on the 2007 coin. If you say the 2007 coin was tails or if you say it was heads, in a sense, that's 50-50. BUT there's another independent probability calculation to consider. If one coin were heads and the other were tails, then the spectator's actions were forced - you asked him to show you one that was tails, and he showed you the one he had to show you. But if you say the second coin is tails, you're really betting on the 50% chance that it came up tails multiplied by the 50% chance that when you asked the spectator to show you a coin that came up tails, he picked the 2006 coin (because in the tails-tails case, he could have just as easily shown you the 2007 coin).

So betting the 2007 coin is heads was (.5)(1). The independent probabilities are: .5 that it was heads, times the probability that if it WAS heads, that would be the coin he still hadn't shown you. That's 100%, because when one of the coins is heads, that's the one he's not allowed to show you.

Betting the 2007 coin is tails is (.5)(.5). The independent probabilities are: .5 that it was tails, times the probability that if it WAS tails, that would be the coin he still hadn't shown you.

Now, that doesn't add up to 100%. That's 50% for heads and 25% for tails. That's because we've altered the equation by eliminating "both are heads." So we're left with (as in explanation 1) the initial odds that both were tails (25%) as compared with one of each (50%). Bayes' Theorem (or something just like it), a useful tool for probability, tells us (essentially) that the actual odds can be calculated in accordance with the relative likelihood of the relevant cases. Since 50% is twice as much as 25%, that means the actual odds are 2/3 heads and 1/3 tails.
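The normalization step in that last paragraph can be written out directly. A sketch of the arithmetic (the variable names are mine):

```python
from fractions import Fraction

# Unnormalized weights once heads-heads is off the table:
w_heads = Fraction(1, 2) * 1               # (.5)(1): one of each, hidden coin is heads
w_tails = Fraction(1, 2) * Fraction(1, 2)  # (.5)(.5): tails-tails, hidden coin is tails
# Bayes-style renormalization over the surviving cases:
posterior_heads = w_heads / (w_heads + w_tails)
print(posterior_heads)  # 2/3
```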





Way-to-wrap-your-head-around-it-number-three. Brute force. If you've got 15 or 20 minutes to kill, flip two coins about 100 times. Ignore the times when both are heads. When it's NOT the case that both are heads, set aside a coin that's tails. Keep track of what the other coin is. You'll find fairly quickly that it's heads twice as often as it's tails.
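If you'd rather let a computer do the flipping, this short simulation (mine, not part of the original post) performs the same brute-force experiment far more than 100 times:

```python
import random

random.seed(0)
heads = tails = 0
for _ in range(100_000):
    a, b = random.choice("HT"), random.choice("HT")
    if (a, b) == ("H", "H"):
        continue                  # ignore the times both are heads
    other = b if a == "T" else a  # set aside a coin that's tails
    if other == "H":
        heads += 1
    else:
        tails += 1
print(heads / tails)              # hovers right around 2.0
```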





Way-to-wrap-your-head-around-it-number-four. Analogy with larger numbers. Let's say you flip 100 fair coins behind the screen, and I say, "You didn't get more than 50 tails, did you?" You tell me that you didn't. I say, "No, I saw 50 heads very clearly. Hand me 50 of the coins that came up heads." You do so. Now I reach behind the screen and touch a random coin. 50-50? No way. The odds are overwhelming that you've got MANY more tails than heads remaining, even though the odds going in were 50-50 for each individual coin. If you have many more tails than heads remaining, any coin I touch at random is much more likely to be tails. The reason, again, is that I "cheated" by getting half of the coins removed from consideration.

In essence, the probabilities are no longer "independent," because I'm biased in the coins I'm getting handed to me. I'm only looking at heads, so I'm altering the randomness of the coins NOT handed to me. If I'd said, "Hand me the first 50 coins you flipped," and they all happened to be heads, the 51st coin would truly be 50-50. The difference between the two cases is subtle, yet significant.
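To see how lopsided the 100-coin version gets, here's a rough simulation (all names are mine): it conditions on the patter's question, removes 50 coins that came up heads, and probes one leftover coin at random.

```python
import random

def remaining_tails_rate(n=100, trials=20_000, seed=3):
    random.seed(seed)
    tails_hits = kept = 0
    for _ in range(trials):
        coins = [random.choice("HT") for _ in range(n)]
        if coins.count("T") > n // 2:
            continue              # "more than 50 tails" is ruled out by the patter
        rest = coins.copy()
        for _ in range(n // 2):
            rest.remove("H")      # hand over 50 coins that came up heads
        kept += 1
        tails_hits += random.choice(rest) == "T"
    return tails_hits / kept
```

The rate comes out far above 50%, in line with the argument above.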




A related issue is called the "Monty Hall Trap" (or, as bridge players know a variation of it, the principle of restricted choice). Here's the idea... three boxes. Inside one is a thousand dollars. Inside the other two are rocks. You name a box, 1, 2, or 3. After you do, I'll open one of the OTHER boxes, and the one I open will have a rock in it. Now, you'll have the chance to keep your original box, or trade for the third box.

So, for instance, you pick box #1. We don't open it yet. Instead, I say, "Gee, it's good you didn't pick box #2; that one has a rock in it." I open #2, and, indeed (as I knew beforehand) there's a rock in it. Now, is it better, worse, or indifferent for you to keep #1 or switch for #3?
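A quick way to convince yourself about the boxes is to simulate both strategies. A sketch (the function name and box numbering are my own):

```python
import random

def monty(trials=100_000, switch=True, seed=4):
    """Win rate for the three-box game, keeping or trading the first pick."""
    random.seed(seed)
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a box that is neither your pick nor the prize: always a rock.
        opened = random.choice([d for d in range(3) if d not in (pick, prize)])
        if switch:
            # Trade for the one remaining unopened box.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += pick == prize
    return wins / trials
```

Switching wins about 2/3 of the time; staying wins about 1/3.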

To be continued...


Postby Replicant » Sep 29th, '07, 10:35

Kind of reminds me of the Monty Hall Problem. Interesting stuff.


Postby kolm » Sep 29th, '07, 14:23

LobowolfXXX wrote:It's counter-intuitive, but I assure you it's true (assuming it's presented as I outlined). Having said that, I can offer you a few different ways of looking at it; maybe one of them will resonate with you. The key is that the patter removes heads-heads from consideration.


Ah yes, I forgot you were removing the first coin from the equation. Thanks :)


Postby Marvo Marky » Sep 29th, '07, 15:29

Yeah Bicycle808, it reminded me of the Monty Hall problem a lot, as this one seems to hinge on the spectator not realising that TAILS-HEADS is a different outcome from HEADS-TAILS. Naturally this is the difference between permutations and combinations, as any GCSE student will tell you.

I've had a read through LobowolfXXX's explanation and at first glance it does seem to be inherently different from the MH effect, at least on a mathematical level. But I might be wrong. I'll have a closer look after my tea.

Good for you Lobo, it's good to see some creative thinking there. I certainly haven't come across this example before. :D
Mind you I have to say that although it is a mathematical curiosity, there are several ways to produce very similar effects, but with a 100% hit rate. Magically, of course.

Either way I can certainly think of a use for it.

Regards,

Mark.


Postby LobowolfXXX » Sep 29th, '07, 18:13

It's not 100% identical to the Monty Hall problem, but it's similar. The similarity is that in the Monty Hall problem, Monty knows what the actual situation is, and deliberately shows you a worthless door. Similarly, the spectator looks at both coins, and deliberately hands you one that is tails.

If the spectator flipped two coins and you knew nothing about how they came up, and you reach around the screen and pull out one, and it is tails, the second coin would be 50-50.
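That contrast is easy to demonstrate: grab a coin blindly instead of asking for a tail, and the 2/3 edge evaporates. A small sketch (names mine):

```python
import random

def blind_grab_rate(trials=200_000, seed=5):
    """P(other coin is heads | a randomly grabbed coin happened to be tails)."""
    random.seed(seed)
    heads = kept = 0
    for _ in range(trials):
        coins = [random.choice("HT"), random.choice("HT")]
        i = random.randrange(2)  # nobody looked; the grab is unbiased
        if coins[i] != "T":
            continue             # only keep trials where the grab was a tail
        kept += 1
        heads += coins[1 - i] == "H"
    return heads / kept          # close to 0.5, not 2/3
```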

Similarly, if in the Monty Hall problem you picked door #1, and Monty opened door #2 without knowing what was behind it (and it was worthless), then there would be no advantage to trading for door #3.

The similarity pertains to the distinction between biased and unbiased information.


Postby Marvo Marky » Sep 29th, '07, 19:37

Yeah you're right, there is a similarity. And the similarity is that in both effects you are forcing the spectator to pick the intuitive explanation, even though this explanation is wrong.

ah well, pub time.

Bye all.

:D


