The two envelopes problem
Posted: 21 November 2012 12:27 AM   [ # 1636 ]
GdB - 21 November 2012 12:15 AM
StephenLawrence - 21 November 2012 12:05 AM

I can’t use the chance of getting $20 to do the calculation since I have it. If that’s wrong you do need to explain why.

You throw a die. What is the chance that you throw a 3?
You have thrown a die. It was 3. What was the chance of throwing a 3?

Yes, I accept the point, I just don’t accept it makes any sense at all to use this in the calculation.

As you said in your example, if I knew what the two pairs were I should switch, regardless of what the chance of my having $20 was. All that matters is that I have $20 (and know it).

This doesn’t seem to change just because I have less information.

I guess one way to look at this is that you are saying I ought to treat this puzzle as if I don’t know what I have. But you need to say why, and that is genuinely hard to grasp, if it is true at all.

Stephen

[ Edited: 21 November 2012 12:34 AM by StephenLawrence ]
 
 
Posted: 21 November 2012 01:18 AM   [ # 1637 ]
StephenLawrence - 21 November 2012 12:27 AM
GdB - 21 November 2012 12:15 AM
StephenLawrence - 21 November 2012 12:05 AM

I can’t use the chance of getting $20 to do the calculation since I have it. If that’s wrong you do need to explain why.

You throw a die. What is the chance that you throw a 3?
You have thrown a die. It was 3. What was the chance of throwing a 3?

Yes, I accept the point, I just don’t accept it makes any sense at all to use this in the calculation.

No? Compare it with the Monty Hall problem, please.

GdB
 
 
Posted: 21 November 2012 04:28 AM   [ # 1638 ]
GdB - 21 November 2012 01:18 AM
StephenLawrence - 21 November 2012 12:27 AM
GdB - 21 November 2012 12:15 AM
StephenLawrence - 21 November 2012 12:05 AM

I can’t use the chance of getting $20 to do the calculation since I have it. If that’s wrong you do need to explain why.

You throw a die. What is the chance that you throw a 3?
You have thrown a die. It was 3. What was the chance of throwing a 3?

Yes, I accept the point, I just don’t accept it makes any sense at all to use this in the calculation.

No? Compare it with the Monty Hall problem, please.

OK. In the Monty Hall problem there are two boxes: one of them has a 1 in 3 chance of having the prize and the other has a 2 in 3 chance of having the prize.

This matters in that problem because I don’t know which one has the prize.

The best comparison I can make is to think of the $20 like the prize. It doesn’t work because I know I have the $20.

In the scenario you have set up I’m a bit like a latecomer to the Monty Hall problem. If I had more information I would know I should switch, but since I don’t, I have no reason to switch.
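As a rough check of the latecomer comparison, here is a small simulation sketch (Python, assuming the standard Monty Hall setup with three boxes; the helper names are just illustrative). A player who saw the whole history and always switches wins about 2/3 of the time, while a latecomer who only sees the two remaining closed boxes wins about half the time:

    import random

    def monty_hall_trial(rng):
        """One round: returns (switcher_wins, latecomer_wins)."""
        boxes = [0, 1, 2]
        prize = rng.choice(boxes)
        pick = rng.choice(boxes)
        # The host opens a box that is neither the pick nor the prize.
        opened = rng.choice([b for b in boxes if b != pick and b != prize])
        other = next(b for b in boxes if b not in (pick, opened))
        switcher_wins = (other == prize)
        # A latecomer sees two closed boxes and none of the history,
        # so the best they can do is a random guess between them.
        latecomer_wins = (rng.choice([pick, other]) == prize)
        return switcher_wins, latecomer_wins

    rng = random.Random(0)
    n = 100_000
    s = l = 0
    for _ in range(n):
        sw, lw = monty_hall_trial(rng)
        s += sw
        l += lw
    print("switcher wins: ", s / n)   # about 2/3
    print("latecomer wins:", l / n)   # about 1/2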

Stephen

 
 
Posted: 21 November 2012 05:03 AM   [ # 1639 ]
StephenLawrence - 21 November 2012 04:28 AM

The best comparison I can make is to think of the $20 like the prize. It doesn’t work because I know I have the $20.

In the scenario you have set up I’m a bit like a latecomer to the Monty Hall problem. If I had more information I would know I should switch, but since I don’t, I have no reason to switch.

Yep, you are exactly like the latecomer. The important point is that history matters. Without the knowledge of what happened with the boxes before, you cannot do better than 50/50.

When you say that, if you see 20, the other envelope has equal chances of containing 10 or 40, that is just wrong. You would discover that if you consistently switched every time: you would see that you do no better than somebody who does not switch. The reason only becomes clear when you take into account that you had the choice between the same two amounts before. By omitting that you can simply say “I have 20 and the other envelope contains 10 or 40”. You must take this complete process into account; you cannot reduce it to asking what the chance for 40 or for 10 is.

And to do that you must ask yourself: OK, here I have 20, so the pair could have been 10 and 20, or 20 and 40. Then you must, given these pairs, ask yourself what the respective chances are for the different ‘ways through the future’: from 10 to 20, from 20 to 10, from 20 to 40, and from 40 to 20. You know you have 20, but that is the same as the die: you know you have thrown a 3, but your chance of throwing that was 1/6, i.e. the chance for a 6 has not been reduced to 0, or to 1/5, or whatever.

GdB
 
 
Posted: 21 November 2012 05:40 AM   [ # 1640 ]
GdB - 21 November 2012 05:03 AM

Yep, you are exactly like the latecomer. The important point is that history matters. Without the knowledge of what happened with the boxes before, you cannot do better than 50/50.

If I didn’t switch every time I would lose.

When you say that, if you see 20, the other envelope has equal chances of containing 10 or 40, that is just wrong. You would discover that if you consistently switched every time: you would see that you do no better than somebody who does not switch.

No, I do better by switching every time I get $20.

You’ve not solved the puzzle; you just think you have.

The reason only becomes clear when you take into account that you had the choice between the same two amounts before.

The reason is unclear to you and me.

By omitting that you can simply say “I have 20 and the other envelope contains 10 or 40”. You must take this complete process into account; you cannot reduce it to asking what the chance for 40 or for 10 is.

If I knew the chance I would know what to do. For some reason we can’t reduce this to the chance for 40 or 10, but solving the problem means explaining why not. Your solution, in which you do the equations as if you don’t know what you’ve got, doesn’t make sense.

The startling fact is we need to treat the situation as if we have a 1 in 3 chance of having 40 in the other envelope and a 2 in 3 chance of having 10 in the other envelope!
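To spell out the arithmetic behind that claim (the weights are the ones stated above, not derived here): with a 1 in 3 chance of 40 and a 2 in 3 chance of 10, the other envelope is worth (1/3)*40 + (2/3)*10 = 20 on average, exactly what is already held, so switching would be neutral; with even 50/50 weights it would be worth 25 and switching would look like a gain. A two-line check:

    # Expected value of the other envelope when holding 20, under two weightings.
    print((1/3) * 40 + (2/3) * 10)   # 20.0: switching on 20 breaks even
    print((1/2) * 40 + (1/2) * 10)   # 25.0: switching on 20 looks like a gain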

And to do that you must ask yourself: OK, here I have 20, so the pair could have been 10 and 20, or 20 and 40. Then you must, given these pairs, ask yourself what the respective chances are for the different ‘ways through the future’: from 10 to 20, from 20 to 10, from 20 to 40, and from 40 to 20.

No, this most definitely does not make any sense. You are trying to use the fact that I could have had 40, and the loss I would have made, to cancel out the expected gain. It is just wrong.

You know you have 20, but that is the same as the die: you know you have thrown a 3, but your chance of throwing that was 1/6, i.e. the chance for a 6 has not been reduced to 0, or to 1/5, or whatever.

No, what you are saying is like this. You have an envelope which might contain $20 or $50. You are offered $33 in exchange for your envelope and you don’t switch. You open your envelope and see you have $20 and are asked whether you want to switch or not. You say “no” because you take into account the chance of having $50.

What is happening is you are not recognising that this is a genuinely difficult puzzle, unlike the Monty Hall problem.

Stephen

[ Edited: 21 November 2012 06:07 AM by StephenLawrence ]
 
 
Posted: 21 November 2012 06:15 AM   [ # 1641 ]
StephenLawrence - 21 November 2012 05:40 AM

[...]

What is happening is you are not recognising that this is a genuinely difficult puzzle, unlike the Monty Hall problem.

Stephen


I’m not convinced that it’s any different than the Monty Hall problem when it comes to probability.  You might like to take a look at this website regarding Marilyn vos Savant’s take on it and the controversy it engendered.  She devoted a whole fascinating chapter in a book on the subject several years ago. 

http://math.ucsd.edu/~crypto/Monty/montybg.html

There is a fuller explanation here:

http://en.wikipedia.org/wiki/Monty_Hall_problem

(Someone else in this thread might have mentioned vos Savant and these websites, but I looked at this thread only recently and didn’t feel up to going through all 110 posts.)

 
 
Posted: 21 November 2012 06:57 AM   [ # 1642 ]
Lois - 21 November 2012 06:15 AM

(Someone else in this thread might have mentioned vos Savant and these websites, but I looked at this thread only recently and didn’t feel up to going through all 110 posts.)

Pages. More than 1600 posts.

GdB
 
 
Posted: 21 November 2012 07:17 AM   [ # 1643 ]

Listen Stephen, you must take into account that you do not know the amounts. We do not repeat the same (10,20) and (20,40) game over and over again. Otherwise you would know that you must switch when you have 10 and must not switch when you have 40, and then you would know that switching would give you the better chance. But that is not the case. If you got 40, you would argue the same way, supposing that the pair could be (20,40) or (40,80). But given that I used the pairs (10,20) and (20,40), you would lose by switching away from the 40. Here your reasoning goes astray: you act as if you know something, but you don’t.

So assume you would always switch with the pairs (10,20) and (20,40), then:

Pair (10,20): you have 10, you switch, you gain 10 (you would suppose the possible pairs are (5,10) and (10,20) so you switch, don’t you?)
Pair (10,20): you have 20, you switch, you lose 10 (you would suppose the possible pairs are (10,20) and (20,40) so you switch, don’t you?)
Pair (20,40): you have 20, you switch, you gain 20 (you would suppose the possible pairs are (10,20) and (20,40) so you switch, don’t you?)
Pair (20,40): you have 40, you switch, you lose 20 (you would suppose the possible pairs are (20,40) and (40,80) so you switch, don’t you?)

Average gain: 0. Switching is no use.
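A small simulation sketch of this enumeration (Python), assuming, as the equal weighting of the four cases implies, that the two pairs (10,20) and (20,40) are used equally often and that either envelope of a pair is handed over with equal chance; always switching and never switching then come out the same on average:

    import random

    rng = random.Random(1)
    pairs = [(10, 20), (20, 40)]   # the two pairs used in the enumeration above

    def average_payout(always_switch, n=100_000):
        total = 0
        for _ in range(n):
            low, high = rng.choice(pairs)   # each pair equally often
            mine, other = (low, high) if rng.random() < 0.5 else (high, low)
            total += other if always_switch else mine
        return total / n

    print("always switch:", average_payout(True))    # about 22.5
    print("never switch: ", average_payout(False))   # about 22.5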

GdB
 
 
Posted: 21 November 2012 08:19 AM   [ # 1644 ]
GdB - 21 November 2012 07:17 AM

Listen Stephen, you must take into account that you do not know the amounts.

I’m listening, GdB, and trying to provide constructive criticism to see if we can get to the real answer.

My point is that your equation, in which we ignore the fact that we have $20 and calculate our expected gain or loss if we have 10, if we have $20, and if we have 40, is not taking account of that.

It’s treating the situation as if we know the amounts but don’t know which we have.

So that calculation isn’t right.

Stephen

 
 
Posted: 21 November 2012 08:21 AM   [ # 1645 ]

. mistake

[ Edited: 21 November 2012 08:33 AM by StephenLawrence ]
 
 
Posted: 21 November 2012 08:39 AM   [ # 1646 ]

Stephen, just look at the ‘calculations’ above, and say what is wrong.

My calculation is just a variation of the one above, but with the help of the concepts of ‘expected value’ and chances.

If you do not see that my derivation above is correct, I give up.

GdB
 
 
Posted: 21 November 2012 08:56 AM   [ # 1647 ]
GdB - 21 November 2012 08:39 AM

Stephen, just look at the ‘calculations’ above, and say what is wrong.

My calculation is just a variation of the one above, but with the help of the concepts of ‘expected value’ and chances.

If you do not see that my derivation above is correct, I give up.

GdB, of course you are not correct.

Of course there is nothing wrong with your calculation if we don’t know that we have $20.

But using your calculation with the knowledge that we have $20, we should switch.

So it’s simply wrong. If you’d listen you might come up with something better.
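A sketch of the calculation being pointed at here (Python), using the same two pairs and the same equal weighting as in the enumeration above, but looking only at the rounds in which the opened envelope shows 20; whether that equal weighting is the right thing to condition on is the point in dispute:

    import random

    rng = random.Random(2)
    pairs = [(10, 20), (20, 40)]

    kept = switched = count = 0
    for _ in range(200_000):
        low, high = rng.choice(pairs)   # each pair equally often
        mine, other = (low, high) if rng.random() < 0.5 else (high, low)
        if mine == 20:                  # condition on actually holding 20
            count += 1
            kept += mine
            switched += other
    print("average if kept:    ", kept / count)       # exactly 20
    print("average if switched:", switched / count)   # about 25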

Stephen

 
 
Posted: 21 November 2012 09:21 AM   [ # 1648 ]

Then tell me whether you should switch if you have 40 and you see 40. Would you switch?

GdB
 
 
Posted: 21 November 2012 09:27 AM   [ # 1649 ]

GdB,

Ignoring the fact that I have $20 in my envelope and know it is just like the counter-example I gave. I’ll give it (in a very similar form) again.

I have a closed envelope and am told it might have $50 or it might have $20 in it.

I’m told I can exchange it for $35. If I have $20 I gain $15; if I have $50 I lose $15; so my expected gain is $0.

Now I open the envelope and find $20. I’m again offered the chance to switch.

Your calculation would come up again with the answer that the expected gain is 0.

You’ll protest this is different, but it’s not relevantly different. Yes, switching wouldn’t work as a general strategy, but it does work in this case.

The trouble is you just can’t use your calculation to come to the conclusion “don’t switch” in this case.

The situation is that in both cases opening the envelope does make a difference, but in this case the player knows what the difference is, and in your example he doesn’t.
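A sketch of this counter-example with the stated numbers (Python), assuming the $20 and $50 cases are equally likely: before opening, exchanging for the fixed $35 is neutral in expectation; after opening and seeing $20, taking the $35 is a known gain of $15:

    # The envelope holds $20 or $50 (assumed equally likely); a fixed $35 is offered.
    amounts = (20, 50)
    offer = 35

    # Before opening: expected gain from taking the offer.
    print(sum(offer - a for a in amounts) / len(amounts))   # 0.0, the exchange is neutral

    # After opening and seeing $20: the gain is known, not an expectation.
    print(offer - 20)                                        # 15, switching is clearly better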

Stephen

[ Edited: 21 November 2012 09:35 AM by StephenLawrence ]
 
 
Posted: 21 November 2012 09:34 AM   [ # 1650 ]
GdB - 21 November 2012 09:21 AM

Then tell me whether you should switch if you have 40 and you see 40. Would you switch?

As you know, if I switch when I see $40 I will lose every time in your set-up, so according to your calculations I should not switch, though I don’t know that.

It is nonsense to use your calculations to show it makes no difference whether I switch or not, which is what you are trying to do.

Stephen

 
 
   