In your example I might have had $20 and $40 or $10 and $20 with 50/50 probability.

Given I had $20 the probability the other envelope contained $40 was 1 in 2 and the probability of it containing $10 was 1 in 2.

If I’d known that, I should have switched. But I didn’t.

Not knowing the probability distribution is what makes the difference.

No, that is not my example. It is not given that you have $20. You happened to get $20 from an unknown TEP pair. You are changing the scenario because you are blind to the difference between the two.


I’m not changing the scenario. I’m sticking strictly to the facts.

I objectively had $20.

In the objective situation in which I had $20 there could be $10 or $40 in the other envelope with a probability of 50/50.

The only relevant difference, sticking to the objective situation in which I had $20, is I don’t know the probability distribution is 50/50.


No, you are not. You leave out facts. You do not distinguish between ‘here you have one amount, and the other envelope contains half or twice of it’ and ‘here you have two envelopes, one containing twice the amount of the other; you pick one, and after you have looked you may switch’. You see, if you pick an envelope, then what the other envelope contains is fixed, even if you don’t know what amount it is. In the first situation it is not fixed: it is decided by chance, after you know what you have. That is not the same. You see how they differ if you play them through, as I did here. How otherwise do you explain the different outcomes of the two scenarios, if history plays no role?
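The "play them through" suggestion above can be sketched as a quick simulation. This is only an illustration: the dollar amounts, the 50/50 coin flip, and the random seed are assumptions chosen to match the $20 example in the thread, not part of the original posts.

```python
import random

# Sketch of the two scenarios being distinguished above.
# Scenario 1: you hold a fixed $20; the other envelope is then filled
#   with half ($10) or double ($40) by a coin flip, AFTER your amount
#   is known. Here switching pays on average.
# Scenario 2 (the TEP): a pair (x, 2x) is fixed FIRST and you pick an
#   envelope blindly. Here switching gains nothing on average.

def scenario1(trials: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        other = rng.choice((10, 40))   # half or double of my fixed $20
        total += other - 20            # gain from switching
    return total / trials              # comes out near +5

def scenario2(trials: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pair = (20, 40)                # the pair is fixed in advance
        mine = rng.choice(pair)        # I pick one envelope blindly
        other = pair[0] if mine == pair[1] else pair[1]
        total += other - mine          # gain from switching
    return total / trials              # comes out near 0
```

The different averages are the "different outcome of the two scenarios": history (whether the pair was fixed before or after you saw your amount) is exactly what changes the answer.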

StephenLawrence - 24 November 2012 09:35 AM

I objectively had $20.

Yes. And in the TEP the other amount is fixed by your choice (it is the other of the two amounts), while in the other situation, after you know you’ve got $20, the amount in the other envelope is determined by chance to be half or twice that. That is not the same, Stephen.

StephenLawrence - 24 November 2012 09:35 AM

The only relevant difference, sticking to the objective situation in which I had $20, is I don’t know the probability distribution is 50/50.

The solution follows on from there.

Yeah? Show me how it follows that switching or not makes no difference. Give me your solution.


In each specific (objective) situation, switching or not does make a difference. The trouble is we don’t have the probability distribution, so don’t know what the difference is.

So we don’t gain any information from opening the envelope.

Because we don’t gain any information from opening the envelope, it is right to treat the situation as if we don’t know what we got: we can’t narrow the situation down beyond the broader circumstances, in which we might have any of the other amounts from any of the other pairs of envelopes.

Once we do that, we see why we are interested in switching as a general strategy, and we then see that switching makes no difference. In our specific example we’d be better off switching if we have $20 in our envelope, and if we have $10; we’d be worse off switching if we have $40; and these all cancel out.

So that’s the solution, and it’s complete, because it explains why we can’t view the situation as if we know what amount we have, even if we open the envelope and do know!
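The cancellation claim above can be played out numerically. A minimal sketch (Python); the uniform 50/50 prior over the two pairs ($10, $20) and ($20, $40) is an assumption made to match the example in the thread:

```python
import random
from collections import defaultdict

# Sketch of the cancellation claim: two possible pairs, ($10, $20) and
# ($20, $40), chosen with equal probability (an assumed prior for
# illustration). We record the gain from switching, grouped by the
# amount found in our own envelope.

def switching_gains(trials: int = 200_000, seed: int = 1):
    rng = random.Random(seed)
    gain_by_amount = defaultdict(list)
    for _ in range(trials):
        pair = rng.choice([(10, 20), (20, 40)])
        mine = rng.choice(pair)
        other = pair[0] if mine == pair[1] else pair[1]
        gain_by_amount[mine].append(other - mine)
    per_amount = {amt: sum(g) / len(g) for amt, g in gain_by_amount.items()}
    overall = sum(sum(g) for g in gain_by_amount.values()) / trials
    return per_amount, overall
```

Under this prior the average gain is about +$10 when holding $10, about +$5 when holding $20 (it can go either way), and about −$20 when holding $40; weighted by how often each occurs, the overall gain from always switching comes out near zero, which is the "these all cancel out" point.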

The switching argument: Now suppose you reason as follows:

1. I denote by A the amount in my selected envelope.
2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.
3. The other envelope may contain either 2A or A/2.
4. If A is the smaller amount the other envelope contains 2A.
5. If A is the larger amount the other envelope contains A/2.
6. Thus the other envelope contains 2A with probability 1/2 and A/2 with probability 1/2.
7. So the expected value of the money in the other envelope is (1/2)(2A) + (1/2)(A/2) = 5A/4.
8. This is greater than A, so I gain on average by swapping.
9. After the switch, I can denote that content by B and reason in exactly the same manner as above.
10. I will conclude that the most rational thing to do is to swap back again.
11. To be rational, I will thus end up swapping envelopes indefinitely.
12. As it seems more rational to open just any envelope than to swap indefinitely, we have a contradiction.

The puzzle: The puzzle is to find the flaw in the very compelling line of reasoning above.
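Before looking at the flaw, the claim in step 8 can be checked empirically. A quick Monte Carlo sketch (Python); the set of possible smaller amounts and the seed are illustrative assumptions, not part of the puzzle statement:

```python
import random

# Sketch: simulate the two-envelope game and compare "always keep"
# with "always switch". Step 8 of the argument would predict that
# switching beats keeping by a factor 5/4; the simulation shows the
# two strategies average the same.

def simulate(trials: int = 100_000, seed: int = 1) -> tuple[float, float]:
    rng = random.Random(seed)
    keep_total = 0
    switch_total = 0
    for _ in range(trials):
        x = rng.choice([10, 20, 40, 80])   # smaller amount of the pair
        pair = (x, 2 * x)                  # the two envelopes
        mine = rng.randrange(2)            # which envelope I picked
        keep_total += pair[mine]
        switch_total += pair[1 - mine]     # always taking the other one
    return keep_total / trials, switch_total / trials
```

The two averages come out essentially equal, so whatever is wrong with the argument, its conclusion in step 8 does not survive contact with the actual game.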

2. is false.

Although it is true that you have the smaller or larger amount with a probability of 50/50, it is false that the amount in your envelope is the smaller or larger amount with 50/50 probability.

It’s a subtle distinction and no surprise the puzzle teases the brain. We can apply it when we get to 8.

You don’t gain on average by switching because although you have the smaller amount half of the time, that amount is, on average, half of the amount you have when you have the larger amount.

2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.

2. is false.

Although it is true that you have the smaller or larger amount with a probability of 50/50, it is false that the amount in your envelope is the smaller or larger amount with 50/50 probability.

StephenLawrence - 25 November 2012 12:38 AM

It’s a subtle distinction and no surprise the puzzle teases the brain.

Yes. Very subtle.

Explain the distinction with one or more examples.


OK, any example will do but I’ve picked the following.

We can simplify the TEP so that the rules are it is played with two amounts $10 and $20.

Now if I play many times I will have the smaller amount half of the time so the probability of having the smaller amount is 50/50.

But let’s say the amount in my envelope is $20. It isn’t true that the probability of $20 being the smaller amount is 50/50. The probability of $20 being the smaller amount is 0.

So as I say 2. is false because of this subtle distinction.
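The distinction in the example can be checked by brute force. A small sketch (Python; the fixed ($10, $20) pair follows the simplified rules above, the seed is an assumption):

```python
import random

# Sketch of the distinction: in the simplified game with the fixed
# pair ($10, $20), holding the smaller amount has probability 1/2,
# but GIVEN that my envelope contains $20, the probability that $20
# is the smaller amount is 0.

def play(trials: int = 100_000, seed: int = 1) -> tuple[float, float]:
    rng = random.Random(seed)
    pair = (10, 20)
    smaller = min(pair)
    held_smaller = 0
    smaller_given_20 = 0
    count_20 = 0
    for _ in range(trials):
        mine = rng.choice(pair)
        held_smaller += (mine == smaller)
        if mine == 20:
            count_20 += 1
            smaller_given_20 += (20 == smaller)   # never happens
    return held_smaller / trials, smaller_given_20 / max(count_20, 1)
```

The first frequency comes out near 0.5, the second is exactly 0: "I hold the smaller amount with probability 1/2" and "the amount I hold is the smaller one with probability 1/2" are different statements.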


That is correct. It is what I have been saying here all along, in other words. Now you also take $10 and $20 as the point of departure, and not the fact that you have $20 and that the other envelope could then still contain $10 or $40.

StephenLawrence - 25 November 2012 01:59 AM

So as I say 2. is false because of this subtle distinction.

Edit: Oh and if 2. were true you should switch!

But that’s still wrong, and your example shows it:

2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.

You can pick $10 or $20, and the chance is 50% for each. But given these amounts, there is no chance that the other envelope contains $5 (which would require $10 to be the larger amount) or $40 (which would require $20 to be the smaller amount). These are plainly ruled out. kkwan however introduces them as if they were possible.


It’s correct and I think this is the best way to explain it. We’ll see when Kkwan comes back.

We can focus on 2. alone. If 2. were true, the rest would follow and we should switch; but 2. is false.

I’ll explain.

1. I denote by A the amount in my selected envelope.
2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.

In order for 2. to be true I need to be able to replace A with any possible amount.

So firstly 2. isn’t true because I can’t.

And secondly if it were true I should switch because I could apply the formula to any possible amount I have in my envelope.


1. I denote by A the amount in my selected envelope.
2. The probability that A is the smaller amount is 1/2, and that it is the larger amount is also 1/2.

In order for 2. to be true I need to be able to replace A with any possible amount.

No idea what you are saying here.

Then I’ll put it this way.

1. I denote by A the amount in my selected envelope.

Must mean: I denote by A the amount in my selected envelope, whichever of the possible amounts my envelope contains.

So if 2. is true by this definition of A I should switch.

But 2. is false and what is actually true is just I have the smaller amount with 50/50 probability.

I think the mind trick is right there and so I think this is the correct solution.

1. I denote by A the amount in my selected envelope.

Must mean: I denote by A the amount in my selected envelope, whichever of the possible amounts my envelope contains.

But if you state it like this the error is already in 1.: A does not denote a fixed amount, but two possible amounts.

One should rewrite 1 and 2 then as:
1. Denote the two amounts as X and 2X.
2. The chance that you have X is 1/2, and the chance that you have 2X is 1/2

Now we can continue:
3. The other envelope may contain twice or half of the amount in the envelope you have. (This is true, but we do not need it; 4 and 5 follow from 1 alone)
4. If you have X, then the other envelope contains 2X.
5. If you have 2X, then the other envelope contains X.
6. Thus the other envelope contains 2X with probability 1/2 and X with probability 1/2.
7. So the expected value of the other envelope is (3/2)X
8. You do not know what you have: X or 2X. So the expected value of your envelope is also (3/2)X
These amounts are the same, so there is no advantage in switching, nor in not switching.
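The rewritten argument can be worked through with exact arithmetic. A minimal sketch (Python, using `fractions` for exactness); the concrete values X = 10 and A = 20 are illustrative assumptions:

```python
from fractions import Fraction

# Sketch of the rewritten argument: with the pair written as (X, 2X),
# each envelope has expected value (3/2)X, so switching gains nothing.
X = Fraction(10)        # illustrative choice for the smaller amount
half = Fraction(1, 2)

# Steps 6-7: the other envelope holds 2X or X, each with probability 1/2.
ev_other = half * (2 * X) + half * X     # = (3/2) X
# Step 8: my own envelope holds X or 2X, each with probability 1/2.
ev_mine = half * X + half * (2 * X)      # = (3/2) X

# Contrast with the faulty step 7 of the original argument, which
# treats the held amount A as fixed while letting the pair vary:
A = Fraction(20)
ev_faulty = half * (2 * A) + half * (A / 2)   # = (5/4) A
```

The first two expectations are equal, matching the conclusion that there is no advantage in switching; the third shows where the spurious 5/4 factor in the original argument comes from.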

1. I denote by A the amount in my selected envelope.

Must mean: I denote by A the amount in my selected envelope, whichever of the possible amounts my envelope contains.

But if you state it like this the error is already in 1.: A does not denote a fixed amount, but two possible amounts.

A does denote the amount in the envelope, but it must be true that this holds for any amount the envelope could contain.

In the TEP there are infinitely many possible amounts that could be in my envelope, and it must be true that whichever one of them I had, that amount would be A.

I don’t see any error in 1.

One should rewrite 1 and 2 then as:
1. Denote the two amounts as X and 2X.
2. The chance that you have X is 1/2, and the chance that you have 2X is 1/2

Now we can continue:
3. The other envelope may contain twice or half of the amount in the envelope you have. (This is true, but we do not need it; 4 and 5 follow from 1 alone)
4. If you have X, then the other envelope contains 2X.
5. If you have 2X, then the other envelope contains X.
6. Thus the other envelope contains 2X with probability 1/2 and X with probability 1/2.
7. So the expected value of the other envelope is (3/2)X
8. You do not know what you have: X or 2X. So the expected value of your envelope is also (3/2)X
These amounts are the same, so there is no advantage in switching, nor in not switching.

Right?

Yes, this works.

Still, I stick with my view that the mind trick is this: we do indeed have the smaller amount with 50/50 probability (call this *), but 2. isn’t true, and if it were, we should switch. 2. easily gets confused with *.

It’s been fun coming to that conclusion (right or wrong) with your input and I think I understand probability better as a result.

Now I’m just waiting to test my answer on Kkwan. If I convince him, that will be conclusive proof that I’m right.