I am not sure I understand your argument about two-boxing "in the very moment," but by my reading it looks like you are begging the question. I read your argument as:
1. In the moment, you should use causal decision theory.
2. According to causal decision theory, you should two-box.
3. Therefore, by (1) and (2), you should two-box.
4. Causal decision theory says you should two-box.
5. Therefore, by (3) and (4), causal decision theory gives the correct answer.
Or, alternatively: your construction about getting teleported to a separate plane of reality seems like an attempt to set up a scenario where you cannot causally influence the prediction machine, and then to assume that causal decision theory gives the correct answer in that scenario. So I think your real argument isn't that causal decision theory gives the correct answer in the moment; it's that you should two-box in the moment. (That is, the real argument is all in step 1, not in steps 1-5.)
And your argument for two-boxing seems hand-wavy to me; I don't think you've adequately shown that it's true. And AFAICT it's wrong: I still think you should one-box regardless of whether you're a floating consciousness in a separate plane of reality. As long as the machine can reliably predict what you'll choose, you should one-box.
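To put a number on that last claim, here's a minimal expected-value sketch, assuming the standard Newcomb payoffs (which aren't stated in this thread, so I'm filling them in): $1,000,000 in the opaque box iff the machine predicted one-boxing, $1,000 always in the transparent box, and a machine that predicts correctly with probability p.

```python
# Assumed payoffs (standard Newcomb numbers, not taken from the thread):
# opaque box holds $1,000,000 iff the machine predicted one-boxing;
# transparent box always holds $1,000; the machine is right with probability p.

def ev_one_box(p: float) -> float:
    # One-boxers take only the opaque box; it's full iff the
    # machine's (one-box) prediction was correct.
    return p * 1_000_000

def ev_two_box(p: float) -> float:
    # Two-boxers always get the $1,000; the opaque box is full only
    # when the machine wrongly predicted one-boxing.
    return 1_000 + (1 - p) * 1_000_000

for p in (0.5, 0.5005, 0.6, 0.9, 0.99):
    print(f"p={p}: one-box ${ev_one_box(p):,.0f} vs two-box ${ev_two_box(p):,.0f}")
```

On those assumptions, one-boxing comes out ahead for any p above roughly 0.5005, i.e. for any machine even slightly better than chance, and nothing about being in a separate plane of reality changes that calculation.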