Tuesday, August 25, 2015

Discontinuity of self

I like to observe the LessWrong community from the sidelines, because sometimes it arrives at such strange consensus beliefs.  Most LessWrongers do not actually believe in Roko's Basilisk, but it makes for a rather amusing introduction to some of the ideas they do hold.

Roko's Basilisk is the idea that a benevolent AI could take over the world in the future, and then torture a clone of you unless you donate money toward building the AI now.  The idea is absurd on its face, but it becomes even more absurd when you learn that it sort of makes sense, given a bunch of beliefs that many LessWrongers hold:
  1. An AI takeover in the future is highly likely, and it will resemble LW predictions (e.g. it will follow their particular brand of utilitarianism and will have the ability to clone people).
  2. If someone clones your state of mind, then you are the clone.
  3. It is rational to provide incentives for actions that have already occurred.  This is all part of Timeless Decision Theory, a utilitarian philosophy based on gazing deeply at Newcomb's Paradox and trying to rigorously justify the one-boxer position (a quick expected-value sketch of that position follows this list).
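For anyone who hasn't run into Newcomb's Paradox: a predictor fills an opaque box with $1,000,000 only if it predicts you will take that box alone, while a transparent box always holds $1,000.  Below is a minimal sketch of the expected-value arithmetic that makes one-boxing look attractive; the 99% predictor accuracy is an illustrative assumption, and the dollar amounts are the standard textbook ones rather than anything specific to this post.

```python
# Expected-value sketch of Newcomb's Paradox (illustrative numbers only).
# Setup: a transparent box always holds $1,000; an opaque box holds
# $1,000,000 only if a predictor foresaw that you would take just the
# opaque box.  The 0.99 accuracy below is a hypothetical figure chosen
# for illustration.

ACCURACY = 0.99                # hypothetical predictor accuracy
SMALL, BIG = 1_000, 1_000_000  # payoffs in dollars

# One-boxing: with probability ACCURACY the predictor saw it coming and
# filled the opaque box; otherwise the opaque box is empty.
ev_one_box = ACCURACY * BIG + (1 - ACCURACY) * 0

# Two-boxing: you always get the small box, and the big box is filled
# only in the (1 - ACCURACY) cases where the predictor guessed wrong.
ev_two_box = SMALL + (1 - ACCURACY) * BIG

print(f"one-box expected value: ${ev_one_box:,.0f}")  # ~$990,000
print(f"two-box expected value: ${ev_two_box:,.0f}")  # ~$11,000
```

With numbers like these, the one-boxer comes out far ahead on expected value, which is the intuition Timeless Decision Theory tries to turn into a rigorous principle.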
Note that this is unlike Pascal's Wager, in that the only people who get tortured are the true believers.  If you don't believe in Roko's Basilisk or have never heard of it, the threat of torture can't influence your behavior, so carrying it out would accomplish nothing.

There are good counterarguments to Roko's Basilisk, even within LessWrong assumptions, but for me it's all moot since I find the AI predictions to be implausible.

----------------------------------------------------

I also disagree with the idea that I am my clone, for idiosyncratic reasons.

I believe the me of right now and the me of a minute from now are different people.  We occupy different space-time locations and have different brain configurations; why would we be the same person?  Yes, in the everyday sense we are the same person, falling along the same continuous line, but we are not strictly the same; we are not identical.

Since I am unquestionably different from the person I was a minute ago, I must also be different from the person I will be a minute from now, so the question is why I should particularly care about that other person.  He's not so special, you see.  Maybe I shouldn't particularly care about him; maybe I should care about everyone equally.  But the fact of the matter is, curse this material body, I care a lot about future me even though he is not me.  I would act against the interests of everyone else to favor this one random guy, I really would.

If someone clones my exact state of mind, that clone is not me.  Like my future self, the clone would be a lot like me, but still not identical.  But unlike my future self, I don't particularly care about my clone.  Why should I?  I may care a lot about my future self, but that favoritism is a necessary evil.  I see no reason to extend that evil any further to my clone.
