February 23, 2016
From: Reddit AMA
What are your thoughts about the singularity?
I think different people mean different things by it. (Long ago I met I. J. Good, who I believe invented the term “intelligence explosion”… but we mostly talked about biological, not technological, evolution…) Gosh, there’s a lot to say about this. My Principle of Computational Equivalence implies that “intelligence” exists in lots of things, with a great deal of equivalence, which makes the idea of “superintelligence” less plausible. I think I shouldn’t start writing about this here, or I won’t get any other questions answered. I would say that the most practical “singularity” for our species will come when we achieve effective human immortality. Maybe I’ll come back to this question later in this AMA if there’s still time.