Lincoln Cannon


4 Reasons I Don't Identify as a Singularitarian

9 October 2016


When others learn that I identify as a Transhumanist, or when they see me reference Moore’s Law or Kurzweil’s Law with enthusiasm, they often assume that I identify as a Singularitarian. That is, they assume I’m someone who advocates the idea of the Technological Singularity. It’s an idea that people interpret and express in different ways. Basically, though, we could say that the Singularity would be a time when technological change is so rapid that, given our present intelligence, humans would be unable to predict or control the change. Despite those assumptions, I do not identify as a Singularitarian. Here are four reasons why.

First, some associate Singularitarianism with a sense of unmitigated goodness toward the Singularity, which I consider neither inherently good nor inherently evil. I share in the optimistic dreams of those who imagine a world beyond present notions of poverty, suffering, and death. However, there is always major risk (not just major opportunity) associated with major change. The possibility space of the Singularity includes both wonder and horror. And arguably, actually experiencing the Singularity may be a failure mode, consequent to proving ourselves incapable of remaining in the driver’s seat, so to speak.

Second, some associate Singularitarianism with a sense of inevitability or fatalism toward the Singularity that I reject on practical grounds. Assuming the Singularity is possible, it may be that there’s nothing we can do to stop or change it. Maybe the future is entirely deterministic. Maybe the thrust of the macro trends is so powerful that our individual choices are lost like heat dissipating from a machine. Whatever the reason for supposing such fatalism, even if it’s true, I’m confident neither you nor I know it infallibly. Despite any hunch or supposition to the contrary, it may be that our choices can make a difference, maybe even a big difference, in whether and how we experience the Singularity. And to the extent that our beliefs affect our behavior, it may be practically essential to believe we can make a positive difference.

Third, some associate Singularitarianism with an indifference toward humanity’s place in the future, and I reject such anti-humanism as immoral. In my estimation, it’s hard to say whether artificial superintelligence will prove more feasible than enhanced human superintelligence. But it shouldn’t be hard to say that humans matter, whether or not it’s easier to develop some other kind of intelligence. Values and ethics, if they mean anything at all to humans, must reflect human categories of thought, even if they negate those categories to some extent. When they become ironic pretenses to total negation, there are no further grounds from which to make meaningful judgments. That becomes an anti-humanism and a nihilism, against which humanism must guard us – must! It is imperative if anything is. That’s not to say we should avoid developing artificial intelligence. I think we should develop it. But we should not do so at the expense of human intelligence.

And fourth, quite simply, the word “Singularitarian” is cumbersome. There are easier words to say and repeat. That alone wouldn’t stop me if the word had better denotations. But as outlined above, the word simply denotes advocacy for the possibility space of accelerating technological change. And that possibility space is neither good nor evil in itself, and therefore doesn’t merit advocacy in itself. It would be immoral to offer it.

Rather than identifying as a Singularitarian, it seems easier and perhaps more responsible to say I think the Singularity is an important concept. It may be among the greatest risks and opportunities presented to humanity at this time. Many celebrity technologists have called attention to it, warning us to take technological risk more seriously. I agree with them. And accordingly, I don’t advocate the Singularity in itself. Whether the Singularity happens or not, I advocate human transformation into superintelligence, radically compassionate and creative. It would be wonderful if that could happen along timelines often associated with the Singularity. It would be horrible if it never happens because we destroy ourselves along those same timelines. In either case, I trust you and I can and should make a difference.

Thanks for reading! If you've found value here and would like to support my work, you can do that in at least five ways:

  1. Comment Thoughtfully Below
  2. Share on Social Media
  3. Subscribe to My Newsletter
  4. Make a Donation
  5. Check Out My Sponsors!
