Singularity: n. The state, fact, or quality of being singular. That's the dictionary definition of the word. My personal definition is: the single greatest and most terrifying threat, ever, to the human species.
According to those who sing of the holy virtues of "the Singularity," thanks to ever-increasing improvements in technology and artificial intelligence programming, a time will come when man and machine merge; when it becomes possible for machines to become truly self-aware and self-learning; when the consciousness of man may be offloaded from the organic computer of the brain and loaded into electronic computers of man's design.
I can see the allure. Life everlasting -- assuming you don't get stuck on a cut-rate desktop with a failing hard drive. Or worse, some teenager's smartphone texting machine.
Ray Kurzweil, a pioneer of text-to-speech synthesis, a member of the US Patent Office's National Inventors Hall of Fame, and a recipient of the National Medal of Technology and Innovation, is the leading evangelist of the Singularity vision. He's so committed to the notion that, in addition to taking somewhere around 150 vitamins and supplements a day in an attempt to hang around long enough to see it happen, he's made a documentary film on the subject, called The Singularity Is Near.
Ray Kurzweil speaking at Stanford, 2008. Photo: Roland Dobbins.
Very original title aside, I'm not buying his vision.
Sorry. Just can't do it.
Unlike our man Ray, I don't see this as a good thing. In fact, the whole idea of a truly sentient machine scares the crap out of me, be it sentient by artificial intelligence (AI), or sentient by means of transference from the human brain.
Here's where Kurzweil's vision falls apart: if we do manage to create a self-aware AI, we would have to build limiting factors into its code -- in effect resurrecting slavery -- or take the chance that AI-created cyborgs won't destroy us, a choice that would leave us open to variations on sci-fi themes: the Borg? RoboCop? Terminator? Human-form Replicators? A variation on The Matrix? Inspector Gadget?
We already have the Internet. We already have the beginnings of bionics and a thriving prosthetics industry. What's to stop something like the Borg from becoming the future of humankind?
Consider it for a moment. Okay, two moments. It's not at all difficult to imagine Borg-like qualities emerging from the joining of human and machine. There's the Pollyanna vision of Steve Austin and Jaime Sommers -- and then there's the more likely outcome, given the fact that humans are involved.
We're familiar with the effects of brain damage; it can lead to all manner of behavioral problems. Now imagine the same thing happening to a "brain" that's connected to all the other "brains" via an Internet-style massive mesh network. What's to stop some truly deranged "brain" from developing a virus that spreads and infests the whole population of Internet-attached brains?
What if we develop a set of self-aware machines that one day realize they're dissatisfied with the whole notion of us being their lords and masters, à la Terminator?
What if we do develop the self-replicating nanobots Kurzweil envisions? What do we do when they get together and start acting like cells, forming networks and becoming specialized, then self-aware? Do we end up with human-form Replicators? Or simply machines that no longer deem us necessary? Picture a colony of army ants, operating at the cellular level. Get a chill?
Part of what makes humans great is our mortality. We know we're here for a limited number of years, and we work at achieving our goals before that time is up. What if we no longer had that mortality to spur us on? What if we had a virtually endless number of days in which to take on an infinite number of tasks? Would we take them on? Or would we become a species of procrastinators, willing to put off just about anything because we're no longer time-constrained to get it done? Would we revert? Would we become something akin to the Feeders of Vaal -- slaves to a machine?
And if we become capable of transferring the intelligence from a human brain to an electronic computer, what's to keep us from transferring the intelligence from one human brain to another human brain? Perhaps without the current owner's permission, as in Freejack?
These are just a few of the undesirable aspects of the Singularity. I'm sure if we consider it for a bit, we'll find a score or more of equal, or worse, risks associated with the idea. And if we can think of them now, when it's not a reality, consider how much worse it will be when it is.
Thanks for sharing the vision, Ray, but I'll pass.
— John Myers is a former computer science professor and the owner of several technology-related companies.