Wednesday, September 22, 2010

The Technological Singularity and The Posthuman

The Technological Singularity is a theorized event that has been showing up in more and more science fiction, but what exactly is it?  There are two general ideas of what the Technological Singularity will mean, with many variations in between.  One popular possibility is that, as advances in technology constantly accelerate, there will come a point where we are so integrated with technological augmentation that we are no longer human in the traditional sense.  In the future we may be more like cyborgs, with brain implants that make us super-intelligent and life-extension technology that lets us live more or less forever.  The other popular idea is that we will create computers that are more intelligent than we are and can self-replicate, essentially designing each next generation to be more intelligent than the last.  Each new generation of machines will grow in intelligence and will replace humanity once humanity is deemed obsolete.

The first theory is incredibly popular in cyberpunk science fiction.  People may be able to use chemical brain augmentation along with nanobots that repair the damage caused by cognition-enhancing drugs while keeping the body healthy and young.  People could acquire implants for wireless communication with anyone, like having a telepathic cell phone embedded in their brains: you could just think about who you want to talk to and then communicate with them through your thoughts.  We could move beyond our biological limitations with cybernetic limbs, telescopic eye implants, integrated super hearing aids and cognition-enhancing supercomputers hardwired into our brains.  Many of these technologies already exist, but they are expensive and not entirely refined.  Even when they do become more readily available, there is the question of who gets to use them.  Will only the rich become cyborgs?  Will technological augmentation lead to a new kind of class warfare?  Sci-fi works like Masamune Shirow's Ghost in the Shell and William Gibson's Neuromancer both address some of these questions.

We've all seen the ultimate sci-fi outcome of the second theory in movies like The Matrix and The Terminator.  Artificial intelligence is invented and machines start to build more machines; eventually they become so advanced that they try to eliminate humans.  Sci-fi writer Isaac Asimov addressed this problem with his popular Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Of course, what makes sci-fi interesting is when something goes wrong.  If artificial intelligence were to advance to a point beyond its programming and become more like actual human consciousness, these three laws would no longer apply: the robot would have the ability to decide whether or not to follow them.  A truly advanced A.I. would be indistinguishable from human consciousness.
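The strict precedence among the three laws above can be sketched as a simple priority check.  This is purely my own toy illustration (the `Action` class and its fields are invented, not anything from Asimov), and the stories themselves are mostly about how such tidy rules break down:

```python
# Toy sketch of the Three Laws as a strict priority check.
# The Action class and its fields are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False        # would the action injure a human?
    allows_harm: bool = False        # would inaction let a human come to harm?
    ordered_by_human: bool = False   # was the action ordered by a human?
    endangers_self: bool = False     # would the action destroy the robot?

def permitted(action: Action) -> bool:
    # First Law: never harm a human, or allow harm through inaction.
    if action.harms_human or action.allows_harm:
        return False
    # Second Law: obey human orders (already screened by the First Law).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, lowest priority.
    return not action.endangers_self

print(permitted(Action(ordered_by_human=True, endangers_self=True)))  # True: orders outrank self-preservation
print(permitted(Action(harms_human=True, ordered_by_human=True)))     # False: the First Law overrides orders
```

The interesting failure mode, as the paragraph above notes, is an A.I. advanced enough to rewrite or simply ignore the `permitted` check altogether.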

I have my own theory about what the technological singularity will look like, and I assume other people share it.  I think the technological singularity will be the point in time at which technology is changing so quickly that the change is almost, if not entirely, imperceptible.  Innovation, invention, creativity and novelty will increase at such a high rate that every day we will make advances that used to take decades.  Technology will truly be indistinguishable from magic.  We will reach the point where technological acceleration approaches infinity and nearly everything is possible.
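One way to picture "acceleration approaching infinity" is the mathematician's finite-time singularity.  As a purely illustrative toy model (my own sketch, not a claim about real technological trends): if some measure of capability grows at a rate proportional to its own square, it doesn't just grow exponentially forever, it blows up at a finite time:

```python
# Toy finite-time-singularity model, purely illustrative.
# If dx/dt = x**2 with x(0) = x0, the solution is x(t) = x0 / (1 - x0 * t),
# which diverges at the finite time t_s = 1 / x0 instead of growing
# forever like an ordinary exponential.

def capability(t: float, x0: float = 1.0) -> float:
    """Solution of dx/dt = x**2; valid only for t < 1 / x0."""
    return x0 / (1.0 - x0 * t)

for t in [0.0, 0.5, 0.9, 0.99]:
    print(f"t = {t:.2f}  capability = {capability(t):8.1f}")
```

In this picture the "singularity" is just the point where the curve leaves the chart, which is roughly what I mean by change becoming imperceptibly fast.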

The question is whether or not there is a technological wall.  Is there a point where no more technological advancement can be made?  Will there be a point where supercomputers are so fast that they cannot possibly go any faster?  Will innovation stop when there is nothing left to invent?  I think that nearly everything may become possible, but that doesn't mean that everything possible will be created.  A point of infinite potential may be reached that opens up a huge space of possible innovation, but we will still be limited by things like space, time and natural resources.  Many technological innovations will remain conceptual, because we will be unable to implement them all.  Many theories of the Singularity suggest that everything possible will come to be, but unless we can create matter out of nothing, our resources for creation will remain limited.  Maybe we will be able to create matter out of nothing and no limitations will apply.  I personally find it fun to think about possible scenarios, but looking that far into the future is nothing but speculation.  What do you think the Technological Singularity will be like?
