Thursday, December 08, 2005

The Singularity Meme


Probably the best-known advocate and promoter of the concepts surrounding the Singularity is Ray Kurzweil. Over the last few decades, Kurzweil has developed numerous AI-related inventions, founded several technology companies, and authored books containing remarkably accurate predictions of the future. The basic premise behind Kurzweil's portrayal of the Singularity is as follows: our technological growth is accelerating at an exponential rate; in the near future, humans will begin to merge with their technology, enhancing their intelligence and capabilities a trillionfold. Eventually, this intelligence will saturate the entire universe.
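
As a back-of-the-envelope illustration of how quickly sustained doubling compounds (the forty-doubling horizon is a hypothetical parameter chosen for illustration, not a figure from Kurzweil's writing):

```python
# Back-of-the-envelope: how sustained doubling reaches "trillionfold".
# Forty doublings is an illustrative assumption, not Kurzweil's figure.
doublings = 40
factor = 2 ** doublings
print(f"After {doublings} doublings: {factor:,}x")
# After 40 doublings: 1,099,511,627,776x -- roughly a trillionfold
```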

Other advocates of the Singularity, such as Eliezer Yudkowsky of the Singularity Institute, have interpreted the Singularity in a different light. Yudkowsky believes that the Singularity should be thought of only in terms of Vernor Vinge's original definition: the Singularity is the point in time at which we develop smarter-than-human intelligence, and this single development will create an abrupt discontinuity in the fabric of human history. He also believes that the Singularity should be considered entirely independently of our accelerating technological growth, because in his view we already have the technological means to create superhuman intelligence. He sees superhuman intelligence arising as the product of seed artificial intelligence (an AI able to understand and rewrite its own source code) rather than through the advances in nanotechnology described by Kurzweil. Yudkowsky also stresses the importance of human activism in developing seed AI, so as to bring about a safe and accelerated Singularity. He believes that we should focus our efforts on this task alone, as seed AI will have an impact many orders of magnitude greater than any other technological development.

Although the two interpretations are at odds, I believe each has its own appeal and validity. They can be reconciled by the observation that superhuman intelligence will eventually be achieved, whatever path we take to get there. The real issue is the nature of the nonbiological superhuman intelligence we develop, and whether it possesses the ability to recursively self-improve. Recursive self-improvement is the true key behind the explosion of the Singularity: a system that can improve the very faculty it uses to make improvements grows far faster than one being upgraded from outside. Simply augmenting a human mind with portions of nonbiological intelligence will not necessarily allow that mind to recursively self-improve. It may, however, boost human intelligence to a point where building recursively improving seed AI becomes a trivial problem. Once seed AI is instantiated, uploading a human mind to a fully nonbiological substrate will likely become trivial as well. The synergy of these technologies acting together will greatly expedite the coming of the Singularity.
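
To make that distinction concrete, here is a minimal sketch of my own (the equations and constants are illustrative assumptions, not a model taken from Kurzweil or Yudkowsky) contrasting a fixed rate of improvement with a rate that itself scales with current capability:

```python
# Toy model contrasting three growth regimes for an abstract
# "intelligence" quantity I(t). Equations and constants are
# illustrative assumptions, not anyone's published model.
#
#   fixed augmentation:          dI/dt = k        (linear)
#   proportional improvement:    dI/dt = k*I      (exponential)
#   recursive self-improvement:  dI/dt = k*I**2   (hyperbolic -- the
#       exact solution I0/(1 - k*I0*t) diverges at finite time 1/(k*I0))

def simulate(rate_fn, i0=1.0, dt=0.01, t_max=200.0, cap=1e12):
    """Euler-integrate dI/dt = rate_fn(I) until t_max or the cap is hit."""
    i, t = i0, 0.0
    while t < t_max and i < cap:
        i += rate_fn(i) * dt
        t += dt
    return t, i

k = 0.1
for name, fn in [("linear",      lambda i: k),
                 ("exponential", lambda i: k * i),
                 ("hyperbolic",  lambda i: k * i * i)]:
    t, i = simulate(fn)
    print(f"{name:12s} reached I = {i:.3g} at t = {t:.2f}")
```

Only the hyperbolic regime hits the cap in finite time (shortly after t = 10 with these constants); its exact solution actually diverges there, which is the mathematical sense in which recursive self-improvement produces a "singularity" rather than mere steady growth.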
