What is the Singularity?

And will it lead to the extermination of all humans?
The Economist explains
by T.C.
THE cover of this week’s issue of The Economist is about the Vision Fund, a colossal investment fund with $100bn to spend, which is making a splash in the technology industry. The fund is piloted by Masayoshi Son, the head of SoftBank, which has put up about a quarter of the money. Mr Son has had an interesting career.

He founded SoftBank in 1981; it is now reckoned to be the fifth-biggest company in Japan. He also holds the distinction of being the person who has lost the most money in the history of investing, after seeing roughly $70bn go up in smoke during the dotcom bust at the turn of the century. But he is interesting for another reason, too. Plenty of people are fearful about the future. Mr Son has few doubts. Robots will have IQs of 10,000 within the next 30 years, he says, and there will be as many of them on Earth as there are humans. Along with a group of science-fiction authors, futurists and computer programmers, Mr Son is an exponent of the idea of the Singularity.

The term has different definitions depending on whom you ask, and it often overlaps with ideas like transhumanism. But the broad idea is that the rate of technological progress is accelerating exponentially, and will continue to do so, to the point where it escapes all efforts at control. The projected results vary: the extermination of the human species by godlike artificial intelligences is a favourite of the pessimists. Optimists, meanwhile, prefer to conjure up an age of limitless material abundance and infinite leisure, with genetically modified humans bound together by brain implants into a solar-system spanning hivemind, or perhaps uploading their minds into a silicon utopia. And because of the power of exponential growth—where every doubling creates as much progress as all previous doublings combined—this science-fiction future is actually mere decades away.
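The arithmetic behind that "every doubling creates as much progress as all previous doublings combined" claim is easy to check: the sum 1 + 2 + 4 + … + 2^(n-1) equals 2^n − 1, so each new doubling exceeds the entire preceding total by exactly one unit. A toy sketch (not from the article) makes the point:

```python
# Toy check of the doubling claim: in a pure doubling sequence, each new
# step equals the sum of all previous steps plus one, i.e. 2^n = (2^n - 1) + 1.
progress = [1]  # progress made in the first period (arbitrary unit)
for _ in range(9):
    progress.append(progress[-1] * 2)  # each period doubles the previous one

for n in range(1, len(progress)):
    latest = progress[n]
    all_previous = sum(progress[:n])
    assert latest == all_previous + 1  # the new doubling matches everything before it
```

This is why exponential trends look unremarkable for a long time and then appear to explode: the final few doublings dwarf everything that came before.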

If the idea of exponential growth sounds familiar, it should. One of the pillars of the Singularity is Moore’s Law, the observation (which later became a self-fulfilling prophecy) that the number of components engineers could cram onto a chip—and thus, in a loose sense, that chip’s computational power—doubles every couple of years. Run that trend into the future, and you arrive at a world where stupendous amounts of computing power can be had for pennies. The arrival of proper AI (as opposed to the limited pattern-matching software that the term often refers to at the moment) is usually flagged as the tipping point. Imagine a computer intelligent enough to understand its own design, say the Singularitarians. Then imagine that computer improving on that design, making itself cleverer still. Iterate a few times, and the machines will bootstrap themselves to godhood.
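The extrapolation the Singularitarians lean on can be sketched in a few lines. This is an illustrative model only: it assumes a clean doubling every two years from the Intel 4004's 2,300 transistors in 1971, which real chipmaking has only loosely tracked.

```python
# Hypothetical Moore's law extrapolation (illustrative assumptions, not
# figures from the article): component count doubles every two years
# from a 1971 baseline of 2,300 transistors (the Intel 4004).
def components(year, base_year=1971, base_count=2300, doubling_years=2):
    """Transistor count predicted by a pure doubling-every-two-years trend."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{components(2021):.2e}")  # roughly 7.7e10 under this toy trend
```

Run the trend far enough forward and the predicted counts become astronomical, which is the intuition behind "stupendous amounts of computing power for pennies"; the critics' reply, covered below, is that no physical trend doubles forever.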

Not everyone is convinced. Critics point out that one of the properties of exponential growth is that it cannot carry on forever. After a 50-year run, Moore’s Law is stuttering. Singularitarians retort that the laws of physics define a limit to how much computation you can cram into a given amount of matter, and that humans are nowhere near that limit. Even if Moore’s Law slows, that merely postpones the great day rather than preventing it. Others say the Singularity is just religion in new clothes, reheated millenarianism with transistors and Wi-Fi instead of beards and thunderbolts. (One early proponent of Singularitarian and transhumanist ideas was Nikolai Fedorov, a Russian philosopher born in 1829 who was interested in resurrecting the dead through scientific means rather than divine ones.) And those virtual-reality utopias do look an awful lot like heaven. Perhaps the best way to summarise the Singularity comes from the title of a book published in 2012: The Rapture of the Nerds.
May 14th 2018
