I've been asked, "Why was this blog previously called 'The Last Stop'?" Okay, I have not. Probably there aren't that many people around here. Yet. But the readers will come, and with them that question, so...
Depending on who you ask, the (technological) Singularity is a hypothetical point at which technological development progresses so fast that humanity can no longer control or even understand it.
I think a Singularity can happen in several areas. If Artificial Intelligence is your thing, there might be a "take-off point" at which the AI starts improving itself, faster and faster, until its capacities, reaction times, and motives leave humans behind before they know what hit them. It has been suggested that such a superintelligence will not be me vs. Albert Einstein, but a small bug vs. Albert Einstein. Another fitting analogy would be "Something is coming... Can't see exactly what... Wait for it... OMG what was that."
You don't need to wait until an artificial general intelligence arises - financial markets are already headed into Singularity territory. In specific areas of trading, there is an entire ecosystem of algorithms so fast and so obscure that no human can hope to compete. In the time it takes you to say "I'll buy 100 shares at $5", an algorithm has bought those shares at $4.9999 each and sold them to you at $5.00, pocketing $0.01 in arbitrage. When something does go wrong, it requires the entire exchange to be shut down and month-long investigations by specialists.
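The arbitrage arithmetic above can be sketched as a toy calculation. Everything here is purely illustrative - the numbers come from the example, but real order matching on an exchange works nothing like this simple function:

```python
# Toy illustration of the latency-arbitrage example above.
# Prices and fill logic are hypothetical, not how real exchanges match orders.

def front_run_profit(shares: int, your_bid: float, algo_fill: float) -> float:
    """Profit the algorithm makes by buying at algo_fill and reselling to you at your_bid."""
    return round(shares * (your_bid - algo_fill), 4)

# You want 100 shares at $5.00; the algorithm fills at $4.9999 first.
profit = front_run_profit(shares=100, your_bid=5.00, algo_fill=4.9999)
print(profit)  # 100 shares * $0.0001 spread = $0.01
```

A fraction of a cent per share sounds like nothing, but repeated millions of times a day it is exactly the kind of game only a machine can play.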
So how would a Singularity come about? Nobody knows, and depending on who you ask, there would be nobody left to tell you. There are theories though:
A company or nation state develops a super intelligent AI and lets it out of the basement (likely)
A super smart brain is simulated (likely)
The internet awakes (unlikely)
Super smart people are engineered or enhanced (unlikely)
We make contact with aliens and they send us a blueprint to create a malicious super intelligence (might want to think twice before building that machine, Jodie Foster)
A technological singularity may leave people behind, and some people are already being left behind - financially, educationally, in terms of opportunities. Today though, they can still catch up, however hard that may be, and it should be our mission to make sure humanity does not fall behind. That will require all sorts of measures and may be the defining theme of the next decades.
As Yuval Noah Harari puts it in the excellent Homo Deus: "In the early twenty-first century the train of progress is again pulling out of the station – and this will probably be the last train ever to leave the station called Homo sapiens. Those who miss this train will never get a second chance. In order to get a seat on it you need to understand twenty-first-century technology, and in particular the powers of biotechnology and computer algorithms."
This is why this blog used to be called "The Last Stop".
There is MUCH more to say, including:
What scenarios are likely outcomes of Singularities
What I would like to happen
A list of great books to read on the topic
So this is the first entry in a series on this blog, yay! You can get an (albeit depressing) head start on the topic by reading something like Nick Bostrom - Superintelligence: Paths, Dangers, Strategies. The cover is adorned with a cute owl that will fill you with dread once you make it past the introduction.
You can learn more about what is going on in the financial markets in Michael Lewis - Flash Boys: A Wall Street Revolt.
Our next stop on the tour towards the Singularity will bring AI into the mix. Read on!