Sunday, May 12, 2019

The AI Revolution: Our Immortality or Extinction



Here's a fascinating discussion about the risks and possibilities of ASI (Artificial Superintelligence), with insights from Nick Bostrom and Ray Kurzweil.

Article: Tim Urban  
January 27, 2015
https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-2.html

It dates from 2015, almost prehistory in the field of AI, but remains as comprehensive as can be. All potential aspects of AI are discussed in depth, including the singularity as well as out-of-fashion ones such as the "grey goo", although newer ideas such as the "simulation hypothesis" are of course not included.

What I find most interesting is that in the long list of potential answers, the most probable one, "We don't know!", is not given more thought.

ASI is beyond our grasp, in the sense that a 3D sphere is beyond a 2D circle. In this respect, the most telling example is that of extraterrestrial beings. Statistically they should be out there, somewhere, but clearly they are nowhere to be seen. This is a paradox, but only from our perspective.

In reality, the answer is probably quite simple: they are all over the place but are not visible to us. They do not cross the galaxy in interstellar ships, Star Wars style, do not communicate with any type of wave we can intercept, do not "expand" across the Universe, or do whatever else we can imagine. It is quite likely that evolution, past the human and civilization stages, takes a sharp turn we do not understand, towards new goals we cannot fathom.

Likewise for ASI: its goals, thought processes and mere existence may be forever beyond our grasp, like a tunnel with sharper and sharper turns.

As the article explains, we cannot confine an ASI in a box, since a higher intelligence will necessarily find a way out, however cleverly we build the box. We will necessarily be outsmarted. Then why even try?

If it is our destiny to build the next level of evolution, as seems to be the case, why not accept it and do what we must, come what may?

Competition in any case gives us no choice: "We" do it or "they" do it!

This was the case with nuclear weapons during the Second World War. Likewise, the coming conflagration will greatly accelerate whatever progress in AI we would have made anyway, only a little more slowly.

Evolution as we understand it now, in its new variant of punctuated bursts of hyperactivity, is but an unending race towards a goal of higher efficiency: long periods of quiet, balanced equilibrium suddenly broken by unforeseen events that create the conditions for extreme competition, towards the next paradigm where a new equilibrium can be found.

The only difference with AI is that we have no clue whatsoever what such a future will look like, or even whether we still have a place in it. There seems to be an event horizon in front of us beyond which nothing is visible, nor even imaginable.

It would have been nice to pause and give it some thought before rushing in. The laws of nature give us no such choice. ASI is our future; it always was. Let's hope for the best.


