Saturday, June 24, 2023

AI to ASI - an unknowably unknown future: chapter 4

Chapter 4: ASI and Singularity



The singularity is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization. The term was popularized by mathematician and computer scientist Vernor Vinge in his 1993 essay "The Coming Technological Singularity."

 

Ray Kurzweil, a futurist and author, has popularized the concept of the singularity. In his book The Singularity Is Near, Kurzweil argues that the singularity will be driven by the exponential growth of artificial intelligence (AI). He predicts that AI will eventually become so intelligent that it will be able to improve itself at an ever-faster rate, leading to a rapid and unpredictable transformation of the world, a scenario often called an intelligence explosion.
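The intelligence-explosion idea can be made concrete with a toy model. The sketch below is purely illustrative: the function name, the starting capability, and the improvement gain are all assumptions chosen for demonstration, not anything Kurzweil or Vinge specified. The key feature is that each generation's improvement scales with its current capability, so growth accelerates rather than staying merely exponential.

```python
# Toy model of an "intelligence explosion" via recursive self-improvement.
# All parameter values here are illustrative assumptions, not predictions.

def intelligence_over_generations(initial=1.0, gain=0.1, generations=10):
    """Each generation multiplies capability by (1 + gain * capability),
    so a smarter system improves itself faster with every step."""
    levels = [initial]
    capability = initial
    for _ in range(generations):
        # The improvement factor itself depends on current capability:
        # this feedback loop is what makes the growth accelerate.
        capability = capability * (1 + gain * capability)
        levels.append(capability)
    return levels

levels = intelligence_over_generations()

# Successive growth ratios keep increasing: the growth *rate* grows.
ratios = [later / earlier for earlier, later in zip(levels, levels[1:])]
print(ratios[0] < ratios[-1])  # True
```

Under ordinary exponential growth the ratio between successive generations would stay constant; here it climbs, which is the qualitative signature of the feedback loop the paragraph above describes.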

 

Kurzweil has placed the date of the singularity at around 2045. However, other experts have argued that it could happen sooner or later, depending on the rate of technological progress.

 

The singularity is a controversial topic, with some people believing that it is a real possibility and others dismissing it as science fiction. There are also concerns about the potential risks of the singularity, such as the possibility that AI could become so powerful that it poses an existential threat to humanity. A handful of experts do not think the singularity poses any risk at all. However, I think we should approach it with caution: whatever the outcome, it is hard to ignore that we are staring at an unknowably unknown future.

Quite a few experts in this field believe that ASI is too far in the future to worry about now. Yet a relatively smart human armed with Google Search, or with a large language model like ChatGPT or Google Bard, can already display above-average capability and appear intimidatingly smart compared to an unaided human. ASI development might also be more favorable when machines are more tightly integrated with humans. Perhaps human-like intelligence is only possible when paired with a body: the intelligence we are bestowed with has been honed over millions of years of embodied evolution. This is one of the main lines of reasoning behind why some experts feel ASI is simply not possible in the short term.


There is another race brewing in this space: the creation of 'friendly AI' versus the creation of AGI leading to ASI. One set of experts is pursuing friendly AI before AGI kicks in. Such a friendly AI could cut off the risks we see in dystopian science fiction novels and movies, where humans are left at the mercy of advanced machines that run amok in society and control literally everything. However, what exactly counts as friendly AI leaves a lot of room for debate, and it raises fundamental questions, such as whether intelligent machines can develop consciousness as we humans do. Consciousness itself, of course, is another battlefield of ideas for the wider academic community. On the other hand, another school of thought is bent on creating AGI/ASI as soon as possible, at least well before everything is connected to a network. That timing would mitigate any dystopia-like situation, since a dangerous system could be switched off and disconnected from the network before it did massive damage.

 

Despite the controversy, the singularity is a topic that is worth considering, as it has the potential to have a profound impact on the future of humanity.

 

Here are some of the potential benefits of the singularity:

The singularity could lead to the development of new technologies that could solve some of the world's most pressing problems, such as climate change and poverty.

The singularity could lead to a new era of human evolution, as humans and machines become increasingly merged.

The singularity could lead to a new understanding of the universe and our place in it.


Here are some of the potential risks of the singularity:

The singularity could lead to the development of artificial superintelligence (ASI) that could pose an existential threat to humanity.

The singularity could lead to a loss of control over technology, as ASI could become so powerful that it could no longer be controlled by humans.

The singularity could lead to a radical transformation of society, which could be disruptive and chaotic.

