In recent years, the debate about the capabilities of artificial intelligence (AI), and in particular about whether it can be conscious, has gained significant momentum. Notably, even leading scientists occasionally publish premature contributions on the topic. Popular theories such as emergentism, which claims that consciousness arises from a multitude of oscillations, seem insufficient to me; they lead to the thought experiment of whether my mobile phone could possess consciousness, since it also generates many oscillations. Furthermore, I question whether the emergence of Artificial General Intelligence (AGI) requires consciousness at all.
But let's take this step by step: where do we stand today? The rapid increase in computing power needed to train AI models is a clear indicator of how quickly the field is developing. Since 2012 in particular, we have witnessed an exponential increase in this computing power, far outpacing Moore's Law. Specifically, the compute used to train the largest AI models has doubled every 3.4 months, far exceeding the historical trend of a doubling every two years. This development highlights the dramatic increase in the resources and costs required for achievements in AI.
In an analysis, OpenAI found that the amount of compute used to train the largest AI models has grown more than 300,000-fold since 2012. By comparison, a doubling every two years, as predicted by Moore's Law, would have yielded only about a 7-fold increase over the same period. The history of AI compute can be divided into three eras: the pre-deep-learning era (1950–2010), in which compute doubled every 18–24 months; the deep learning era (2010–2016), in which the doubling time shortened to 5–7 months; and the current era of large-scale models (2016–2022), in which the doubling time lengthened again to about 11 months. This acceleration has enabled AI models like Minerva, which used nearly 6 million times more compute than AlexNet did a decade earlier.
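The arithmetic behind these figures is easy to check with a short sketch. The exact time window is an assumption on my part (OpenAI's analysis covered roughly 2012–2018, about 72 months), but the doubling periods are the ones quoted above:

```python
import math

def growth_factor(months_elapsed: float, doubling_months: float) -> float:
    """Total growth after months_elapsed, given a fixed doubling period."""
    return 2 ** (months_elapsed / doubling_months)

def months_to_reach(factor: float, doubling_months: float) -> float:
    """Months needed to grow by `factor` at the given doubling period."""
    return math.log2(factor) * doubling_months

# Moore's Law (doubling every 24 months) over ~6 years:
print(round(growth_factor(72, 24)))          # 8, close to the ~7x quoted

# How long does a 3.4-month doubling take to reach 300,000x?
print(round(months_to_reach(300_000, 3.4)))  # 62 months, i.e. ~5 years
```

The point of the sketch is that a 300,000-fold increase is unreachable under a two-year doubling within the period in question, but falls out naturally from a 3.4-month doubling.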
Although it is uncertain whether compute can continue to grow at this pace, since large-scale models demand ever more of it, the immense investments in AI research suggest that further breakthroughs may be on the horizon, perhaps even reaching the computing power of the human brain. Combined with vast amounts of data and improved algorithms, this growth in compute has enabled significant advances in AI within a short time, with AI already matching or even exceeding human performance in many areas.
Yet the main argument is that AI does not need consciousness to perform advanced functions. Existing technologies illustrate this: thermostats respond to temperature changes, and combat drones make complex decisions, all without consciousness. An AGI could be conceived as a highly complex interconnected system built on millions of human inputs, similar to systems like ChatGPT today.
Given the rapid pace of technical progress, it is entirely possible that we will have an AGI in less than ten years, as all the necessary prerequisites are already in place. It is therefore crucial to establish international rules for the use of AGI now, to ensure that these technologies are used responsibly.