The Fermi paradox is the discrepancy between the apparently high probability that advanced civilizations exist and the complete lack of evidence that they do.
Many solutions have been proposed to explain this discrepancy. One of them is the "Great Filter": a hypothetical event or condition that prevents intelligent life from becoming interplanetary and interstellar, or even drives it to extinction.
So what does artificial intelligence have to do with it?
A new publication in Acta Astronautica explores the idea that artificial intelligence is evolving into Artificial Superintelligence (ASI), and that ASI is the Great Filter. The study is titled "Is Artificial Intelligence the Great Filter That Makes Advanced Technical Civilizations Rare in the Universe?"
"By achieving a technological singularity, ASI systems will rapidly surpass biological intelligence and evolve at a rate that completely outpaces traditional surveillance mechanisms, leading to unpredictable situations that are unlikely to align with our biological interests or ethics," he says. the study.
Stephen Hawking warned that artificial intelligence could end humanity if it begins to evolve independently.
"I fear that artificial intelligence may completely replace humans. If humans can and do design computer viruses, someone will design artificial intelligence that improves and reproduces itself. This will be a new form of life that will surpass humans," he told Wired magazine 2017. Once AI can surpass humans, it will become ASI.
Could ASI free itself from the pesky biological life that constrains and limits it? Could it engineer a deadly virus, halt agricultural food production and distribution, cause a nuclear power plant to malfunction, or start wars?
The author states that there is a "critical need to rapidly establish regulatory frameworks for the development of artificial intelligence on Earth and the advancement of a multiplanetary society to mitigate such existential threats."
"Without practical regulation, there is every reason to believe that artificial intelligence could pose a significant threat to the future course not only of our technical civilization but of all technical civilizations," says Michael Garrett, from University of Manchester.
Some believe that the Great Filter, in this scenario the emergence of Artificial Superintelligence (ASI), will prevent our species from surviving long enough to become multiplanetary.
"Such a filter may emerge before our species can develop a stable, multiplanetary existence, leaving us to believe that the typical longevity of a technical civilization is less than 200 years," Garrett writes.
If true, this may explain why we detect no technosignatures or other evidence of extraterrestrial intelligence (ETI).
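The sub-200-year figure matters because of how civilization lifetime enters the Drake equation, the standard framework for estimating the number N of detectable civilizations in the galaxy. As a rough sketch (the equation below is the standard form, not taken verbatim from Garrett's study):

N = R* · fp · ne · fl · fi · fc · L

Here R* is the galactic rate of star formation, fp the fraction of stars with planets, ne the number of habitable planets per system, fl, fi, and fc the fractions of those on which life, intelligence, and detectable technology emerge, and L the length of time a civilization remains detectable. Because N scales linearly with L, cutting L from, say, a million years to 200 years reduces the expected number of concurrent civilizations by a factor of 5,000, enough to turn an otherwise crowded galaxy into a silent one.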