Stephen Hawking warned before he died: It will be the end of mankind!

Before his death in 2018, when artificial intelligence applications such as ChatGPT and Midjourney were not yet part of our daily lives, the famous scientist Stephen Hawking made an important warning about the future of humanity.

According to Hawking, if the development of artificial intelligence continues unchecked, it could pose a grave danger to humanity. Interest in AI technologies has grown rapidly in recent years, and the concerns have grown with it. Many people worry about the potential dangers of increasingly capable AI tools. The actors' union SAG-AFTRA and the Writers Guild of America, for example, demanded assurances during their strikes that AI would not replace human performers and writers.

They are not the only ones worried about the technology. In a 2014 interview with the BBC, Hawking issued a stark warning about the dangers of AI: the full development of artificial intelligence, he said, could mean the end of the human race. The physicist argued that AI could "snowball" beyond our control, improving at an ever-increasing speed until it is able to act entirely on its own.

Some effects are already visible: AI-generated images circulate widely on the internet, and software increasingly serves them up in place of original ones, which has started to cause problems around the world.

The pace of AI development has left many people as concerned as Hawking was about what the technology could mean for the future. That makes it worth being careful when using artificial intelligence. Here are some of the things to watch out for:

Always double-check the information provided by an AI-powered device or service. Code created with AI tools carries a similar risk: programmers have started using these tools to write code, and while that saves time, the generated code can contain errors and introduce security vulnerabilities.
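
As a purely illustrative sketch (the SQLite example, function names, and table layout below are assumptions, not something from the article or from Hawking), this is the kind of subtle flaw AI-generated code can contain: the first function glues user input straight into an SQL string, which opens the door to SQL injection, while the reviewed version uses a parameterized query.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # A pattern often seen in hastily generated code: the user's input is
    # pasted directly into the SQL string, which makes SQL injection possible.
    query = "SELECT id, username FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # The reviewed version: a parameterized query lets the database driver
    # handle the input safely, with no change in behaviour.
    return conn.execute(
        "SELECT id, username FROM users WHERE username = ?", (username,)
    ).fetchall()
```

Passing a username such as `' OR '1'='1` to the first function would return every row in the table, which is exactly the kind of mistake a quick glance over AI-generated code can miss.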