Technological Singularity Theory

Author - Aakanksha Darekar



The technological singularity typically describes a conjectural future point at which technology grows beyond human control, leaving us in an unpredictable situation. The singularity may involve technology becoming so advanced that artificial intelligence surpasses human intelligence, potentially erasing the boundaries between humanity and computers.

One AI researcher, who has been warning about this technology for over 20 years, says we should "shut it all down" and issue an "indefinite and worldwide" ban.

Eliezer Yudkowsky, a researcher and author who has worked on artificial general intelligence since 2001, issued that warning in response to an open letter, signed by 1,125 people including Elon Musk and Apple co-founder Steve Wozniak, that requested a pause on the training of AI systems. Yudkowsky argued that a pause does not go far enough, writing: "Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die."

He further explained that such an AI would not care for us, nor for sentient life in general. Yudkowsky's stark warnings about the disastrous consequences of AI have led to him being described as an "AI Doomer", with author Ellen Huet noting the possibility of an "AI Apocalypse".