I was having a conversation with a friend about this. We were discussing AI, and like so many others, she believes it will destroy all of humanity. I personally don’t believe that. I’m aware of the theories and the multitude of ways it could happen, and I understand that, in theory, we wouldn’t understand an AI’s goals and so wouldn’t know how it would destroy us. But again, that’s just a theory.
There’s also the constant fear of a massive nuclear holocaust in WWIII, but I also don’t believe we’d realistically get to the point of using nukes on each other, knowing the implications of what would happen. Still, it made me realize that we’re constantly fearful of mass extinction, to the point where some people fight tooth and nail and refuse to look at things from a more positive or optimistic perspective. It’s all death, or you’re wrong.
Please help me understand this. I’m here with open ears.
Historically, we’ve always been pretty awful to each other. A lot of our cutting-edge science has revolved around ways to hurt and kill each other, ever since the first human realised it was easier to kill the person pissing him off with a rock than with his bare hands.
In the last 100 years or so, however, those weapons have become powerful enough to end us as a species. I think you’d be hard-pressed to find a type of weaponry that, once invented, hasn’t been used, and I’m not sure we’ve evolved enough empathy to prioritise not killing all of us over killing the country or group that’s currently annoying us.
It’s pretty understandable, therefore, to have a realistic fear that there’s a very good chance we’ll bring about our own end.