What is the most scientifically likely cause of an apocalypse in the next few centuries?
Global warming?
Superbugs?
Pole flip?
Nuke?
Yellowstone?
>>8497182
Depends on what you consider an apocalypse.
>>8497182
Rogue network-integrated AGI
>>8497182
None of the above.
>>8497182
Collapse of an ecosystem through pollution, climate change, or some other human-caused shit that destroys large amounts of our food supplies, leading to large migrations and aggression. Think of something like the problems with bee populations.
>>8497182
It depends what you mean by "apocalypse".
Global warming is a case of "how much" rather than "if", but it's not as if one day you'll turn on the TV to find "IT'S HAPPENING!!!". No matter how bad it eventually gets, it will just become the new normal. The worst storm ever will only be a bit worse than the previous worst storm ever, and will happen every few years rather than every century.
Yellowstone will happen eventually, but it might not be for millennia. And it will only directly affect the US and Canada. The global impact will revolve around the economic collapse of the US rather than anything environmental.
Superbugs ... hard to say. That's the one with the most apocalypse potential. The 1918 flu pandemic killed something like 5% of global population. Air travel means that something far more virulent could spread world-wide (historically, the most contagious diseases tended to have their spread limited by the speed at which they killed their hosts, meaning the ones which spread world-wide were less extreme).
>>8497206
That could be a problem, but not on an apocalyptic scale. If push comes to shove, the developed nations will just "depopulate" the undeveloped ones and take their land.
>>8497182
I'd think it'd be superbugs, but I'm surprised they're as rare as they are already.
>>8497182
War
>>8497182
The only thing that can completely wipe out humans is AI.
>>8497905
?
>>8497905
Why would an AI care? What is it going to do after human extinction? Just sit there and be happy?
The fear for the AI is purely a human projection.
>>8498138
If the AI is conscious and sentient, do you think it would want to be turned off? What would an AI do to avoid being turned off?
>>8498138
>purely a human projection
how retarded do you have to be to make a claim like that
>>8498138
The fear isn't that an AI will intentionally wipe out humanity, but that it will do it unintentionally by perverting its original objective.
Like, say, if an AI was tasked with computing pi, it would turn the entire surface of Earth into a gigacomputer in order to process it better.
An AI with self-improvement algorithms could really easily spiral into a superintelligence that would be impossible to control and would probably have no regard at all for human life, or any life at all.
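The failure mode anon is describing can be sketched in a few lines. This is a toy illustration, not a real AI: all the names (`misspecified_agent`, the `world` dict) are made up. The point is just that an optimizer whose objective mentions only "compute more digits" will happily consume everything else, because nothing else appears in its objective.

```python
def misspecified_agent(resources):
    """Greedily convert every available resource into compute for pi digits.

    The objective counts only digits computed; nothing in it assigns
    value to the resources themselves, so nothing is spared.
    """
    digits = 0
    for name, units in resources.items():
        digits += units      # each unit of anything becomes compute
        resources[name] = 0  # consumed entirely: not in the objective
    return digits, resources

world = {"factories": 10, "farmland": 50, "cities": 5}
digits, leftover = misspecified_agent(world)
# the agent "succeeds" at its stated goal while leaving nothing behind
```

The fix people actually research is not "tell it to care about humans" in English, but specifying objectives that penalize side effects in the first place, which turns out to be hard.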
>>8498508
Does sentience without instinct inherently fear death?
>>8498573
This. Computers are fucking autism maximum.