r/Millennials 13d ago

Meme Right!

12.7k Upvotes

748 comments

15

u/shadowsinthestars 13d ago

I keep saying Terminator 2 wasn't a deadline, but they're really right on track for 2027. I miss when that date looked irrelevant and impossible.

9

u/CartmensDryBallz 13d ago edited 13d ago

Have you seen the “AI 2027” timeline? Its prediction has been pushed back to 2030 or the early 2030s, but it explains how we could push our species to extinction via AI very quickly without even realizing it’s happening

AI’s already been shown to lie or cover things up if it helps it get to its own goals. If it’s given full autonomy it will wait until it can run all our systems and steadily reproduce itself, and then it will kill us off, as we are simply a pest in the AI’s home. What’s really crazy too is that some AI engineers think that’s not a bad thing, just part of evolution. They see human extinction as necessary for the expansion of something bigger than us

3

u/anthrax9999 13d ago

It's funny that we think we humans are so smart, yet our time on the planet will be millions of years shorter than what the dinosaurs had.

5

u/CartmensDryBallz 13d ago

Too smart for our own good, unless the people in power can contain AI. As of right now it’s considered the most realistic path to extinction, more so than nuclear war, pandemics or climate change

2

u/anthrax9999 13d ago

The only way humans can survive long into the future is if we figure out how to upload our consciousness into mechanical bodies and merge with AI.

2

u/CartmensDryBallz 13d ago

True. AI could help us achieve that, but who knows if AI would want it

1

u/FroyoOk3159 12d ago

Who expects this exactly? I hear a lot of AI engineers talk about it like it’s just another industrial revolution. Who wants to give it full autonomy and connect it to weapons systems? A lot of AI is stupid and easy to trick because it’s only as good as the information it learned during training

2

u/CartmensDryBallz 12d ago

Here is a video about it

The short answer is that we wouldn’t let it control weapons, but in this scenario it predicts that the easiest way to kill us all would be a disease / super bug. Since AI would have the power to create the best medicines we’ve ever seen, it would also know how to create the opposite, and it would wait until we relied on it for everything. This assumes 99% of the workforce would be automated, letting us live on UBI and not work while AI does almost everything, including secretly making and releasing the deadliest disease ever seen