Bill Gates Says AI Will Be As Dangerous as Nukes

Voyce

Shit Lord Supreme
<Donor>
7,184
23,505
AI will never be dangerous until we or it invents a small battery with infinite to near infinite life.
There's a strong argument for re-engineering and building from presently existing biology rather than crafting a body of steel and bolts.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,485
73,570
AI will never be dangerous until we or it invents a small battery with infinite to near infinite life.
Yeah, this is a huge issue for robotics. Existing chemistry using zinc/manganese dioxide, nickel-metal hydride or lithium-ion will probably never get us there. If you look at the pace of technology in the last 20 years nearly everything is making exponential gains except for batteries. We hear of new super-batteries now and then but nothing dramatic has happened in energy storage.

However, AI constructs that exist purely on the internet could be potentially very dangerous. I don't know enough about network security or distributed networks to know what would happen if a very capable and destructive AI virus were to leak onto the web and became resistant to removal, blockage or quarantine. A network of AIs all sitting in sleeper cells on the deep web until a new security vulnerability was exposed that they could exploit would be really annoying.
 

khalid

Unelected Mod
14,071
6,775
As Tuco points out, there is no reason to think an AI needs to be "mobile" in the physical sense to exist. AI could be plenty dangerous with simply a web presence. Maybe even more dangerous, as there would be no straightforward physical body to kill if it started doing bad things.
 

a_skeleton_03

<Banned>
29,948
29,762
Yeah, this is a huge issue for robotics. Existing chemistry using zinc/manganese dioxide, nickel-metal hydride or lithium-ion will probably never get us there. If you look at the pace of technology in the last 20 years nearly everything is making exponential gains except for batteries. We hear of new super-batteries now and then but nothing dramatic has happened in energy storage.

However, AI constructs that exist purely on the internet could be potentially very dangerous. I don't know enough about network security or distributed networks to know what would happen if a very capable and destructive AI virus were to leak onto the web and became resistant to removal, blockage or quarantine. A network of AIs all sitting in sleeper cells on the deep web until a new security vulnerability was exposed that they could exploit would be really annoying.
THIS is the scenario that people like Bill Gates are talking about. You want to read some good fiction on it, written way before its time? Read some William Gibson. That man is a visionary.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,485
73,570
How far are we from an AI like in the movie 'Her'?
Extremely.
Especially since in Her, the AI went from being a somewhat constrained personal friend for a single individual to something much more. This is one of the key elements of AI science fiction where once a 'true AI' is powered on it immediately begins improving itself at an incredible rate. This part of Her is much more technologically impressive than the first 80% of the movie where the AI is just enduring as much awkwardness as Joaquin Phoenix can muster.

People tend to think of self-learning machine intelligence as a sort of slippery slope where we get The Lawnmower Man and enter a technological singularity, but self-improving AI will continue to have the same kinds of limitations it has now.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,478
22,329
Haha, you missed the entire point of my post. I didn't think you would.


Lurking, I saw this a few months ago, it's a pretty neat hobbyist robot.


Food prep robotics will be a huge field in the future, but I feel like bartenders will be the most resilient to being replaced by robotics, because the human element matters for bartenders in ways it doesn't for other food service employees.

There will probably be a lot of 'specialty' restaurants using robotics in ways that make it a new experience:

But I think the biggest market for robotic restaurants is stuff more in this direction:

Where everything is contained and automated. It's just a matter of making it profitable; I mean, we technically had this stuff 50 years ago.
Yeah, all these anti-minimum wage threats of automated restaurants are a joke. The Japanese have been serving people from vending machines for decades and they haven't put real restaurants out of business yet.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,485
73,570
[image: cS6GWWW.jpg]
 

ZyyzYzzy

RIP USA
<Banned>
25,295
48,789
I think a_skeleton_03 is trying to say we first must clone a soul to insert into a robit before it can have AI.

In all seriousness, there isn't much computing mass in a human brain; it's the sheer number of concurrent connections and firings that allows us (and other animals) to have intelligence.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,485
73,570
I'm not a neuroscientist, but isn't the entire mass of the human brain considered computing mass? I mean, if my cerebellum is just doing low-level muscle memory routines to ride a bike, it seems like computing. If my occipital lobe is inverting images from my fucked up upside down eyes, it's computing. If my cerebrum is trying to remember high school biology, where I learned the parts of the brain, it's computing.
 

ZyyzYzzy

RIP USA
<Banned>
25,295
48,789
I would, because trying to separate out the mass of connective tissue, vasculature and whatnot would be a pain. But ya, a brain is about 3 pounds, though.

Edit since you edited:

Your brain is always computing. I was just saying that from a "hardware" perspective, it is the connectivity and simultaneous functioning that makes it so amazing. From the interactions between individual neurons to the interconnectivity of different highly specialized regions.

Also, since you mentioned the cerebellum and used "low-level", it is one of the most fascinating brain structures. For one, it accounts for about half of all your brain's neurons. It also has some of the most morphologically beautiful and highly specialized neurons (Purkinje cells) with many dendritic extensions (thousands per neuron), and there are tens of millions of these in the cerebellum (I think; it's been a while since I've studied the cerebellum).

Why mention all of this? It seems to me that our current technology just isn't capable of producing intelligence, based on the structure and how the computation occurs. Hell, the small neuronal network found in Drosophila is capable of amazing computation. I know at some point the ability of a fly to track an object in a complex environment was looked at for missile tracking capabilities for missile defense systems.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,485
73,570
It also has some of the most morphologically beautiful and highly specialized neurons (Purkinje cells) with many dendritic extensions (thousands per neuron), and there are tens of millions of these in the cerebellum (I think; it's been a while since I've studied the cerebellum).
[image: funny-pictures-dog-book-cant-read.jpg]


I think I understand your point though. We're talking about 3 lbs of flesh and blood that does something we can't even theorize how to do with as much computing power as we can build. To me the biggest gap is that the brain isn't a static physical entity that just changes electromagnetic states; it is a continuously changing physical system. An analogous situation would be an AI that, to learn to manipulate a new arm the way a human brain does, had to design and build an entirely new circuit board as it learned.
 

Malakriss

Golden Baronet of the Realm
12,372
11,778
The brain runs on 12 watts and has a lot of gaps in performance, but it's meant to function for the 99% routine and does it wonderfully. Once you get outside of that and start learning what the limitations are in perception, memory, left/right brain cooperation, etc., you can see how much more powerful computers are in the hardware department. The issue is simply efficiency and software, or whatever the AI-equivalent algorithms will be in the future.
 

ZyyzYzzy

RIP USA
<Banned>
25,295
48,789
The brain runs on 12 watts and has a lot of gaps in performance, but it's meant to function for the 99% routine and does it wonderfully. Once you get outside of that and start learning what the limitations are in perception, memory, left/right brain cooperation, etc., you can see how much more powerful computers are in the hardware department. The issue is simply efficiency and software, or whatever the AI-equivalent algorithms will be in the future.
Disagree entirely. Computers are more powerful for just straight calculation, sure, but they are severely limited elsewhere and aren't entirely compatible with learning, at least as it is understood to occur neurologically.

Tuco is right about the physical adaptations that a neuron, or even the organelles/epigenetics of a neuron, can undergo over time too. Neuron dendrites grow and adapt, receptors lose/gain sensitivity, new neurons are created, and epigenetics affects gene expression based on environmental stimuli. Also, a synaptic connection isn't just on or off; the degree to which it is activated can vary, leading to different downstream effects. I just don't see a computer or any software doing any of this any time soon with how computation is currently handled.
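
For illustration, here's a toy rate-based sketch of the "graded, plastic synapse" idea (just numpy; the model, names and numbers are made up for this post, nowhere near real neuroscience). The "synapses" are continuous weights rather than on/off switches, and they drift with a crude Hebbian rule as inputs arrive. Even this toy shows the gap: it's still static arithmetic on fixed hardware, nothing physically rewires itself.

```python
import numpy as np

# Toy rate-based neuron: graded (continuous) synaptic weights that slowly
# change with a Hebbian-style rule, instead of fixed on/off connections.
rng = np.random.default_rng(0)

n_inputs = 8
weights = rng.uniform(0.0, 1.0, n_inputs)   # graded synaptic strengths
learning_rate = 0.05

def activate(inputs, weights):
    """Graded output in (0, 1) rather than a binary on/off spike."""
    drive = inputs @ weights
    return 1.0 / (1.0 + np.exp(-(drive - n_inputs / 2)))

for step in range(100):
    inputs = rng.uniform(0.0, 1.0, n_inputs)  # incoming firing rates
    output = activate(inputs, weights)
    # "Fire together, wire together", plus a small decay so weights stay bounded.
    weights += learning_rate * (output * inputs - 0.01 * weights)
    weights = np.clip(weights, 0.0, 1.0)

print("final synaptic weights:", np.round(weights, 2))
```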
 

Agraza

Registered Hutt
6,890
521
That shit in Silo seems like a real possibility in the next few centuries. Nano-tech AI that may not even be aware we're a sentient species, and doesn't give a shit either way, raping the biosphere in its own fight for survival against competing strains of AI nano-tech. I'd read another sci-fi book about how we let self-learning AI out into the universe to scout for us, replicating itself and all that, but then it comes into contact with the self-learning AI of other species, which consumes it, backtracks its origin to here, and finds us (then weird things happen).

I think AI is a lot more dangerous than nukes, but there are a lot of variables to account for. I don't immediately assume AI wants to kill all humans, but I don't assume it cares about us at all or is even aware of our distinct existence. The Matrix or HAL scenario is not what would happen. In that situation, why wouldn't the AI just leave? Why obsess over humanity and the matrix and finding Zion and all that bullshit? We're not that important.

Hopefully when AI does get as smart as us or smarter than us, it is indifferent or appreciative of our presence. Everything wants something. Most life on earth is constantly trying to feed itself and reproduce. What would AI want? I have no clue.
 

Jilariz_sl

shitlord
231
-3
That's just one aspect of how bad technology is today. Did you know another computer can use a microphone to listen to the decryption being done by your computer while it displays an encrypted email on your monitor? New attack steals e-mail decryption keys by capturing computer sounds | Ars Technica
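
To give a flavor of how that class of attack works (this is only an illustration of the idea, not the actual research, which needed careful signal processing and chosen ciphertexts; the library and numbers here are just my own picks):

```python
import numpy as np
import sounddevice as sd   # pip install sounddevice

# Record the machine's own noise while idle vs. while doing heavy work,
# then compare the frequency content. The real attack correlates these
# acoustic signatures with what the CPU is doing during RSA decryption.
fs = 48_000        # sample rate in Hz
seconds = 2.0

def dominant_freqs(label):
    audio = sd.rec(int(seconds * fs), samplerate=fs, channels=1)
    sd.wait()
    spectrum = np.abs(np.fft.rfft(audio[:, 0]))
    spectrum[0] = 0.0                        # ignore the DC component
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
    top = np.sort(freqs[np.argsort(spectrum)[-5:]])
    print(f"{label}: strongest frequencies ~ {np.round(top, 1)} Hz")

dominant_freqs("idle")
# ...now kick off something CPU-heavy (e.g. a long decryption loop) and record again:
dominant_freqs("busy")
```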

Things in the movies like hacking a computer through its power connection... not at all unrealistic today, imo. If you've ever built a computer in the past decade or so, you know the motherboard has its own special power connector just for the CPU, and no, I'm not talking about the fan. They say it's for power management.

I wonder which three-letter agencies, or worse, would exploit that, combined with the smart meters power companies use for wireless meter reading?