Bill Gates Says AI Will Be As Dangerous as Nukes

Chancellor Alkorin

Part-Time Sith
<Granularity Engineer>
6,029
5,915
LOL at AI making decisions based on emotions around them.
Why are you fully discounting this concept without a shred of intelligent debate? Because some guy wrote a book that didn't talk about it?

If we are attempting to create an artificial *human-based* intelligence, it will have to contend with emotional states of human beings in order to understand us. If it can't do that, it won't fully understand us and will potentially fuck everything up when it comes to interacting with us properly. Period. It's that simple. Why is this such a difficult concept?

I mean, either Tuco is trolling me to death here, or I'm just not understanding why you're being so obtuse. If it's the former, please explain.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,606
73,744
Why are you fully discounting this concept without a shred of intelligent debate? Because some guy wrote a book that didn't talk about it?

If we are attempting to create an artificial *human-based* intelligence, it will have to contend with emotional states of human beings in order to understand us. If it can't do that, it won't fully understand us and will potentially fuck everything up when it comes to interacting with us properly. Period. It's that simple. Why is this such a difficult concept?

I mean, either Tuco is trolling me to death here, or I'm just not understanding why you're being so obtuse. If it's the former, please explain.
Not trolling at all. The ability to sense and experience emotion is a critical component of AI, and there is a lot of research currently going into developing AI with that in mind. a_skeleton_03 believes that AI cannot be convinced of anything because it is impossible for an AI to be created that can be persuaded by emotional arguments. I believe the creation of an AI that responds to human emotion is possible and very likely.
 

a_skeleton_03

<Banned>
29,948
29,762
Not trolling at all. The ability to sense and experience emotion is a critical component of AI, and there is a lot of research currently going into developing AI with that in mind. a_skeleton_03 believes that AI cannot be convinced of anything because it is impossible for an AI to be created that can be persuaded by emotional arguments. I believe the creation of an AI that responds to human emotion is possible and very likely.
No, there is a difference. A machine can be taught to see emotion and have a response to it. A machine cannot (the probability is high enough to say this) reason utilizing emotions. That is the difference, and the reason you are responding the way you are is that you haven't read a single thing on the subject.

The difference is a huge gap. One is just input and response; the other would be input with a random response that a machine, if truly intelligent, cannot accept as a valid option. Humans can barely handle it, and we think on a very fluid level, with chemicals that alter our responses somewhat randomly if looked at objectively. Just take a woman on her period, a depressed teenager going through puberty, or a man immediately after sex and give them a problem to solve. It just doesn't work that way in an AI environment, with logical guidelines in place, a significant amount of information available, and a vastly more complex problem-solving matrix. It is going to finish a task as optimally as it can, taking into account any limitations it can find.
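A crude way to put the distinction being argued here into code, with everything below invented purely for illustration: one machine that maps detected emotion straight to a canned response, and one that just optimizes for its task.

```python
# Toy contrast between the two things being argued about. Everything here
# is invented for illustration; neither of these is a real AI.

# Machine 1: pure input -> response. It "sees" emotion and reacts to it,
# but the emotion never enters its reasoning.
RESPONSES = {
    "angry": "I'm sorry you're upset.",
    "sad": "That sounds rough.",
    "happy": "Glad to hear it!",
}

def scripted_machine(detected_emotion: str) -> str:
    """Lookup table: detect emotion, emit canned response."""
    return RESPONSES.get(detected_emotion, "I see.")

# Machine 2: goal-driven. It picks whatever action best serves its task,
# treating everything, emotion included, as just another input to weigh.
def goal_driven_machine(options: dict[str, float]) -> str:
    """Pick the action with the highest estimated task value."""
    return max(options, key=options.get)

if __name__ == "__main__":
    print(scripted_machine("sad"))
    # The optimizer never takes a worse-scoring action "because it felt
    # like it"; that is the gap being argued about in this thread.
    print(goal_driven_machine({"finish_task": 0.9, "mope_around": 0.1}))
```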
 

Tuco

I got Tuco'd!
<Gold Donor>
45,606
73,744
I think my BartenderBot that gives a free double upgrade to someone who is emotionally distraught and clearly needs a stiff drink is both reasonable and antithetical to your assumptions.

Contrary to Mist's initial argument that you disagreed with, if the bartender was a religious zealot, e.g. a Mormon, he wouldn't even serve you a coffee, much less liquor!
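A minimal sketch of that BartenderBot rule, with the emotion detection faked by a keyword check and all the names made up, could look something like this:

```python
# Minimal sketch of the BartenderBot decision rule. The emotion detector
# is a fake keyword check and the drink names are invented; this is just
# the rule, not a real bot.

DISTRESS_WORDS = {"terrible", "awful", "divorce", "fired", "lost"}

def looks_distraught(customer_remark: str) -> bool:
    """Crude stand-in for real emotion sensing."""
    return any(word in customer_remark.lower() for word in DISTRESS_WORDS)

def pour(customer_remark: str, order: str = "whiskey") -> str:
    if looks_distraught(customer_remark):
        # The "free double upgrade" rule: emotional input changes the output.
        return f"double {order}, on the house"
    return order

if __name__ == "__main__":
    print(pour("I just got fired and my dog ran off"))  # double whiskey, on the house
    print(pour("Lovely weather today"))                 # whiskey
```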
 

Tuco

I got Tuco'd!
<Gold Donor>
45,606
73,744
I just thought of something: are you under the assumption that only beings with souls can experience and sense emotion? This sounds familiar; wasn't there some other laughable action you assumed that only beings with souls could partake in?
 

a_skeleton_03

<Banned>
29,948
29,762
I think my BartenderBot that gives a free double upgrade to someone who is emotionally distraught and clearly needs a stiff drink is both reasonable and antithetical to your assumptions.

Contrary to Mist's initial argument that you disagreed with, if the bartender was a religious zealot, e.g. a Mormon, he wouldn't even serve you a coffee, much less liquor!
That is a measured response to input. That is not convincing him; that is an option on a matrix. You are confusing a robot designed with a purpose with AI, though. They are as different as night and day. I don't think you understand that at all.

A Mormon bartender could be persuaded: you could explain to him that you aren't part of his religion and therefore those rules don't apply to you, you could appeal to his wanting a large tip, or you could threaten to kill his family. All of these would be taken in as options by an AI and processed not as emotion but in terms of how they would further the purpose of that AI. Is its purpose one of proselytization? It might serve you that drink and then go into a speech about Mormonism, or not serve it and go into a different speech. Is making money going to serve that purpose better because it can give more to the church? It has no family, so you would have to tailor your threat differently and threaten to destroy it; is self-preservation a higher priority than your libation? It isn't evaluating these things based on their emotion but on its purpose. Huge distinction that can lead to the same outcome but for entirely different reasons.
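In code, that "options on a matrix" idea might look like the rough sketch below, where each persuasion attempt is scored against the agent's purpose rather than felt as emotion. The purposes and all the scores are invented for illustration:

```python
# Rough sketch of the "options on a matrix" idea: each persuasion attempt
# is weighed against the agent's purpose, not felt as emotion. The
# purposes and numbers are made up for illustration.

# How much each appeal advances each possible purpose.
SCORE = {
    "proselytize":   {"not_my_religion": 0.2, "big_tip": 0.1, "threat": 0.0},
    "raise_money":   {"not_my_religion": 0.3, "big_tip": 0.9, "threat": 0.1},
    "self_preserve": {"not_my_religion": 0.0, "big_tip": 0.0, "threat": 0.8},
}

def weigh_appeals(purpose: str) -> str:
    """Return the appeal most likely to move an agent with this purpose."""
    table = SCORE[purpose]
    return max(table, key=table.get)

if __name__ == "__main__":
    for purpose in SCORE:
        # Same appeals, different "winning" lever depending on purpose:
        # the same outcome can fall out of entirely different reasons.
        print(purpose, "->", weigh_appeals(purpose))
```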
 

khalid

Unelected Mod
14,071
6,775
I wouldn't expect him to admit that Tuco. After arguing for days that cloning was impossible because of souls, he has since completely backtracked on it and now denies he said it.
 

Abefroman

Naxxramas 1.0 Raider
12,589
11,907
I think my BartenderBot that gives a free double upgrade to someone who is emotionally distraught and clearly needs a stiff drink is both reasonable and antithetical to your assumptions.

Contrary to Mist's initial argument that you disagreed with, if the bartender was a religious zealot, e.g. a Mormon, he wouldn't even serve you a coffee, much less liquor!
Your bartender bot is shit. Word gets out and everyone in the bar is gonna act like a fucking grey cloud just to get more booze for free. That doesn't sound like a fun time.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,606
73,744
I wouldn't expect him to admit that Tuco. After arguing for days that cloning was impossible because of souls, he has since completely backtracked on it and now denies he said it.
Oh that's right, it was cloning. You can't clone something with a soul!

a_skeleton_03, I think you're making a false distinction between inputs you feel are based in reason and inputs that aren't. You're assuming a lot about how AI in the future are required to work and which inputs to them are valid. You've probably also weaseled into a semantic argument that I don't care about. I mean, if you fuck up on the word 'convince', I don't think you're equipped for a semantic debate.

Abe, good point. My Bartender-Bot Mk II will give drink upgrades to people who are partying hard.

You can buy the Mk I if you want a somber area with people who go through a self-perpetuating cycle of acting depressed and drinking, or you can get Mk II if you want a party atmosphere.
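In the toy BartenderBot sketch earlier in the thread, the Mk I / Mk II difference would be a one-line policy change; the model names and moods below are hypothetical:

```python
# The Mk I / Mk II difference as a one-line policy change. Model names
# and mood labels are hypothetical.

def upgrade_policy(model: str, mood: str) -> bool:
    """Mk I rewards the gloomy; Mk II rewards the party crowd."""
    target = "distraught" if model == "Mk I" else "partying"
    return mood == target

if __name__ == "__main__":
    print(upgrade_policy("Mk I", "distraught"))  # True: somber bar spiral
    print(upgrade_policy("Mk II", "partying"))   # True: party atmosphere
```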
 

a_skeleton_03

<Banned>
29,948
29,762
I wouldn't expect him to admit that Tuco. After arguing for days that cloning was impossible because of souls, he has since completely backtracked on it and now denies he said it.
Quote me saying that. I very specifically said that if we did clone someone, it would make me change my faith, and that I didn't think it would happen or it already would have. That is a distinct difference from saying it is impossible because of souls. I am pro human cloning and don't see why we have any regulation on it at all.

Here is where you will mist away, though, or drop a few more one-liners that aren't on topic and don't back up what you just claimed. Classic khalid.
 

a_skeleton_03

<Banned>
29,948
29,762
Oh that's right, it was cloning. You can't clone something with a soul!

a_skeleton_03, I think you're making a false distinction between inputs you feel are based in reason and inputs that aren't.

Abe, good point. My Bartender-Bot Mk II will give drink upgrades to people who are partying hard.

You can buy the Mk I if you want a somber area with people who go through a self-perpetuating cycle of acting depressed and drinking, or you can get Mk II if you want a party atmosphere.
And you still don't understand what Artificial INTELLIGENCE is, I see. Let me give you a hint: it isn't just input > output like your bartender is.

You aren't trolling, you are ignorant.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,606
73,744
Oh man, Khalid was right.

I wonder how long it'll take before a_skeleton_03 says, "Quote me saying that robots can't be manipulated."
 

khalid

Unelected Mod
14,071
6,775
Classic a_skeleton_03, to deny science and claim everyone else doesn't understand it as much as he does!

The human brain is just a machine. A very complex and chaotically built machine, but just a machine. Unless you believe in something silly like "you need a soul to have emotion", there is really no reason to think we won't be able to eventually build machines that work with emotions or even have emotions themselves.

edit: a_skeleton_03, thanks for proving my assertion while attempting to deny it.
 

a_skeleton_03

<Banned>
29,948
29,762
Oh man, Khalid was right.

I wonder how long it'll take before a_skeleton_03 says, "Quote me saying that robots can't be manipulated."
So quote me saying either. You're still saying 'bot' or 'robot' like they're even a topic in this discussion, which they are not.
 

a_skeleton_03

<Banned>
29,948
29,762
Classic a_skeleton_03, to deny science and claim everyone else doesn't understand it as much as he does!

The human brain is just a machine. A very complex and chaotically built machine, but just a machine. Unless you believe in something silly like "you need a soul to have emotion", there is really no reason to think we won't be able to eventually build machines that work with emotions or even have emotions themselves.
Sounds like you have read zero books on neurochemistry. We are a machine, a very flawed one, that will let chemicals change our responses from a logical decision to something irrational. That is not something you can just program into an AI as "if input, then do something random".
 

khalid

Unelected Mod
14,071
6,775
The entire human body is just a machine also btw. Souls have nothing to do with it. I really think that is where your problem is coming from in all this. There is nothing mythological about emotions. No book on neurochemistry is going to reference souls in a serious fashion or say the shit is anything but a chain of physical reactions.
 

a_skeleton_03

<Banned>
29,948
29,762
The entire human body is just a machine also btw. Souls have nothing to do with it. I really think that is where your problem is coming from in all this. There is nothing mythological about emotions. No book on neurochemistry is going to reference souls in a serious fashion or say the shit is anything but a chain of physical reactions.
But not a logical and defined reaction. It's a completely different makeup in every single human. I am not going to argue about souls here, nor did I bring them up. Only you and Tuco are trying to derail this. You cannot program the randomness of emotion into an AI. Nothing to do with souls at all.
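For what it's worth, the naive version of what is being argued over here is easy to write down; whether a noise term bolted onto a decision rule counts as "the randomness of emotion" is exactly the disagreement. All of the numbers below are invented:

```python
import random

# Naive caricature of "chemicals alter our responses": a mood term nudges
# an otherwise optimal choice. Whether this bolted-on noise resembles real
# emotional reasoning is exactly what the thread is arguing about; the
# numbers are invented.

def choose(options: dict[str, float], mood_noise: float = 0.0) -> str:
    """Pick the highest-valued option after a random mood perturbation."""
    perturbed = {
        name: value + random.uniform(-mood_noise, mood_noise)
        for name, value in options.items()
    }
    return max(perturbed, key=perturbed.get)

if __name__ == "__main__":
    options = {"solve_problem": 0.6, "storm_off": 0.5}
    print(choose(options))                  # always solve_problem
    print(choose(options, mood_noise=0.3))  # sometimes storms off
```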