Autonomous Systems

Would you ever own an autonomous vehicle?

  • Hell yeah! Bring on our robotic overlords!

  • Fuck you! I'll keep my Independence



Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
Why is it "scary" that a car loan company can locate and disable a car that you haven't paid them for?
I'm not scared by it, but the loss of control and the possibility for abuse is there.

Even more so than my house I feel like my car is my domain. I can get in, pump it full of gas and just go anywhere I damn well please and nobody can really stop me effectively.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
No deal. Self driving cars have been available on the market since the 90s.
 

Borzak

Bronze Baron of the Realm
24,585
31,885
They may not even have to actually ban human driving. If most of their claims wind up being caused by human drivers, car insurance companies are going to pretty quickly either refuse to insure manually driven cars or price them at rates so punitive that, de facto, if you want a car and want it insured you'll let the robots do the driving.

If the state requires you to have auto insurance to drive legally and you can't get insurance if you drive manually, they've basically banned human drivers without ever coming out directly and doing it.

I can see insurance companies lobbying against drone cars. If humans aren't allowed to drive anymore there goes their market.
 

mkopec

<Gold Donor>
25,389
37,457
The problem is that there will still be accidents. Due to software malfunction, power going wonky, whatever. So who's at fault then? Clearly not the driver at that point, right?

I for one welcome this. I can imagine a time where my drive to work or my vacation spot is filled with stress-free reading, web surfing, whatever. I just don't see this happening for a long time, and even then you will still have your holdouts fucking it up because they won't trust it.
 

khorum

Murder Apologist
24,338
81,363
I can see insurance companies lobbying against drone cars. If humans aren't allowed to drive anymore there goes their market.

Quite the opposite. They make money by betting on risk. If you still have to pay premiums on your car and autonomous driving inevitably diminishes that risk then it's all upside for them.

Consumers should get lower premiums on autonomous cars as they show more and more safety improvement over meatbag cars.

In fact, a rise in premiums on meatbag cars may end up acting like a soft ban on them, as the risk pool grows smaller and smaller their premiums will get more and more prohibitive. Kinda like premiums for health insurance as a diabetic with pancreatic cancer.
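The shrinking-pool math works out something like this. Toy sketch with made-up dollar figures, just to show the adverse-selection spiral:

```python
# Toy model of a shrinking manual-driver risk pool. All numbers invented.
# As the safest drivers switch to autonomous cars, the remaining manual
# pool is riskier on average, so the break-even premium per driver climbs.

def avg_premium(expected_losses: list[float]) -> float:
    """Break-even premium: average expected annual loss across the pool."""
    return sum(expected_losses) / len(expected_losses)

# Year 0: mixed pool of manual drivers (expected annual loss per driver, $).
pool = [300, 400, 500, 800, 2000]
print(round(avg_premium(pool)))  # -> 800

# Year N: the low-risk drivers have moved to autonomous cars.
pool = [800, 2000]
print(round(avg_premium(pool)))  # -> 1400
```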
 

sadris

Karen
<Donor>
21,131
80,758
Ehh insurance will still exist, it will just be Google paying the premiums, not Joe Bob.
 

mkopec

<Gold Donor>
25,389
37,457
In fact, a rise in premiums on meatbag cars may end up acting like a soft ban on them, as the risk pool grows smaller and smaller their premiums will get more and more prohibitive. Kinda like premiums for health insurance as a diabetic with pancreatic cancer.

That's a good point. Kind of like insuring them out of the meat cars and into the self-driving ones. Never thought of that.
 

Aldarion

Egg Nazi
8,924
24,377
You fuckers are deeply concerned about the privacy of digital info on your cell phones -- something I've come to accept but still can't begin to relate to -- but you're all ready to hand over control of your cars themselves to networked computers. Which would be fine, of course, except when they mandate it for the rest of us.

The car has been a symbol of independence in American culture for a century. In 2016 it's fuck independence, let the DMV drive my car so I can tweet while I commute.
 

Brad

Trakanon Raider
198
480
Absolutely after autonomous vehicles fully mature.

I think that time period where almost everyone has an autonomous vehicle will be rather short as it should be replaced by vehicles you can just hail, making owning one pretty much pointless.

The same will happen for many other items.
 

Chukzombi

Millie's Staff Member
71,669
212,888
Even if eventually all the big companies stop insuring human drivers, there will still be antique auto insurance, which covers cars 25+ years old. I used to have it on my '69 Plymouth. Was like 30 bucks a year.
 

Borzak

Bronze Baron of the Realm
24,585
31,885
Absolutely after autonomous vehicles fully mature.

I think that time period where almost everyone has an autonomous vehicle will be rather short as it should be replaced by vehicles you can just hail, making owning one pretty much pointless.

The same will happen for many other items.

Have you seen how some people keep the inside of their cars? I'd rather not share a car with them when there's no human interaction making them keep their shit out of the car lol, and it literally may be keeping their "shit" out of the car.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
Absolutely after autonomous vehicles fully mature.

I think that time period where almost everyone has an autonomous vehicle will be rather short as it should be replaced by vehicles you can just hail, making owning one pretty much pointless.

The same will happen for many other items.
Unlikely. The car is people's second home, not just a transportation device. Cars are also inexpensive to own and operate for most people.

And for people outside very dense areas you aren't just going to "hail" a car. If I need to go to the 24 hour drugstore at midnight for some formula, I'm not going to wait around for the nearest UberCar to come pick me up.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
The problem is that there will still be accidents. Due to software malfunction, power going wonky, whatever. So whose at fault there, clearly not the driver at that point, right?

I for one welcome this. I can imagine a time where my drive to work or my vacation spot is filled with stress free reading, web surfing, whatever. I just dont see this happening for a long time, and even then you will still have your holdouts fucking it up because they will not trust it.
This will be an interesting question. Shit will happen and people will die, but in many cases a failure of the autonomy system will be due to a software defect, poor tuning, sensor failure or bad design. Up to this point the scope of what those vehicles were trying to accomplish was narrow enough that a bad throttle pedal, a bad airbag design, improper floor mats etc. were enough to cause a recall and settlements. But those were components expected to work 100% of the time. What about autonomy systems that are only expected to work 99.99% of the time?

This happened recently:
Understanding the fatal Tesla accident on Autopilot and the NHTSA probe

Tesla said:
What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.
So basically, in order to avoid a false positive and stopping in the middle of the freeway for every low-hanging sign or billboard, their software classified objects like that trailer as billboards, and now someone is dead. You know there's a dude who built that system and tuned the minimum height of dangerous obstacles, and he fucked up, and that's on him. As someone in that business, that is terrifying.
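To make that tradeoff concrete, here's a toy sketch. The names and thresholds are completely made up, nothing like Tesla's actual code:

```python
# Illustrative sketch of the false-positive vs false-negative tradeoff in
# obstacle classification. All names and numbers here are invented.

def classify_return(bottom_edge_height_m: float,
                    min_obstacle_height_m: float = 2.5) -> str:
    """Classify a sensor return by the height of its lowest edge.

    Anything whose bottom edge sits above min_obstacle_height_m is treated
    as an overhead sign/billboard and ignored, so the car doesn't slam the
    brakes for every gantry it passes under.
    """
    if bottom_edge_height_m >= min_obstacle_height_m:
        return "overhead_structure"  # ignored: assumed to clear the roof
    return "obstacle"                # braked for

# A highway sign gantry: correctly ignored.
print(classify_return(5.5))  # -> "overhead_structure"
# A stopped car: correctly treated as an obstacle.
print(classify_return(0.0))  # -> "obstacle"
# A high-riding trailer whose bottom edge sits above the threshold:
# misclassified as an overhead structure -- the failure mode above.
print(classify_return(2.8))  # -> "overhead_structure"
```

Raise the threshold and you brake for billboards; lower it and you pass under trailers. Somebody has to pick the number.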

So who is liable for that? I don't know. I don't know if the family of the deceased is going to sue Tesla, and we might only see that question get answered in the courts if they do and Tesla doesn't just immediately settle.

One of the big problems with autonomy is that people are way too willing to trust it. We're used to seeing humans learn something and once they nail it a few times they're good. Then users try out an autonomy system, see it stop for an obstacle cleanly a few times and they have absolute trust in it. There's always a huge learning experience when newbies see a catastrophic failure of well-tested autonomy systems happen in the field. Yeah, that thing you've watched perform great for 100 hours just went fucking bonkers, you'll never trust it again.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
Oh, and my prediction on how the above question will play out: for decades OEMs will tell users, "Don't take fucking naps. Pay attention. If the autonomy system fails and you don't catch it, it's on your dumb ass."

Much to Aldarion's dismay, a big part of autonomy development is a camera in the cabin monitoring the driver. In cars that actually care about safety (e.g. Volvo), expect to see a nanny-cam that bitches and moans when it detects the driver isn't paying attention. After enough autonomy failures this might become a government-required feature.
 

kaid

Blackwing Lair Raider
4,647
1,187
Quite the opposite. They make money by betting on risk. If you still have to pay premiums on your car and autonomous driving inevitably diminishes that risk then it's all upside for them.

Consumers should get a lower premium with autonomous cars as they start showing more improvements on safety over meatbag cars.

In fact, a rise in premiums on meatbag cars may end up acting like a soft ban on them, as the risk pool grows smaller and smaller their premiums will get more and more prohibitive. Kinda like premiums for health insurance as a diabetic with pancreatic cancer.

Yup, and you would still want car insurance. Accidents would still happen from weather or mechanical failure or other disasters. But it's win-win: even collecting lower premiums works out if you almost never have to pay out for anything other than minor shit like broken windshields and the like.
 

kaid

Blackwing Lair Raider
4,647
1,187
This will be an interesting question. Shit will happen and people will die, but in many cases a failure of the autonomy system will be due to either a software defect, poor tuning, sensor failure or bad design. Up to this point the scope of what those vehicles were trying to accomplish were narrow enough that a bad throttle pedal, a bad airbag design, improper floor mats etc were enough to cause a recall and settlements. But we're talking about systems that are expected to work 100% of the time. What about autonomy systems that are expected to work 99.99% of the time?

This happened recently:
Understanding the fatal Tesla accident on Autopilot and the NHTSA probe


So basically, in order to avoid a false positive and stop in the middle of the freeway for every low-hanging sign/billboard, their software classifies objects like that trailer as a billboard and now someone is dead. You know there's a dude who made that system, tuned the min-height of dangerous obstacles, and he fucked up and that's on him. As someone in that business that is terrifying.

So who is liable for that? I don't know. I don't know if the family of the deceased is going to sue Tesla, and we might only see that question get answered in the courts if they do and Tesla doesn't just immediately settle.

One of the big problems with autonomy is that people are way too willing to trust it. We're used to seeing humans learn something and once they nail it a few times they're good. Then users try out an autonomy system, see it stop for an obstacle cleanly a few times and they have absolute trust in it. There's always a huge learning experience when newbies see a catastrophic failure of well-tested autonomy systems happen in the field. Yeah, that thing you've watched perform great for 100 hours just went fucking bonkers, you'll never trust it again.


This is why most companies are going with LIDAR as the primary sensor instead of radar, which I believe Teslas use. It is more expensive but better at building detailed maps of the environment, which helps avoid stuff like that. The first gen is going to be like an advanced cruise control: you can have it drive for you, but it should still have a steering wheel and pedals, and you'll be expected to pay at least some attention to help in case of failure.

Some of the failures also get mitigated as adoption increases. The cars should in theory talk to each other, at least to send intent signals like "I intend to make this lane change," so the other cars get that notice and react accordingly. In the case of running into a semi, if both vehicles were fully set up for this, the semi would have been warning the car that it was closing too fast on its position, which would have alerted both the semi and the car to take corrective action.
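Something like this, in toy form. The message format is invented for illustration; real V2V stacks (e.g. SAE J2735 Basic Safety Messages over DSRC) carry way more state than this:

```python
# Toy sketch of vehicle-to-vehicle intent signalling. Format invented.

from dataclasses import dataclass

@dataclass
class IntentMessage:
    sender_id: str
    action: str        # e.g. "hard_brake", "lane_change_left"
    position_m: float  # distance along the road
    speed_mps: float

def react(ego_position_m: float, ego_speed_mps: float,
          msg: IntentMessage) -> str:
    """Decide how the ego car reacts to a broadcast intent."""
    gap_m = msg.position_m - ego_position_m
    closing_mps = ego_speed_mps - msg.speed_mps
    # If we're closing on a vehicle ahead that intends to brake or merge
    # into our path, slow down before it becomes an emergency.
    if 0 < gap_m < 100 and closing_mps > 0 and \
            msg.action in ("hard_brake", "lane_change_left"):
        return "decelerate"
    return "maintain"

# A semi 60 m ahead broadcasts that it's about to brake hard while we
# close on it at +10 m/s: the ego car backs off early.
truck = IntentMessage("semi_42", "hard_brake", position_m=60.0, speed_mps=20.0)
print(react(ego_position_m=0.0, ego_speed_mps=30.0, msg=truck))  # -> "decelerate"
```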

There are going to be teething pains, and some of those will likely result in some bad accidents. But given the number of accidents and fatalities that happen every day due to drunk/stupid/distracted/enraged/sleepy drivers, it likely winds up being a huge gain for safety long term.
 

Tuco

I got Tuco'd!
<Gold Donor>
45,411
73,480
Yep, and the upcoming lidars are going to put the price point in the same range as the radar/cams they're using now. Tesla is trying to do a lot with just a radar and a cam.
 

kaid

Blackwing Lair Raider
4,647
1,187
This kind of stuff has come a LONG way in the last couple decades, although I still think it's going to take longer to reach final maturity than I would like. But a TON of big businesses are really cranking hard at this tech, so I hope that by the time I'm my mom's age I can get a car that takes me where I need to go. Mom is going fewer and fewer places on her own because she trusts her driving abilities less and less. Automated cars like this would be a huge boon to elderly and disabled people who still want to get around.