Are we really doing this?
Yes.
http://www.nhtsa.gov/staticfiles/rul...les_Policy.pdf
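For quick reference, that NHTSA policy defines five levels of vehicle automation (0 through 4), which the rest of this answer leans on. Here's a minimal Python sketch of the taxonomy, with descriptions paraphrased from the policy:

```python
# NHTSA (2013) vehicle automation levels, paraphrased from the policy linked above.
NHTSA_LEVELS = {
    0: "No-Automation: the driver is in complete control at all times",
    1: "Function-specific Automation: one or more individual functions automated (e.g. cruise control)",
    2: "Combined Function Automation: at least two primary functions automated together (e.g. adaptive cruise plus lane centering)",
    3: "Limited Self-Driving Automation: the car drives itself under certain conditions; the driver must be available to take over",
    4: "Full Self-Driving Automation: the vehicle performs all safety-critical driving for the entire trip",
}

def describe(level: int) -> str:
    """Return the short NHTSA description for an automation level (0-4)."""
    return NHTSA_LEVELS[level]

print(describe(2))
```

So when people talk about the jump from level 2 to level 3 below, it's the jump from "two functions combined, driver always responsible" to "the car handles the driving task itself in some conditions."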
We've had level 1 automation for years.
Tesla is rolling out level 2 automation with their supercruise-style system (lane following plus adaptive cruise control that holds speed relative to whatever is in front of you) in consumers' hands.
Bosch and Delphi are releasing their supercruise systems this year on a few OEMs' vehicles. Other Tier 1 suppliers are playing catch-up to them. Ford recently started a massive effort toward catching up to Google.
Every sensor manufacturer (lidar, GPS, radar, sonar, IMU, mono/stereo/infrared camera) is trying to improve its systems and reduce their cost. Even though a person with one eye can drive just fine, these vehicles need a MASSIVE amount of sensing for fully autonomous driving. Lidars are generally the most expensive perception sensors, but they provide a lot of functionality where radar, sonar, and cameras fail. I'm pretty excited about Quanergy (Quanergy | Future of Mapping and Navigation), who are making a solid-state lidar.
Computing hardware is improving nicely with Moore's law, which is making the immense processing task of autonomous driving much more tractable. The early autonomous vehicles you might have seen in the 2007 DARPA Urban Challenge were basically SUVs with huge racks of computing hardware. Improvements to our processors have shrunk that down nicely, and further improvements (plus refinements to our algorithms and custom processing hardware for intensive tasks like object recognition, ray tracing, etc.) will let that processing fit on traditional ECMs.
The problem with ALL automation, from factories to Roombas to roadways, is this: it gets harder the closer you get to covering 100% of the situations you encounter. And roadways feature a massive variety of problems, from construction to pedestrians to weather conditions to other drivers. I was at a presentation last year by one of the leaders of Google's autonomous car effort, Chris Urmson, who said they once found the car blocked by an old lady in a wheelchair chasing ducks.
Tesla only needs a forward-facing camera plus radar for their system. The increase in sensor payload needed to get from level 2 to level 3 is massive.
So just because we have solved a certain level of automation does not mean that future levels are imminent.
All that being said, don't expect level 3 automation (what Google has) to reach consumers this decade the way Tesla's level 2 system already has. The gains will be hard fought, and many of those new cases will require technological breakthroughs in both software and hardware.
Here's a good presentation on on-road autonomy:
Chris Urmson: How a driverless car sees the road | TED Talk | TED.com