
RKP5637

(67,112 posts)
Fri Jul 1, 2016, 09:38 AM Jul 2016

Tesla driver dies in first fatal crash while using autopilot mode

https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk?utm_source=esp&utm_medium=Email&utm_campaign=GU+Today+USA+-+morning+briefing+2016&utm_term=179979&subid=12122475&CMP=ema_a-morning-briefing_b-morning-briefing_c-US_d-1

The first known death caused by a self-driving car was disclosed by Tesla Motors on Thursday, a development that is sure to cause consumers to second-guess the trust they put in the booming autonomous vehicle industry.

The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio put his Model S into Tesla’s autopilot mode, which is able to control the car during highway driving.

Against a bright spring sky, the car’s sensor system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said. The car attempted to drive full speed under the trailer, “with the bottom of the trailer impacting the windshield of the Model S”, Tesla said in a blogpost.

A police report in the Levy County Journal said the top of the vehicle “was torn off by the force of the collision”.
16 replies

The Animator

(1,138 posts)
1. Based on what I know about the present state of AI
Fri Jul 1, 2016, 09:54 AM
Jul 2016

and my wife's research on the subject: two things that the human brain can do that AIs currently can't do, or can only do with great difficulty, are
1) Pattern recognition, and
2) Making predictions based on those patterns.

These two capabilities are absolutely essential to defensive driving: maintaining situational awareness of the drivers around you in all directions, being able to spot an aggressive driver as a potential problem that warrants more caution, and calculating the numerous stupid decisions any given driver could make and the best way to avoid contact with said stupidity... On the road, with human drivers at high speed, a vehicle that bases its reactions on logic is ill prepared for unexpected scenarios it wasn't programmed to handle.

frazzled

(18,402 posts)
7. That's my whole fear about this technology
Fri Jul 1, 2016, 10:24 AM
Jul 2016

Why didn't he see the tractor-trailer and quickly override the automatic system? Probably because, due to a false sense of security, he was not being fully alert. It's bad enough right now with people sneaking peeks at their phones or outright using them to text as they drive. With self-driving cars, that will only increase a million-fold.

RKP5637

(67,112 posts)
15. It certainly seems this might have played a big part ... "a false sense of security, he
Fri Jul 1, 2016, 12:31 PM
Jul 2016

was not being fully alert."

 

SheilaT

(23,156 posts)
4. I know very little about AI, but I know a reasonable amount
Fri Jul 1, 2016, 10:06 AM
Jul 2016

about driving.

When my sons were first learning to drive, I could say things like, "That truck ahead of you is going to change lanes, so slow down a bit." The truck wouldn't have its turn signal on, but it would change lanes, and my sons would think I was psychic. Not at all; I was just experienced enough to recognize the body language of vehicles on the highway.

And I doubt I'm any better at this than the typical driver. But it's what comes with long experience and the very human ability to recognize patterns, including and perhaps especially uncommon patterns.

I have no idea if AI can learn to do those things as well as humans. Perhaps when we get to the point where ALL vehicles are AI driven, then such accidents as described in the OP will be impossible because the AIs will all communicate with each other. But we're an incredibly long way from that happening.

tallahasseedem

(6,716 posts)
5. You're exactly right!
Fri Jul 1, 2016, 10:09 AM
Jul 2016

This happens especially if you drive a lot. I thought I was a pretty good driver until I moved to a large city and had to battle that kind of traffic day after day. You start to use spidey senses you never thought you had!

 

SheilaT

(23,156 posts)
6. Yes, especially in larger cities.
Fri Jul 1, 2016, 10:12 AM
Jul 2016

I have never driven in NYC or Chicago, and never want to. I have driven in Los Angeles, though that was at least thirty years ago, and I hope never to do it again. I have driven through San Francisco and wasn't crazy about it.

I currently live in Santa Fe, and the traffic there is rarely difficult, although people run red lights in a way I've never seen anywhere else.

daleo

(21,317 posts)
10. This has nothing to do with electric vs internal combustion, though
Fri Jul 1, 2016, 12:03 PM
Jul 2016

That hardly needs to be stated, but no doubt people with a vested interest in gasoline engines and fossil fuels will jump on this, saying that it shows how unsafe electric cars are.

 

Lee-Lee

(6,324 posts)
11. The ensuing court cases and liability issues may determine the fate of self driving cars
Fri Jul 1, 2016, 12:14 PM
Jul 2016

Who is liable in a self driving car accident?

Just the driver?

Or the company?

Or the coder who wrote the software that caused the crash?

RKP5637

(67,112 posts)
14. And also the decision model on what to do in a life/death situation ...
Fri Jul 1, 2016, 12:28 PM
Jul 2016

Save the driver? Crash into a wall? Run over pedestrians? Or let the driver die? Very hard modeling for coders, IMO. I would not want to be managing that team.
