General Discussion
Tesla driver dies in first fatal crash while using autopilot mode
https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk?utm_source=esp&utm_medium=Email&utm_campaign=GU+Today+USA+-+morning+briefing+2016&utm_term=179979&subid=12122475&CMP=ema_a-morning-briefing_b-morning-briefing_c-US_d-1
The 7 May accident occurred in Williston, Florida, after the driver, Joshua Brown, 40, of Ohio, put his Model S into Tesla's autopilot mode, which is able to control the car during highway driving.
Against a bright spring sky, the car's sensor system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said. The car attempted to drive at full speed under the trailer, with the bottom of the trailer impacting the windshield of the Model S, Tesla said in a blogpost.
A police report in the Levy County Journal said the top of the vehicle was torn off by the force of the collision.
The Animator
(1,138 posts)and my wife's research on the subject. Two things that the human brain can do that AIs currently can't do, or can only do with great difficulty, are
1) Pattern Recognition and
2) Making predictions based on that pattern.
These two shortcomings are in skills that are absolutely essential to defensive driving: situational awareness of the drivers around you in all directions; being able to spot an aggressive driver as a potential problem that warrants more caution; calculating the numerous stupid decisions any given driver could make, and the best way to avoid contact with said stupidity... On the road, with human drivers at high speed, a vehicle that bases its reactions on logic is ill-prepared for unexpected scenarios it wasn't programmed to handle.
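A toy sketch of the pattern-recognition-and-prediction idea described above: flag a nearby driver as aggressive from a few observed cues, then adjust following distance. Every feature name and threshold here is hypothetical, invented purely for illustration; real driver-behavior models are far more sophisticated.

```python
# Hypothetical sketch: classify a neighboring driver as "aggressive"
# from a few observed features, then widen the following gap.
# Feature names and thresholds are invented for illustration only.

def is_aggressive(lane_changes_per_min: float,
                  following_gap_s: float,
                  speed_delta_kmh: float) -> bool:
    """Crude rule-based pattern: frequent lane changes, short
    following gaps, and a large speed surplus suggest aggression."""
    score = 0
    if lane_changes_per_min > 2:
        score += 1
    if following_gap_s < 1.0:     # gap to car ahead, in seconds
        score += 1
    if speed_delta_kmh > 20:      # km/h above surrounding traffic
        score += 1
    return score >= 2

def safe_gap_s(aggressive: bool) -> float:
    """Leave a larger time gap behind drivers flagged as aggressive."""
    return 4.0 if aggressive else 2.0

print(safe_gap_s(is_aggressive(3.0, 0.8, 25)))  # erratic neighbor -> 4.0
print(safe_gap_s(is_aggressive(0.5, 2.5, 5)))   # calm neighbor -> 2.0
```

The hard part the post identifies is exactly what this sketch dodges: a fixed rule set can only score patterns someone thought to encode, whereas an experienced human recognizes novel, uncommon patterns.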
RKP5637
(67,112 posts)mindfulNJ
(2,367 posts)liberalla
(9,277 posts)frazzled
(18,402 posts)Why didn't he see the tractor-trailer and quickly override the automatic system? Probably because due to a false sense of security, he was not being fully alert. It's bad enough right now with people sneaking peeks or outright using their phones or texting as they drive. With self-driving cars, that will only increase a million-fold.
RKP5637
(67,112 posts)was not being fully alert."
SheilaT
(23,156 posts)about driving.
When my sons were first learning to drive, I could say things like, "That truck ahead of you is going to change lanes, so slow down a bit." The truck wouldn't have its turn signal on, but it would change lanes, and my sons would think I was psychic. Not at all, just experienced enough to recognize the body language of vehicles on the highway.
And I doubt I'm any better at this than the typical driver. But it's what comes with long experience and the very human ability to recognize patterns, including and perhaps especially uncommon patterns.
I have no idea if AI can learn to do those things as well as humans. Perhaps when we get to the point where ALL vehicles are AI driven, then such accidents as described in the OP will be impossible because the AIs will all communicate with each other. But we're an incredibly long way from that happening.
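The all-vehicles-communicating scenario mentioned above might look something like cars broadcasting their intentions to each other before acting. The message format below is purely hypothetical, invented for illustration; real vehicle-to-vehicle systems (e.g. the SAE Basic Safety Message) define their own formats.

```python
# Hypothetical sketch of vehicle-to-vehicle intent broadcasting.
# Message fields are invented; real V2V standards define their own.
from dataclasses import dataclass

@dataclass
class IntentMessage:
    vehicle_id: str
    action: str           # e.g. "lane_change_left", "braking"
    seconds_ahead: float  # how soon the action begins

def should_slow_down(msg: IntentMessage, my_lane_is_left: bool) -> bool:
    """A receiving car slows if a neighbor announces a merge into
    its lane within the next two seconds -- no turn signal or
    'body language' reading required."""
    return (msg.action == "lane_change_left"
            and my_lane_is_left
            and msg.seconds_ahead < 2.0)

msg = IntentMessage("truck-42", "lane_change_left", 1.5)
print(should_slow_down(msg, my_lane_is_left=True))  # True
```

With explicit intent messages, the lane-changing-truck scenario stops being a prediction problem at all, which is the point of the all-AI-traffic argument.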
tallahasseedem
(6,716 posts)This happens especially if you drive a lot. I thought I was a pretty good driver until I moved to a large city and had to battle that kind of traffic day after day. You start to use spidey senses you never thought you had!
SheilaT
(23,156 posts)I have never driven in NYC or Chicago, and never want to. I have driven in Los Angeles, but that was at least thirty years ago, and I hope never to do it again. I have driven through San Francisco and wasn't crazy about it.
I currently live in Santa Fe, where the traffic is rarely difficult, although people run red lights in a way I've never seen anywhere else.
TampaAnimusVortex
(785 posts)Read this article about it.
http://gizmodo.com/fatal-tesla-crash-proves-full-autonomy-is-the-only-solu-1782923424
RKP5637
(67,112 posts)daleo
(21,317 posts)That hardly needs to be stated, but no doubt people with a vested interest in gasoline engines and fossil fuels will jump on this, saying that it shows how unsafe electric cares are.
RKP5637
(67,112 posts)Lee-Lee
(6,324 posts)Who is liable in a self driving car accident?
Just the driver?
Or the company?
Or the coder who wrote the software that caused the crash?
RKP5637
(67,112 posts)save the driver, crash into a wall, run over pedestrians, or let the driver die? Very hard modeling for coders, IMO. I would not want to be managing this team.
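A toy illustration of why that modeling is so hard: even the simplest cost-weighted chooser forces someone to write down explicit numeric weights for human outcomes. All weights and probabilities below are arbitrary, invented for illustration; choosing them is the ethical problem, not something the code can answer.

```python
# Hypothetical sketch: pick the maneuver with the lowest expected harm.
# The harm weights are arbitrary -- deciding them IS the hard part
# the post describes, and no algorithm settles it.

HARM = {
    "occupant_injury":   1.0,
    "pedestrian_injury": 1.0,  # should this weigh more? less? who decides?
    "property_damage":   0.1,
}

def expected_harm(outcome_probs: dict) -> float:
    """Sum probability-weighted harm over possible outcomes."""
    return sum(p * HARM[outcome] for outcome, p in outcome_probs.items())

# Invented probabilities of each outcome, per candidate maneuver.
maneuvers = {
    "brake_straight":     {"occupant_injury": 0.3, "property_damage": 0.9},
    "swerve_to_wall":     {"occupant_injury": 0.7, "property_damage": 1.0},
    "swerve_to_sidewalk": {"pedestrian_injury": 0.5, "property_damage": 0.2},
}

best = min(maneuvers, key=lambda m: expected_harm(maneuvers[m]))
print(best)  # brake_straight, under these made-up numbers
```

Change any weight by a little and the "best" maneuver can flip, which is exactly why no one would want to manage the team that ships those numbers.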
Matrosov
(1,098 posts)Skynet?