Former Uber self-driving chief crashes his Tesla on FSD, exposes supervision problem
https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/
-
**VERY glad the guy and his kids are okay, but it would have been something else if the Uber self-driving chief had been incinerated or killed by a self-driving car.

-
"...What makes this account particularly striking is Krikorian’s background. At Uber’s Advanced Technologies Center, he ran the team building autonomous vehicles and trained human safety drivers on exactly when and how to intervene when a self-driving system fails...."

-
LOL this is the problem with relying on AI tools, as well...
"...His core argument: Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.
The research backs this up. Psychologists call it the “vigilance decrement”, monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that...."
-
@ai6yr It's solved for pilots. It's called "training" and it works fine.
Oh wait ...
-
@ai6yr Having said which:
(1) When I've taken passengers flying I've trained them in opening the door as part of the safety briefing.
(2) When I have overnight visitors I show them how to open all the outside doors in case of fire.
(3) It has *never* occurred to me that I need to give *car* passengers a safety briefing on how to open the door.
-
@TimWardCam @ai6yr I recently thought about that while hitching a ride with a friend in a Mustang EV (and they actually weren't 100% sure about the manual override instructions!). Here's an article with instructions for most of the EVs on the market. https://www.consumerreports.org/cars/car-safety/how-to-escape-your-car-if-the-electronic-door-release-fails-a8152892189/
-
@soundstruck @TimWardCam Holy crapola
"... Bloomberg investigations found 140 reports of occupants trapped in Tesla vehicles with electronic door handles, some of which resulted in severe injuries, and 15 deaths in crashes where a Tesla’s electronic doors would not open...."
-
@soundstruck @TimWardCam (that said... it's got to be more than that of people stuck. My son is a first responder and he said he's already been on scenes more than a few times where they can't get someone out of a Tesla because of lack of battery power, and have to break the window)