Tesla driver operating car on autopilot in deadly crash ordered to stand trial
Los Angeles driver to be tried on two counts of vehicular manslaughter

Kevin George Aziz Riad, 27, will be tried on two counts of vehicular manslaughter, Fox 11 LA reports.

Police reportedly said his Tesla Model S took a freeway exit and ran a red light before plowing into a Honda Civic Dec. 29, 2019.

The crash killed Gilberto Alcazar Lopez, 40, and Maria Guadalupe Nieves-Lopez, 39, who were on a first date the night of the accident.

The car’s Autosteer and Traffic-Aware Cruise Control features were activated at the time of the crash, according to prosecutors.

A Tesla engineer also testified that sensors indicated Riad had a hand on the steering wheel.

Crash data reportedly showed that no brakes were applied in the six minutes leading up to the accident.

Tesla officials have said that Autosteer and the more sophisticated “full self-driving” feature cannot drive the vehicle entirely on their own, and that drivers must monitor traffic conditions and be ready to respond.

Let me see if I understand the situation.

Tesla sells a car with a “full self-driving” mode.

Tesla says that full self-driving isn’t actually fully self-driving, and the driver still has to drive the car because the full self-driving mode can malfunction and cause an accident.

If the driver doesn’t understand that full self-driving isn’t actually fully self-driving and causes an accident, the driver who was trusting the full self-driving mode is legally liable.

So why buy a self-driving car?

It encourages you not to pay attention, but requires you to pay attention, all while making you legally liable for its malfunctions.

This seems like more of a risk than not having it at all.

If people can’t trust their self-driving system and will go to jail if their self-driving system makes a mistake, self-driving is dead.


By J. Kb

11 thoughts on “This case could destroy the self driving car”
  1. This starts to speak to a question I’ve had. If you have a real self-driving car, do you need to have liability insurance, as you are not controlling it? If there’s an accident, does it fall on the vendor who sold and (presumably) certified it as safe?
    Because if not, as you say, why on earth would you buy one?

  2. It’s one of many reasons why I don’t trust the “driver aids.” A couple of vehicles before my current truck were high-trim-level Fords with the highest tech level at the time (2011, then 2017). None of it worked right: adaptive cruise control, collision avoidance, active braking, parking assist, lane-keep assist. Either the tech didn’t work properly or it was so frustrating to use that it never got used.
    A recent example: a buddy got a new Ford with the high-end adaptive cruise control. It can read speed limit signs, and you can program a threshold above/below whatever the speed limit is. He had it set for ~5 mph over and all was good, until the camera in the car saw a sign for Florida State Route 85 and reset the cruise to 90 mph.

    I’m good with my slightly-above-work-truck-spec new truck.

    1. One of the things I dislike about frequently traveling for work is getting a different rental car each trip, usually with its own special combination of driver-aid crap.
      My wife’s vehicle, a 2018, has a full complement: adaptive cruise control, collision warning, auto-braking, lane departure, et cetera. The blasted thing will ping about 10-15 times on a 6-mile drive to, say, a local dog park. It’s not that my wife is a bad driver at all; it’s just that roads aren’t perfect. A redone striping job will trigger the lane-departure warning, a curve in the road fakes out the “imminent collision” sensor into seeing an oncoming car, and so forth. At this point it’s pure “boy who cried wolf” syndrome … right until it actually increases the danger by applying the brakes unnecessarily and at an inappropriate time.

  3. Back in the ’70s, Winnebago motor homes got cruise control. The ad said “it practically drives itself.” You know what happens next: guy sets the cruise control and goes back to make a sandwich…. He won his lawsuit because “practically drives” was in the ad. I have a new Ford Transit 250 work gave me; I don’t use any of the driving “tech.” It makes people dumber.

  4. Actual self-driving won’t be properly viable until the next decade. In addition to the proper sensors and software, V2X and V2V standards and infrastructure need to be in place for self-driving to have greater awareness than the vehicle’s own sensors provide.

  5. The job of an aircraft autopilot is probably 100 times easier than that of a totally autonomous car, and yet they have problems, too.

    One of the biggest problems, and what may be behind this one, is the attention gap between actually driving and just sitting there watching things. The driver was in the seat, hand on the wheel, feet in position, but his mind was a thousand miles away. When he suddenly needs to pay attention, he needs to be 100% at attention, “in the moment,” and ready to do whatever he has to do to avoid the accident. People just aren’t good at transitioning that fast. We have enough accidents from failing to pay attention as it is. This is a risky time to be around those cars.

    1. I very much doubt that self-driving cars will ever happen. Self-driving flying cars, that’s a possibility.
      The fact that people seem to be into using “AI” for these projects only reinforces my belief that it will not, and cannot, happen.

  6. We can’t even have self-driving trains, which have only three directional settings to select: Go, Stop, and Back Up.
    “Self-Driving” cars are the LZ Hindenburg waiting to happen.

    One of them will have to take out a crowd, or kill a schoolbus-load of kids to get there, but they’re going to keep trying until the idea is banned for being exactly as moronic and idiotic as it is and likely always will be, short of forcing you to stay inside your human hamster cage 24/7/365/forever, run on your exercise wheel, and get your government-supplied food pellet each day.

    1. Self-driving trains are easy, but the union bosses would not approve them.
      It’s interesting, though, that history has shown some train drivers are stupider than computers.

