It may come as no surprise to discover that the automaker Tesla has been sued again. Another crash involving a self-driving vehicle left a woman with a broken foot. According to the lawsuit, the woman believed that the car was driving on its own and would stop if the need arose. As such, she decided to check her phone while Autopilot was engaged. The car crashed into a fire engine.
It may be pure coincidence, but this is the second report of a vehicle on Autopilot crashing into a fire engine. In the first accident, in early 2018, the driver luckily walked away unscathed. In yet another incident, a man was found passed out behind the wheel of his self-driving car. It raises the question: Can we trust self-driving cars yet?
Before they were released to the public, self-driving cars carried with them an expectation: Vehicle collisions would drop because driver error would be taken out of the equation. So far, that expectation is not exactly meeting reality. The cars brought with them visions of being able to do just about anything we wanted behind the wheel while our cars zipped us from here to there without any concern for safety. Unfortunately, we aren’t quite up to speed with the Jetsons just yet.
Self-driving cars can be put on Autopilot, but they aren’t meant to drive without a human paying close attention to what is happening. In essence, you can sit back and relax a bit, but you’ve got to be ready to take over the controls at a moment’s notice. When the vehicle alerts you to some sort of danger or you notice a hazard yourself, you’ve got to disengage Autopilot and start driving on your own.
For its part, Tesla says that Autopilot accidents do not mean the car manufacturer is to blame. CEO Elon Musk has pointed out that we aren’t in the autonomous stage yet. Autopilot is not the same as autonomous driving: You still can’t shift your attention away from the road.
Does that get Tesla off the hook? Not exactly. Driving on autopilot lulls people into a false sense of security. They believe they can shift their attention from the road and onto other things. Tesla perhaps should have anticipated this. It’s an interesting legal argument, if nothing else.
The company recently published its very first voluntary safety report, making claims that, to some, mean very little. According to the report, there was one crash-like event for every 3.34 million miles driven with Autopilot engaged. Drivers who were in full control of their Teslas experienced such events at a rate of one for every 1.92 million miles.
Despite releasing this report, the company remains tight-lipped about what exactly counts as a “crash-like event.” The report said nothing about crash severity or the number of injuries. It also didn’t include what may have caused the crashes or when and where they occurred.
Experts point out that this is not a safety report at all. Instead, it’s a few data points that have been released by the company to back up previous claims. It misses key points and lacks both context and detail.
Interestingly, the report was released just a day after Consumer Reports ranked cars with semi-autonomous systems. Tesla’s Autopilot system came in second behind Cadillac’s Super Cruise.
If you have dreams of pushing a button and having your car take over your responsibilities, it may be time to wake up. Unless you have ridden in a Tesla or know someone who owns one, chances are you don’t have a clear idea of how the feature operates. According to those in the know, this is how Autopilot really works:
A driver engages the feature by tapping a cruise control stalk. This will activate both auto-steering assistance and traffic-aware cruise control. Drivers will be advised to keep their hands on the wheel for safety purposes.
When you want to change lanes, activate your turn signal as usual and the car will take care of it for you. A video on the Tesla website reminds drivers that the features are only meant to be used on the highway and that drivers should remain alert at all times.
What many people don’t realize is that the car actually checks whether human hands are on the wheel. If the vehicle doesn’t detect that the wheel is being held, it will alert the driver and slow the vehicle until hands are placed on the wheel again. Tesla says this is to ensure that the feature is used safely.
If you have been involved in an accident with a vehicle on Autopilot, you have rights. Our car accident lawyers in Miami are ready to represent you and fight for compensation for your medical bills, lost wages and more. Call our office today to learn more about how we can assist you and to schedule your free case evaluation. Our team is here for you and ready to focus on your case.