
Defensive driving used to be about awareness of what other drivers were doing. With technological “progress,” those on the road now need to be concerned about what other cars are doing. What’s the difference, you ask? Well, with Tesla’s Autopilot feature, some of its cars are equipped to drive themselves, and that self-driving has resulted in collisions with stationary first responder vehicles. As a result, the National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla’s smashing Autopilot failures.
Tesla is an American electric car company based in Palo Alto, California. The company’s name is a tribute to inventor and electrical engineer extraordinaire Nikola Tesla. Even if you aren’t familiar with Tesla, certainly you have heard of the company’s high-profile CEO, Elon Musk. Yes, he’s aiming so high that he’s literally reaching for the stars with his SpaceX program. Meanwhile, back on the ground, Tesla and Musk have experienced some liftoff failures with the Autopilot system for Tesla electric cars.
The company’s Autopilot feature enables Tesla vehicles to steer, accelerate, and brake automatically within their lane. Nevertheless, Tesla manuals instruct drivers to keep their hands on the steering wheel when the vehicle is in Autopilot mode. And all drivers follow their car manual’s instructions to the letter, of course. Yeah, right. When was the last time YOU even opened your car’s owner’s manual? Just as I suspected….
Demonstrative Exhibit A as to why this instruction is given comes to us from a March 2018 crash of a Tesla in self-driving mode. Did the “driver” have his hands on the wheel? Nope. He had his hands and his eyes on his cell phone, playing games while the car was rolling down the road. We don’t know the outcome of the game on his phone, but we do know that it was the last game he ever played; the “driver” was killed in the crash. In another instance, a drunk driver was found in the back seat of his Tesla as it drove him down the road, the car apparently doing its best to help him avoid a DUI.
But human error (stupidity?) cannot be fingered in all the Tesla crashes under investigation. Tesla vehicles operating with this feature are reported to have repeatedly collided with stationary emergency vehicles such as police cars, ambulances, and fire trucks. These accidents typically occur after dark at scenes where “scene control measures” such as road cones, flares, illuminated arrow boards, and first responder vehicle lights are in use. For example, in January 2018, a Tesla struck a parked fire truck with its lights flashing. Maybe the Teslas are being “blinded by the light.” (Cue Manfred Mann music in the background.)
Don’t blame the poor Teslas. Blame their programmers. According to experts, the likely cause of these crashes is that the Autopilot system is programmed to pretty much ignore stationary objects. Why? If the system reacted to every stationary object it detected, the vehicle would brake for all sorts of things on the side of the road, such as signs and buildings. Methinks there needs to be some technology tweaking.
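For the technically curious, here is a rough, purely hypothetical Python sketch of the kind of filtering the experts are describing. The names, numbers, and logic are my own illustration, not Tesla’s actual code; the point is simply that a system which throws away stationary radar returns to avoid braking for signs will throw away a parked fire truck right along with them.

```python
# Hypothetical illustration only -- not Tesla's actual code or algorithm.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    label: str
    relative_speed_mph: float  # object's speed relative to our car (negative = closing on us)

def ground_speed(obj: RadarReturn, ego_speed_mph: float) -> float:
    """Estimate the object's speed over the ground from our own speed."""
    return ego_speed_mph + obj.relative_speed_mph

def objects_to_track(returns, ego_speed_mph, min_speed_mph=2.0):
    """Keep only moving objects; discard 'stationary clutter' like signs and buildings."""
    return [r for r in returns if abs(ground_speed(r, ego_speed_mph)) > min_speed_mph]

if __name__ == "__main__":
    ego = 65.0  # our speed, mph
    scene = [
        RadarReturn("overhead sign", -65.0),      # stationary: correctly ignored
        RadarReturn("parked fire truck", -65.0),  # also stationary: ignored too!
        RadarReturn("car ahead", -10.0),          # moving traffic: kept
    ]
    for obj in objects_to_track(scene, ego):
        print("tracking:", obj.label)
    # Prints only "tracking: car ahead" -- the parked fire truck never reaches the planner.
```

Again, that is a cartoon of the idea, not the real thing, but it shows why filtering out everything that isn’t moving is a tweak waiting to go wrong.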
Due to the rise in collisions in Autopilot situations, NHTSA issued new rules in June requiring companies like Tesla to report all incidents involving such systems. By mid-August, concern about these collisions had increased so much that NHTSA opened an investigation. In particular, its investigation is focused on twelve accidents that occurred in nine different states.
The twelfth accident actually happened shortly after the investigation began. (Poor timing, if you ask me.) This crash took place on I-4 in Orlando shortly before 5:00 a.m. A car had broken down in a travel lane, and a highway patrol car was stopped behind the disabled vehicle with its lights flashing. The Tesla hit the police cruiser and narrowly missed hitting the trooper, who had exited his vehicle to approach and render aid to the stranded motorist. Perhaps the Tesla was rubbernecking and not paying attention, leading to the crash.
The NHTSA investigation is focused on Tesla’s advanced driver assistance system (ADAS) and particularly Tesla Models X, Y, S, and 3. A Model 3 Tesla was the one involved in the late August 2021 Orlando crash. To gather information on the problem, NHTSA sent a detailed 11-page letter to Tesla with numerous questions to be answered. An October 22nd deadline for a response was set. If it is determined that the Tesla Autopilot system is unsafe, NHTSA could require the company to recall cars or repair them to correct safety defects. This remediation effort could affect up to 765,000 Teslas built between 2014 and 2021. Who knew there were even that many electric cars out there on the road?
In the meantime, Tesla, in light of its “success” with the Autopilot feature, is moving forward to release a new and even more ambitious version of its Full Self-Driving (FSD) software. Currently that feature is undergoing beta testing, in which the near-finished product is provided to a target group of users to evaluate performance in real-world conditions. Sure, that’s where Tesla needs to work the bugs out: on the road where the rest of us are innocently driving. Sounds like a great plan to me. What could go wrong with that? (See earlier paragraphs regarding crashes with the current system…..)
Much is at stake, such as life and limb, when one gets in a car. I am not convinced that any convenience derived from relying on imperfect autopilot technology to drive me from point A to point B is worth putting my life and health on the line. Admittedly, human beings aren’t perfect drivers either and can make mistakes; however, we can at least usually recognize and attempt to avoid stationary first responders. While Tesla goes back to the drawing board to teach its autopilot program about parked emergency vehicles, I’ll keep my hands on the wheel. Won’t you do so as well?
WONDER-ing Woman:
Have you ever driven an electric car? Would you be comfortable in fully relying on an autopilot system to drive you on the highway? At what point should an autopilot feature be deemed safe enough for use on the road?