Elon Musk promised Tesla owners who bought the so-called “full self-driving” package that they would receive it by the end of September. Electrek reports that hackers have gotten their hands on it early, as in now.
This is alarming on numerous levels. Tesla has been severely limiting even the number of beta testers for this software, which is only intended for use in the US. But a YouTube video shows a hacked car in Ukraine operating in “full self-driving” mode. Ukraine, if you’re not familiar with geography, is not the US.
So far, all that hackers have wanted to do is get their hands on the software and try it for themselves, not alter it for any nefarious ends. But what if someone does? Now that it’s widely known to be out there, hackers with less benevolent intentions can have their way with it. This reminds me of Doctor Who’s “The Sontaran Stratagem” episode, where the Sontarans hacked cars to gas humanity, as well as eliminate key opponents who learned of the plan before it was fully implemented.
Finally, there’s the fact that we’re simply not ready for “full self-driving” cars yet. Teslas are still randomly crashing into police cars while in the currently available Autopilot mode. That’s a rather fundamental flaw, one that should be addressed before giving the software even more control over the car. How do we know that aliens aren’t waiting for the opportunity to crash Teslas into every police car simultaneously, then take over the world? It could happen, especially if the software is that easy to hack.
Further validating the NHTSA investigation into such things, another Tesla flying on Autopilot has crashed into a police car conducting a routine traffic stop, this time in Florida, reports Automotive News.
It may or may not be the fault of Autopilot itself. Either way, it is certainly the driver’s responsibility: he shouldn’t be allowing his car to smash into police cars on the side of the road. Jalopnik makes a convincing argument about how awful Level 2 autonomy systems are, Autopilot being just one of them. They can drive under ordinary circumstances, but it doesn’t take much to confuse them, at which point the human driver is expected to take over immediately. We’re bad at doing this. It’s not that we’re bad drivers (well, some of us are) — it’s just human nature.
At this point, just add one more incident for NHTSA to investigate, and be glad that nobody got hurt — especially the cop who might’ve been standing on the side of the road if the timing had been different.
It’s been a long time coming, but it was crashes with emergency vehicles that prompted the National Highway Traffic Safety Administration to open an investigation into Tesla’s Autopilot semi-autonomous driving feature. Since January 2018, there have been 17 injuries and one death in 11 crashes, all of which involved Teslas operating in Autopilot mode colliding with vehicles at first responder scenes. The investigation covers 765,000 Teslas across all models, from 2014 to 2021.
Tesla’s eyes are getting better, as camera-based sensors continue to replace the ultrasonic sensors previously used. Electrek brings us this video from Tesla hacker Green showing the new camera-based Autopark feature in action. Green has managed to enable this new camera-based software on his older Tesla, even though it’s only supposed to work on Model S cars delivered since June.
What can I say? It works! There’s absolutely no way that ultrasonic sensors could see the painted lines in the open area where Green demonstrates the feature. This makes it possible to park your precious Tesla far, far away from the plebs and their ancient caveman cars that still run on fossil fuels.
Kidding aside, this is pretty cool, and an essential step as Tesla keeps improving its self-driving capabilities. Though I have to wonder how well this will work in the average parking lot, where the paint is worn out, or where the lot has been repainted in a different pattern while the old lines remain faintly visible. Maybe Green should go to his local “dead mall” for his next test.