#tesla This should give you pause. A tech writer who lives in my city got his update, and this was his experience, in his words:

Last weekend, my Tesla Model Y received an over-the-air update to make its driver-assistance software safer. In my first test drive of the updated Tesla, it blew through two stop signs without even slowing down.

In December, Tesla issued its largest-ever recall, affecting almost all of its 2 million cars. The fix works like the software updates you get on your phone, except this one was supposed to prevent drivers from misusing Tesla's Autopilot software. After testing my Tesla update, I don't feel much safer, and neither should you, knowing that this technology is on the same roads you use.

During my drive, the updated Tesla steered itself on urban San Francisco streets Autopilot wasn't designed for. (I was careful to let the tech do its thing only when my hands were hovering by the wheel and I was paying attention.) The recall was supposed to force drivers to pay more attention while using Autopilot by sensing hands on the steering wheel and checking for eyes on the road. Yet my car drove through the city with my hands off the wheel for stretches of a minute or more. I could even activate Autopilot after I placed a sticker over the interior camera used to track my attention.

The underlying issue is that while a government investigation prompted the recall, Tesla got to drive what went into the software update, and it appears not to want to alienate some customers by imposing new limits on its tech. It's a warning about how unprepared we are for an era in which vehicles can seem a lot more like smartphones but are still 4,000-pound speed machines that demand a different level of scrutiny and transparency. I found we have every reason to be skeptical that this recall does much of anything.

Yikes!
Autopilot Safety Recall Review
Summary
The term "autopilot-safety-recall-review" refers to the evaluation and analysis of safety recalls related to autonomous or semi-autonomous driving technologies, such as Tesla's Autopilot system. It highlights concerns about the effectiveness of software updates aimed at addressing safety shortcomings in these systems.
- Understand the limitations: Despite updates, semi-autonomous systems like Tesla's Autopilot may still have safety risks, requiring constant driver attentiveness and intervention.
- Stay informed: Keep track of recall notices, as they often address critical safety concerns that may impact your vehicle's performance.
- Advocate for regulation: Support efforts to push for stricter testing and transparent oversight of autonomous vehicle technologies to improve road safety.
There has been a lot written about the Tesla recall. It is basically a software update, pushed out to vehicle owners, that increases the frequency of alerts when the system thinks your hands are not on the wheel while driving with Autopilot. Unfortunately, it is unlikely to do much to solve the problems with this software.

Tesla already gives frequent alerts when it thinks your hands are off the wheel (often erroneously), starting at about 30 seconds, and escalates to a lockout fairly quickly. But this doesn't really solve the driver-attention problem. People simply are not good passive monitors of automation in general and never will be, even with these nuisance alarms. This has been documented in hundreds of studies over the past 40 years: situation awareness decreases, and the likelihood of detecting and responding correctly to problems goes way down.

All this "fix" will do is make Autopilot even more annoying. People will therefore either not use it or use methods to fool it (several such techniques are already in use). It is a poor band-aid for a fundamental problem with low-reliability automation that requires human vigilance.

NHTSA needs to do much more to ensure that vehicles with autonomous software are designed to promote good performance outcomes and tested for safety before being used on our nation's roadways. That will require Congress to pass legislation addressing this legal gap. https://lnkd.in/g9cXTCdi
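To make the alert-escalation behavior the post describes concrete, here is a minimal sketch of a timer-based nag state machine. Every threshold, state name, and detail is an illustrative assumption for discussion purposes, not Tesla's actual implementation; the post only establishes that warnings start around 30 seconds of perceived hands-off driving and escalate to a lockout.

```python
# Hypothetical sketch of an escalating hands-off-wheel alert policy.
# All thresholds and state names are assumptions, not Tesla's real logic.

from enum import Enum, auto


class AlertState(Enum):
    MONITORING = auto()   # hands detected (or assumed) on the wheel
    VISUAL_NAG = auto()   # dashboard warning shown
    AUDIBLE_NAG = auto()  # chime added to the visual warning
    LOCKOUT = auto()      # Autopilot disengaged for the rest of the drive


# Assumed escalation thresholds, in seconds of continuous hands-off time.
VISUAL_AFTER = 30.0
AUDIBLE_AFTER = 45.0
LOCKOUT_AFTER = 60.0


def escalate(hands_off_seconds: float) -> AlertState:
    """Map continuous hands-off time to an alert level (illustrative only)."""
    if hands_off_seconds >= LOCKOUT_AFTER:
        return AlertState.LOCKOUT
    if hands_off_seconds >= AUDIBLE_AFTER:
        return AlertState.AUDIBLE_NAG
    if hands_off_seconds >= VISUAL_AFTER:
        return AlertState.VISUAL_NAG
    return AlertState.MONITORING


if __name__ == "__main__":
    for t in (10, 35, 50, 70):
        print(t, escalate(t).name)
```

Note what the sketch makes visible: tightening these thresholds only changes how often the driver is nagged. It does nothing about the human-factors finding above, that people are poor passive monitors of automation regardless of alarm frequency.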
As a byproduct of opening a new investigation into Tesla, NHTSA yesterday issued a wrap-up report on its previous investigation, EA22002. Some nuggets from the crashes NHTSA knows about since January 2018:

- Frontal Autopilot crashes: 211, with 13 fatalities
- Low-traction Autopilot crashes: 145
- Crashes where Autosteer was inadvertently cancelled: 111
- FSD-Beta: 75 crashes, 1 fatality
- In 83% of crashes, the hazard was visible for less than 10 seconds before the crash
- Tesla crashes are under-reported to NHTSA by perhaps 82%, due to the emphasis on pyrotechnic deployment as a crash-reporting criterion

The safety issue with Autopilot (and Tesla) in a nutshell: "A comparison of Tesla's design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot's permissive operating capabilities."

Source: https://lnkd.in/eamnbNhn
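The under-reporting figure deserves a moment of arithmetic. Taking the report's "perhaps 82%" at face value, and assuming it applies uniformly across crash types (an assumption, not something the report states), the counts above would be only about 18% of the true totals:

```python
# Back-of-the-envelope check, assuming the ~82% under-reporting estimate
# holds uniformly. If airbag (pyrotechnic) deployment is the reporting
# trigger, NHTSA sees only the remaining ~18% of crashes.

under_reporting_rate = 0.82                    # fraction NHTSA never sees
reported_fraction = 1 - under_reporting_rate   # 0.18

multiplier = 1 / reported_fraction             # ~5.6x
print(f"Implied true-count multiplier: {multiplier:.1f}x")

# Illustration only: scaling the 211 reported frontal Autopilot crashes
# by this multiplier suggests on the order of ~1,170 actual crashes.
reported_frontal = 211
print(f"Implied frontal crashes: {reported_frontal * multiplier:.0f}")
```

The point is not the exact number, which is speculative, but that a reporting criterion tied to airbag deployment can hide most of the safety picture from the regulator.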