Obstacles in Vehicle Automation

Explore top LinkedIn content from expert professionals.

Summary

Vehicle automation faces significant obstacles, including sensor limitations, data accuracy issues, and the complexities of operating in unpredictable real-world environments. These challenges highlight the need for robust technology, comprehensive testing, and realistic expectations for autonomous driving solutions.

  • Use multi-sensor fusion: Combine cameras, LiDAR, radar, and other sensors to address depth perception issues and ensure accurate detection in diverse environmental conditions.
  • Prioritize real-world testing: Conduct autonomous vehicle trials in varied and unpredictable environments, such as construction zones or inclement weather, to improve system reliability.
  • Focus on sensor calibration: Develop software that continuously monitors and recalibrates sensors to compensate for wear, vibration, and other physical changes over time.
Summarized by AI based on LinkedIn member posts
  • View profile for Matt Damasceno

    Head of Commercial Accounts & New Markets | SDV, EV, ADAS | Global Automotive Industry | Connecting Technology, Customers & Market | #ENERGYDM

    51,859 followers

    🚗 𝗫𝗶𝗮𝗼𝗺𝗶 𝗦𝗨𝟳’𝘀 𝗳𝗶𝗿𝘀𝘁 𝗳𝗮𝘁𝗮𝗹 𝗰𝗿𝗮𝘀𝗵: 𝟱 𝘁𝗮𝗸𝗲-𝗮𝘄𝗮𝘆𝘀 and it was barely reported ⚠

    On Mar 29 a base-trim SU7 in Navigate on Autopilot slammed into a construction barrier at ~97 km/h after only 0.8 s of warning—far below NHTSA’s 2.3 s reaction benchmark. 𝗧𝗵𝗿𝗲𝗲 𝗹𝗶𝘃𝗲𝘀 𝘄𝗲𝗿𝗲 𝗹𝗼𝘀𝘁, and 𝘁𝗵𝗲 𝘀𝘁𝗼𝗿𝘆 𝗶𝘀 𝘀𝗵𝗮𝗸𝗶𝗻𝗴 𝗖𝗵𝗶𝗻𝗮’𝘀 𝗵𝘆𝗽𝗲𝗿-𝗳𝗮𝘀𝘁 𝗘𝗩 𝘀𝗰𝗲𝗻𝗲. Here’s why it matters:

    𝗥𝗲𝗮𝗱 𝗳𝘂𝗹𝗹 𝗮𝗿𝘁𝗶𝗰𝗹𝗲 𝗶𝗻 𝗼𝘂𝗿 𝗯𝗹𝗼𝗴: https://lnkd.in/gcFgNRqH

    1- Sensor stack ≠ sensor redundancy
    • Base SU7 runs 11 cameras + radar—no LiDAR.
    • Higher trims add a roof LiDAR & double the NVIDIA Orin compute—but this car didn’t have it.

    2- AEB “blind spot”
    • Xiaomi’s logic brakes hard for moving vehicles, but often filters out stationary obstacles at highway speeds.
    • Result: just a 19 km/h slowdown before impact.

    3- Speed to market vs. validation
    • SU7 hit showrooms < 24 months after the project kicked off.
    • The crash has slashed new orders ~55% & sparked fresh MIIT rules banning “autonomous” marketing buzzwords.

    𝗥𝗲𝗮𝗱 𝗳𝘂𝗹𝗹 𝗮𝗿𝘁𝗶𝗰𝗹𝗲 𝗶𝗻 𝗼𝘂𝗿 𝗯𝗹𝗼𝗴: https://lnkd.in/gcFgNRqH

    4- Global regulators are converging 🌍
    • 🇨🇳 China: OTA driving updates now need pre-approval.
    • 🇺🇸 US: NHTSA is adding 4 new ADAS tests & probing 2.4 M Teslas.
    • 🇪🇺 EU: 2024 Euro NCAP doubles assisted-driving scenarios—stationary-object AEB is mandatory for top scores.

    5- Lesson for the industry 🔁
    • Vision-only stacks are cheaper 🏷 but risk glare, low light & depth errors.
    • Multi-sensor fusion (camera + LiDAR + radar) adds cost but buys critical milliseconds—and lives.

    💡 Bottom line: Level-2 “pilots” are assistants, not chauffeurs. Redundancy, rigorous validation, and honest marketing will separate winners from recalls in the global EV race.

    #ENERGYDM #EVSafety #ADAS #LiDAR #AutonomousDriving #XiaomiSU7 #Xiaomi #EV #SDV #China #automotive #AutomotiveEngineering #ChinaTech #Mobility
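
The “AEB blind spot” in point 2 reflects a well-known trade-off in driver-assistance design: stationary returns are often down-weighted at highway speed to avoid phantom braking on overhead signs and road furniture. The minimal Python sketch below is purely illustrative (it is not Xiaomi’s logic; `RadarTarget`, `should_brake`, and every threshold are hypothetical) and only shows how such a filter can let a real barrier through.

```python
# Illustrative sketch (not Xiaomi's actual code): why a common AEB heuristic
# can miss stationary obstacles at highway speed. Bridges, signs, and manhole
# covers are stationary radar returns too, so many systems suppress
# low-confidence static targets above a speed threshold to avoid false braking.

from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float           # distance to target
    ground_speed_mps: float  # target speed over ground (0 for a barrier)
    confidence: float        # fused detection confidence, 0..1

def should_brake(target: RadarTarget, ego_speed_mps: float) -> bool:
    """Naive AEB trigger with a stationary-object filter."""
    HIGHWAY_SPEED = 22.0     # ~80 km/h
    STATIONARY = 0.5         # slower than this is treated as static clutter
    if ego_speed_mps > HIGHWAY_SPEED and abs(target.ground_speed_mps) < STATIONARY:
        # Static returns are filtered at highway speed unless confidence is
        # very high -- this is the "blind spot" for construction barriers.
        return target.confidence > 0.95
    # Moving targets: brake if time-to-collision is short.
    closing_speed = ego_speed_mps - target.ground_speed_mps
    ttc = target.range_m / max(closing_speed, 0.1)
    return ttc < 2.0

# A barrier at 60 m, ego at ~97 km/h (~27 m/s): filtered out unless the vision
# stack is highly confident, so the car barely slows before impact.
print(should_brake(RadarTarget(60.0, 0.0, 0.6), 27.0))   # False
```

An independent depth sensor such as LiDAR provides a high-confidence static return in exactly this case, which is the redundancy argument the post closes on.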

  • View profile for Philip Koopman

    Embedded Systems & Embodied AI Safety. Helping teams take the next step for software quality and safety. (Emeritus)

    32,618 followers

    There is more to driving than just moving the vehicle down the road, and those other parts need to be addressed by autonomous vehicle makers. This article talks about Kodiak addressing the vehicle inspection process, and it seems they are doing well.

    From the article:

    “the traditional roadside inspection regime is a problematic fit for autonomous trucks.”

    “Most importantly, traditional Commercial Vehicle Safety Alliance (CVSA) North American Standard Level I Inspections require cooperation between drivers and enforcement during the inspection process — drivers are responsible for testing indicator lights, stepping on brake pedals, and otherwise demonstrating that key truck safety systems are operating properly. Conducting such an inspection without a driver in the cab raises significant challenges for fleets and law enforcement alike.”

    “Once an autonomous truck is on the road, it must have the capacity to digitally communicate a Safety Data Message Set, which includes the outcome of the Enhanced Inspection as well as other relevant safety information, to roadside enforcement officers at inspection sites.”

    “While officers will retain the authority to pull over a truck should they have probable cause, vehicles participating in the Enhanced Inspection program will receive bypasses for routine inspections.”

    https://lnkd.in/g3yijNFN
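
The quoted “Safety Data Message Set” is described only at a high level, and the actual CVSA/Kodiak format is not public here. As a hedged sketch of the idea, the hypothetical structure below shows the kind of fields such a digital message might carry (inspection outcome plus other safety information) so roadside officers could grant a bypass without a driver in the cab; every field name is an assumption.

```python
# Hypothetical sketch of a digital "Safety Data Message Set" an autonomous
# truck might transmit to roadside enforcement. The real message format is not
# public here; all fields below are assumptions for illustration only.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class SafetyDataMessage:
    carrier_id: str
    vehicle_id: str
    enhanced_inspection_passed: bool   # outcome of the pre-trip Enhanced Inspection
    inspection_timestamp_utc: str
    brake_system_ok: bool              # examples of "other relevant safety information"
    lighting_ok: bool
    active_faults: list[str]

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

msg = SafetyDataMessage(
    carrier_id="CARRIER-001",
    vehicle_id="TRUCK-42",
    enhanced_inspection_passed=True,
    inspection_timestamp_utc=datetime.now(timezone.utc).isoformat(),
    brake_system_ok=True,
    lighting_ok=True,
    active_faults=[],
)
print(msg.to_json())  # what an inspection site would receive, in lieu of a driver
```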

  • View profile for Gervais T. Mbunkeu, M.Eng, M.Sc, CCSK, CIPM

    Global Data Risk and Privacy Leader @ PwC | 🚀 Experienced Engineer | Cybersecurity, Privacy & CAV Policy | AI Enthusiast | Building Trust in a Connected World 🌐

    2,138 followers

    The Self-Driving Car Fantasy: A Dangerous Oversimplification of Road Safety

    This Washington Post Editorial Board’s piece on self-driving cars reads less like a measured assessment and more like a tech industry press release. While the human toll of traffic crashes is tragic and demands urgent action, the answer is not to rush an immature technology onto the roads under the illusion that machines will miraculously outperform humans. Let’s cut through the hype:

    🚗 The Real Picture: Comparing Apples to Oranges
    Before we start rewriting traffic laws to accommodate robotaxis, let’s acknowledge some fundamental differences in how human drivers and AVs are evaluated:

    • 🚶‍♂️ Humans drive in all conditions; AVs do not.
      • Waymo’s much-touted safety statistics come exclusively from controlled environments: well-mapped, predictable, and often warm-weather cities like Phoenix and San Francisco.
      • The vast majority of human drivers have experience in rain, snow, ice, fog, rural roads, construction zones, and unpredictable conditions—AVs do not.
    • 📏 The mileage comparison is wildly misleading.
      • U.S. drivers log roughly 3.2 trillion miles per year. Waymo, the most experienced AV company, has managed 33 million miles total—or about 0.001% of annual human driving.
      • No self-driving system has even approached 100 million miles without a fatality, the baseline required to compare against human crash statistics.
    • 🚦 AVs don’t have to make real-world driving decisions.
      • No self-driving car has successfully navigated complex human-driving environments like aggressive merging in dense traffic, anticipating pedestrian behavior in the dark, or making judgment calls in emergency scenarios.
      • AVs also fail to recognize social driving norms—things like when to cautiously inch through a blocked intersection or yield informally in heavy congestion.

    🚨 Ignoring the Real Safety Issues
    The editorial conveniently brushes aside the fact that AVs still struggle with basic safety failures:
    • Waymo and Cruise vehicles freeze in traffic, stopping unpredictably and blocking emergency responders.
    • AVs still fail to detect and react to unpredictable hazards—as seen in Cruise’s horrifying incident where it dragged a pedestrian.
    • When AVs do crash, they lack the instincts of a human driver to minimize impact or avoid secondary collisions.

    Let’s Focus on What Actually Works
    If we genuinely care about reducing traffic deaths, there are proven, immediate solutions available:
    ✔️ Lower speed limits and road designs that prioritize safety over speed.
    ✔️ Widespread adoption of ADAS, like automatic emergency braking and lane-keeping assist.

    The rush to remove humans from the driver’s seat isn’t a safety revolution—it’s a dangerous gamble.

    Missy Cummings Michael DeKort David Beck

    #selfdrivingcars #ai #ml #fsd #waymo https://lnkd.in/eDX3SyJN
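
The mileage gap in the post can be checked with quick arithmetic: 33 million Waymo miles against roughly 3.2 trillion annual U.S. vehicle miles is about one thousandth of one percent, matching the ~0.001% figure quoted. A few lines of Python make the comparison explicit (figures are the ones stated in the post, not independently verified here).

```python
# Quick check of the mileage comparison quoted in the post
# (figures as stated there, not independently verified).
us_annual_miles = 3.2e12    # ~3.2 trillion miles driven per year in the U.S.
waymo_total_miles = 33e6    # ~33 million total autonomous miles

share = waymo_total_miles / us_annual_miles
print(f"{share:.6%}")       # ~0.001031% of one year of U.S. human driving

# The post's 100-million-mile benchmark: roughly one human-driven fatality per
# ~100 million miles, so an AV fleet needs at least that much exposure before
# fatality rates can be compared at all.
print(waymo_total_miles / 100e6)   # ~0.33 of the way to that baseline
```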

  • View profile for John Hayes

    President at Sensei

    2,973 followers

    Today’s cars are built to last many thousands of miles in all types of environmental conditions without requiring in-person professional services. Cars with advanced autonomous capabilities will need to be this resilient to gain mass consumer adoption. The challenge is that the precision and reliability required for AVs to execute complex perception and planning tasks are extremely difficult to achieve with hardware alone, as evidenced by the camera calibration challenges Teslas are currently experiencing.

    This recall underlines just how critical software is to building attention-free autonomous driving into consumer vehicles. These systems must be designed with the assumption that cameras (and all sensors) can — and will — produce imperfect signals at times due to the effects of constant motion, vibration, accidents, and more. A software layer is required to continuously test and recalibrate key sensors in real time to compensate for continuous physical changes to the hardware.

    To make AV capability both safe and durable enough for mass adoption and use, online sensor calibration is one of the essential ingredients required to bridge the gap between L2 driver assistance and an L4 attention-free driving system in consumer vehicles. https://lnkd.in/gfxRdhWP
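
One way to picture the “software layer” described above is a background monitor that tracks a camera’s drift metric (for example, average reprojection error against references such as lane geometry or other calibrated sensors) and triggers recalibration when the drift stays out of tolerance. The sketch below is a generic illustration under those assumptions, not Tesla’s or any vendor’s actual implementation; the thresholds, error stream, and `recalibrate()` are placeholders.

```python
# Minimal sketch of online camera-calibration monitoring: track a drift metric
# (average reprojection error, in pixels) and trigger recalibration when it
# stays out of tolerance. Generic illustration only -- thresholds, the error
# stream, and recalibrate() are placeholders, not any vendor's pipeline.

import random
from collections import deque

class CalibrationMonitor:
    def __init__(self, threshold_px: float = 2.0, window: int = 100):
        self.errors = deque(maxlen=window)   # recent reprojection errors (pixels)
        self.threshold_px = threshold_px

    def update(self, reprojection_error_px: float) -> bool:
        """Record one measurement; return True if sustained drift is detected."""
        self.errors.append(reprojection_error_px)
        if len(self.errors) < self.errors.maxlen:
            return False                     # not enough evidence yet
        mean_err = sum(self.errors) / len(self.errors)
        return mean_err > self.threshold_px  # sustained drift, not a single outlier

def recalibrate() -> None:
    # Placeholder: a real system would re-estimate camera extrinsics online,
    # e.g. from lane geometry, the ground plane, or cross-sensor consistency.
    print("recalibrating camera extrinsics...")

def simulated_error_stream(n: int = 1000):
    # Simulated per-frame reprojection error: the mount slowly drifts after frame 500.
    for i in range(n):
        drift = 0.0 if i < 500 else (i - 500) * 0.01
        yield 1.0 + drift + random.uniform(-0.2, 0.2)

monitor = CalibrationMonitor()
for frame, err in enumerate(simulated_error_stream()):
    if monitor.update(err):
        print(f"drift detected at frame {frame}")
        recalibrate()
        break
```

Using a windowed mean rather than a single frame keeps the monitor from reacting to one-off glitches while still catching the slow mechanical drift the post describes.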

  • View profile for Nicholas Nouri

    Founder | APAC Entrepreneur of the year | Author | AI Global talent awardee | Data Science Wizard

    130,946 followers

    Is a Camera-Only Robotaxi Really Feasible?

    Elon Musk has long promised a completely driverless Tesla Robotaxi. The catch? Tesla is betting on cameras alone - no LiDAR (Light Detection and Ranging). Meanwhile, most industry leaders (like Waymo and Zoox) use a blend of LiDAR, radar, and cameras to construct a precise 3D map of the world around the car.

    Why does this sensor mix matter?
    - LiDAR generates detailed depth information, helping vehicles “see” distance and shape more accurately - even in less-than-ideal lighting or weather.
    - Radar and other sensors add extra layers of protection, especially when visibility is poor.
    - Camera-only systems can struggle in certain conditions - like heavy rain or fog - where they might misinterpret or simply not detect obstacles.

    Most fully autonomous services today operate within carefully mapped areas - known as “geofencing.” Tesla’s ambition is to deploy Full Self-Driving across all roads, which raises big questions about reliability and safety if they continue relying solely on cameras. Some industry figures have openly criticized Tesla’s approach, labeling it unrealistic and pointing to crash data as evidence that a single-sensor strategy may be risky.

    It’s also worth noting that no single sensor type is perfect in every scenario. Even LiDAR, for all its benefits, has limitations where radar or infrared cameras might perform better. Sensor fusion - combining multiple sensor inputs - can offer the most robust picture of the driving environment.

    Another critical factor? Maintenance and routine inspections. As self-driving systems become more complex, ensuring they’re checked and certified regularly could be a crucial step in keeping our roads safe. After all, even the most sophisticated technology won’t matter if it isn’t kept in top condition, given all the electrical components involved.

    Could Tesla eventually pivot if LiDAR becomes cheaper and more efficient? That remains to be seen. But one thing is clear: the road to fully autonomous vehicles is about more than just cutting costs - it’s about finding a sensor strategy that consistently delivers safety and reliability in the real world.

    #innovation #technology #future #management #startups
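
As a toy illustration of the fusion argument in the post above, the sketch below combines independent range estimates from a camera, a radar, and a LiDAR by inverse-variance weighting, so the noisiest sensor (for example, a camera in fog) contributes least. It is a simplified, hypothetical example with made-up noise figures; production stacks fuse full detections and tracks, not single range numbers.

```python
# Toy sensor-fusion example: combine range estimates from camera, radar, and
# LiDAR with inverse-variance weighting. Noise values are illustrative
# assumptions, not measured sensor specs.

def fuse_ranges(estimates):
    """estimates: list of (range_m, std_dev_m). Returns fused range and std dev."""
    weights = [1.0 / (sigma ** 2) for _, sigma in estimates]
    fused = sum(w * r for w, (r, _) in zip(weights, estimates)) / sum(weights)
    fused_sigma = (1.0 / sum(weights)) ** 0.5
    return fused, fused_sigma

# Obstacle at ~50 m in fog: the camera's depth estimate is biased and noisy,
# while radar and LiDAR still see it clearly.
camera = (58.0, 8.0)   # monocular depth struggles in fog
radar  = (50.5, 1.0)
lidar  = (49.8, 0.3)

fused_range, fused_sigma = fuse_ranges([camera, radar, lidar])
print(f"camera alone: {camera[0]:.1f} m")
print(f"fused:        {fused_range:.1f} +/- {fused_sigma:.2f} m")
# The fused estimate lands near the LiDAR/radar values because the camera's
# high variance gives it a small weight -- the redundancy argument in one number.
```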

  • View profile for Jason Corso

    Toyota Professor of AI at Michigan | Voxel51 Co-Founder and Chief Scientist | Creator, Builder, Writer, Coder, Human

    18,672 followers

    🚗 🚗 🚗 Data Blind Spots and Their Impact in Automotive Visual AI 🚗 🚗 🚗

    Excited to share my latest article in WardsAuto on the critical challenge of achieving 99.999% accuracy in automotive visual AI (link below). Drawing from both my academic research at University of Michigan College of Engineering and industry experience with Voxel51, I explore why data blind spots are the main obstacle holding back autonomous vehicle development.

    Data blind spots --- critical gaps in datasets that occur in practice but are out of domain for the model --- prevent AI models from achieving the level of robustness necessary to deliver real systems to production. I bring this to light with a real case study involving phantom potholes and how unified data visibility helped solve it. The key: bringing teams, data, and models together to rapidly identify and fix issues.

    Read the full piece to learn how we can cut the 80% AI project failure rate and make road transportation safer: https://lnkd.in/df8Gf-xd
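
A common way to surface the “data blind spots” the article describes is to embed production samples and training samples in the same feature space and flag production samples that fall far from anything seen in training. The sketch below is a generic embedding-distance approach, not the Voxel51/FiftyOne workflow from the article; the random embeddings and the percentile threshold are placeholder assumptions standing in for real model features.

```python
# Generic sketch of data blind-spot detection: flag incoming samples whose
# embeddings are far from every training embedding (out-of-domain candidates).
# Simplified stand-in, not the Voxel51/FiftyOne workflow; embeddings would
# normally come from a vision backbone, not random vectors.

import numpy as np

rng = np.random.default_rng(0)
train_embeddings = rng.normal(0.0, 1.0, size=(2000, 128))  # placeholder training embeddings
incoming = rng.normal(0.0, 1.0, size=(100, 128))           # placeholder production frames
incoming[:5] += 6.0   # simulate a few out-of-domain frames (e.g., phantom potholes)

def nearest_train_distance(batch, train):
    # ||a-b||^2 = ||a||^2 + ||b||^2 - 2 a.b, computed without a huge 3-D intermediate.
    d2 = (
        (batch ** 2).sum(axis=1, keepdims=True)
        + (train ** 2).sum(axis=1)[None, :]
        - 2.0 * batch @ train.T
    )
    return np.sqrt(np.clip(d2.min(axis=1), 0.0, None))

# Calibrate a threshold on held-out training data, then flag production frames
# that sit farther from the training set than 99% of held-out samples do.
reference, calibration = train_embeddings[:1800], train_embeddings[1800:]
threshold = np.percentile(nearest_train_distance(calibration, reference), 99)

dists = nearest_train_distance(incoming, reference)
blind_spot_candidates = np.where(dists > threshold)[0]
print("frames to review:", blind_spot_candidates)   # should include the shifted ones
```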
