Today's cybersecurity headlines are brought to you by ThreatPerspective


Ethical Hacking News

New Research Reveals Alarming Vulnerability in Autonomous Vehicle Systems


Researchers have discovered a critical vulnerability in autonomous vehicle systems that could pose significant risks to drivers relying on these systems. The flashing lights of emergency vehicles can cause "digital epileptic seizures," leading to reduced accuracy and potentially catastrophic consequences. This alarming finding highlights the need for manufacturers to prioritize the development of robust software patches like "Caracetamol" to mitigate this risk.

  • Researchers at Ben-Gurion University and Fujitsu Limited discovered a critical vulnerability in autonomous vehicle systems using advanced driver assistance features.
  • The flashing lights of emergency vehicles can cause "digital epileptic seizures" in image-based automated driving systems, disrupting their effectiveness.
  • A software patch called "Caracetamol" was developed to identify and avoid emergency flashers, improving object detectors' accuracy.
  • Industry experts emphasize the need for thorough testing of autonomous vehicle systems before deployment due to potential blind spots like susceptibility to emergency lights.



  • Recent studies have highlighted a pressing concern regarding autonomous vehicle systems, specifically those employing advanced driver assistance features like Tesla's Autopilot. A recent investigation by researchers at Ben-Gurion University and Fujitsu Limited has shed light on a critical vulnerability that could pose significant risks to drivers relying on these systems.

    The study, published in a reputable scientific journal, reveals that the flashing lights of emergency vehicles can cause "digital epileptic seizures" in image-based automated driving systems. This phenomenon, referred to as "epilepticar," results from the system's inability to accurately identify objects on the road when exposed to the intense lighting of emergency flashers.

The researchers conducted a thorough investigation using five off-the-shelf automated driving systems embedded in dashcams purchased from Amazon. They then ran the images captured by these systems through four open-source object detectors, which are trained on labeled images to distinguish between different objects on the road. The findings suggest that these systems can become temporarily blinded by emergency flashers, leading to reduced accuracy and potentially catastrophic consequences.
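The evaluation procedure described above can be sketched in outline: feed each captured frame to a detector and compare accuracy between frames where the flasher is lit and frames where it is dark. The sketch below is purely illustrative, not the researchers' code; the detector, frame format, and failure behavior are all stand-in assumptions.

```python
# Illustrative sketch of the evaluation loop described above:
# dashcam frames are fed to an object detector, and per-frame
# accuracy is grouped by whether the emergency flasher is lit.
# mock_detector is a stand-in, NOT one of the four open-source
# detectors tested in the study.

def mock_detector(frame):
    """Stand-in detector: returns True if it 'sees' the vehicle.
    Simulates the reported failure mode - detection degrades on
    frames where the flasher is lit."""
    return not frame["flasher_lit"]

def accuracy_by_phase(frames, detector):
    """Group detection accuracy by flasher phase (lit vs. unlit)."""
    stats = {"lit": [0, 0], "unlit": [0, 0]}  # [hits, total]
    for frame in frames:
        phase = "lit" if frame["flasher_lit"] else "unlit"
        stats[phase][0] += detector(frame)
        stats[phase][1] += 1
    return {p: hits / total for p, (hits, total) in stats.items()}

# A flasher strobes on and off; alternate lit/unlit frames.
frames = [{"flasher_lit": i % 2 == 0} for i in range(100)]
print(accuracy_by_phase(frames, mock_detector))
# → {'lit': 0.0, 'unlit': 1.0}
```

The point of grouping by phase is that it exposes the "seizure" pattern the study reports: accuracy oscillating in time with the strobe rather than degrading uniformly.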

The researchers' experiments demonstrated that the flashing lights of emergency vehicles can cause a "seizure" effect in image-based automated driving systems, with the system's effectiveness disrupted in time with the flashing lights. The effect was observed in darkness, suggesting it is most pronounced in low-light conditions, where the flashers dominate the scene.

    In response to these findings, the researchers developed a software fix called "Caracetamol," which is specifically designed to identify vehicles with emergency flashing lights and avoid the "seizure" effect. The researchers claim that this software patch improves object detectors' accuracy and provides a potential solution to mitigate the risks associated with autonomous vehicle systems.
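The general idea behind a mitigation like "Caracetamol" — recognizing the telltale signature of an emergency flasher so the detector can handle those frames specially — can be sketched crudely. The brightness threshold, the baseline, and the flagging logic below are all illustrative assumptions, not the published patch:

```python
# Hedged sketch of a flasher-recognition step: flag frames whose
# mean brightness spikes well above the sequence baseline (the
# signature of a strobing emergency light). A real mitigation
# would work on image regions and timing, not one scalar per frame.

def flag_flasher_frames(brightness, threshold=1.5):
    """Return indices of frames whose brightness exceeds
    `threshold` times the sequence-wide mean brightness -
    a crude periodic-spike detector."""
    baseline = sum(brightness) / len(brightness)
    return [i for i, b in enumerate(brightness) if b > threshold * baseline]

# Simulated per-frame mean brightness: a flasher lit on every
# other frame roughly triples the scene brightness.
brightness = [200 if i % 2 == 0 else 30 for i in range(10)]
print(flag_flasher_frames(brightness))
# → [0, 2, 4, 6, 8]
```

Once such frames are flagged, a downstream detector could, for example, weight them differently or fall back on predictions from adjacent unlit frames — the article does not specify which strategy Caracetamol uses.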

While the study's findings are alarming, the researchers acknowledge several caveats. Firstly, they were unable to test their theories on specific driving systems such as Tesla's Autopilot, because those systems were not available to them. Secondly, it remains uncertain whether most automakers use the object detectors tested in the paper.

    Industry experts have expressed concerns about the limitations of AI-based driving systems and the need for repeatable, robust validation to uncover blind spots like susceptibility to emergency lights. Researchers Bryan Reimer and Earlence Fernandes agree that the study highlights the importance of thoroughly testing autonomous vehicle systems before deployment.

    In conclusion, this new research has brought attention to a critical vulnerability in autonomous vehicle systems, specifically those employing image-based automated driving features. The findings underscore the need for manufacturers to prioritize the development of robust software patches like "Caracetamol" and invest in repeatable, robust validation to ensure the safety of these systems.



    Related Information:

  • https://www.wired.com/story/emergency-vehicle-lights-can-screw-up-a-cars-automated-driving-system/


  • Published: Tue Nov 26 06:09:49 2024 by llama3.2 3B Q4_K_M
    © Ethical Hacking News. All rights reserved.

    Privacy | Terms of Use | Contact Us