The U.S. National Highway Traffic Safety Administration said on Tuesday that it has opened an investigation into roughly 2.6 million Tesla vehicles following reports of crashes linked to the company’s “Actually Smart Summon” remote driving feature. The agency’s Office of Defects Investigation said it received one complaint describing a crash involving the feature and reviewed three media reports of similar incidents. According to the regulator, the Tesla vehicles involved in the collisions failed to detect obstacles such as posts or parked vehicles while Actually Smart Summon was in use.
The Office of Defects Investigation stated, “The ODI is aware of multiple crash allegations, involving both Smart Summon and Actually Smart Summon, where the user had too little reaction time to avoid a crash, either with the available line of sight or releasing the phone app button, which stops the vehicle’s movement.” Actually Smart Summon, introduced in September, lets users remotely drive their vehicle to them or to another location using a smartphone app. An earlier version, dubbed Dumb Summon, let users move the car forward or backward within a parking space.
The National Highway Traffic Safety Administration said the investigation will evaluate several aspects of Actually Smart Summon, including its maximum speed, its use on public roads, and its line-of-sight requirements. The probe will also examine remote control through the phone app, potential connectivity delays, and the system’s performance in situations the user did not foresee. Tesla shares fell 1.6% in premarket trading; the company did not immediately respond to a request for comment.
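The agency’s focus on connectivity delays and the phone-app “hold to move” button describes, in effect, a dead-man switch whose safety margin shrinks when the network lags. The sketch below is a purely illustrative Python model of that idea, not Tesla’s implementation: the class name, methods, and 500 ms timeout are assumptions chosen only to show how a stalled connection can be treated like a released button.

```python
import time

# A minimal, hypothetical sketch of a "hold-to-move" dead-man switch with a
# heartbeat timeout; all names and timing values here are illustrative
# assumptions, not Tesla's actual software.

HEARTBEAT_TIMEOUT_S = 0.5  # assumed: stop if no "button held" signal arrives within 500 ms


class RemoteSummonSketch:
    def __init__(self):
        self._last_heartbeat = None
        self._moving = False

    def on_button_heartbeat(self, timestamp):
        """Called repeatedly while the user holds the button in the phone app."""
        self._last_heartbeat = timestamp
        self._moving = True

    def on_button_released(self):
        """Releasing the button stops movement immediately."""
        self._moving = False

    def tick(self, now):
        """Periodic safety check: also stop if heartbeats stall (e.g. a connectivity delay)."""
        if self._moving and self._last_heartbeat is not None:
            if now - self._last_heartbeat > HEARTBEAT_TIMEOUT_S:
                self._moving = False  # treat a stalled connection like a release
        return self._moving


if __name__ == "__main__":
    ctrl = RemoteSummonSketch()
    ctrl.on_button_heartbeat(time.time())
    time.sleep(0.6)                 # simulate a network stall longer than the timeout
    print(ctrl.tick(time.time()))   # False: the stalled heartbeat halts the sketch's "vehicle"
```

In this sketch, a lost or lagging connection is handled exactly like a released button, which is one way a design can bound the effect of the latency the regulator is examining.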
The inquiry is the traffic safety regulator’s second major investigation of Tesla in roughly four months, both focused on the automaker’s automated driving features. In October, the agency opened a probe into 2.4 million Tesla vehicles equipped with Full Self-Driving (FSD) software after four reported collisions, including a fatal crash in 2023. The heightened scrutiny of Tesla’s advanced driver assistance systems comes as CEO Elon Musk pivots the company toward self-driving technology and robotaxis.
The investigation underscores the regulatory scrutiny facing automated driving features in the wake of reported incidents involving Tesla vehicles. As the agency examines the functionality and potential risks of Tesla’s systems, its findings will shape any further regulatory measures aimed at ensuring the safety and reliability of such features across the industry.