Tesla’s Autopilot system—which, contrary to its name, does not enable the car to drive itself—has been involved in yet another accident. This time, a 2019 Tesla Model 3 with Autopilot activated crashed into two parked cars on the side of a highway near downtown Orlando, Florida.
Early Saturday morning, a 27-year-old Model 3 driver crashed into a parked Florida Highway Patrol car, the Associated Press reported. The trooper had stopped to help another driver who was having trouble with their vehicle when the Model 3 ran into the cruiser, narrowly missing the trooper, who had stepped out of his car. The Model 3 then struck the other parked vehicle.
Fortunately, there were no fatalities. The 27-year-old Model 3 driver and the driver of the disabled vehicle sustained minor injuries, while the trooper was unhurt, according to the AP.
Officials are still investigating the cause of the crash. CNBC points out that it has not yet been determined whether Tesla’s Autopilot caused or contributed to the accident.
Gizmodo reached out to Tesla for comment on Saturday but did not receive a response by the time of publication. Considering that Tesla disbanded its public relations team last year, it’s unlikely we’ll get a response, but we’ll make sure to update this blog if we do.
The latest crash involving Tesla’s Autopilot comes nearly two weeks after the National Highway Traffic Safety Administration opened an investigation into the company’s driver-assistance system. The probe focuses on 11 incidents dating back to 2018 in which Tesla cars with Autopilot or cruise control activated crashed into parked emergency vehicles. Those incidents resulted in 17 injuries and one death.
The NHTSA’s investigation will cover Tesla cars manufactured between 2014 and 2021, including the Tesla Model Y, Model X, Model S, and Model 3, which total roughly 756,000 vehicles.
Although the agency is only investigating crashes involving emergency vehicles, Tesla’s Autopilot system has been involved in numerous incidents in which drivers weren’t giving the road their full attention. Some drivers have been found drunk and asleep at the wheel. Others have crashed because they were looking at their phones.
Besides the NHTSA, Tesla may soon face scrutiny from another agency. Shortly after the NHTSA revealed its investigation, Democratic Sens. Richard Blumenthal and Ed Markey asked Federal Trade Commission Chairwoman Lina Khan to look into the company’s “potentially deceptive and unfair” marketing and advertising practices for its driving automation systems.
In their letter, the senators rightly point out that Tesla’s Autopilot and Full Self-Driving features are only partially automated and that there are no vehicles on the market that can drive themselves at this time.
“Understanding these limitations is essential, for when drivers’ expectations exceed their vehicle’s capabilities, serious and fatal accidents can and do result,” Blumenthal and Markey wrote.