The safety of Tesla Autopilot has been called into question after a third fatal motorcycle crash this summer

Washington, DC

Tesla’s Autopilot was involved in a third fatal motorcycle crash this summer, raising questions about the driver assistance system’s ability to operate safely.

The National Highway Traffic Safety Administration has already begun investigations into the first two accidents and has gathered information about the third accident. More details of the latest crash emerged on Monday.

The three fatal crashes occurred within 51 days this summer and follow a similar pattern: a Tesla driver with Autopilot active struck a motorcycle in the early morning hours.

The accidents renew questions about whether the system’s users are prepared to take full control of the vehicle when needed. Studies have shown that drivers look away from the road more while using Autopilot, and many Autopilot users believe their cars are driving themselves.

Tesla’s Autopilot system keeps the vehicle in its lane while traveling at a set speed, and drivers are instructed to keep their hands on the wheel at all times. The automaker says it detects torque on the steering wheel and uses a camera near the rearview mirror to monitor for driver inattention, issuing alerts to remind drivers to keep their eyes on the road.

Ingrid Eva Noon was riding her motorcycle in Palm Beach County, Florida, at 2:11 a.m. on Aug. 26 when an impaired driver using Tesla’s Autopilot struck the back of Noon’s motorcycle, throwing her onto the Tesla’s windshield and killing her, according to the Palm Beach County Sheriff’s Office. Driver-assistance crash data, which automakers such as Tesla are required to report to NHTSA, was released Monday and revealed that Autopilot was active.

Utah resident Landon Embry was killed at approximately 1:09 a.m. on July 24 when a Tesla driver using Autopilot collided with the back of the Harley-Davidson he was riding.

A Tesla driver using Autopilot hit a motorcycle lying in the road at 4:47 a.m. on July 7 in Riverside, California. The motorcyclist, who had been thrown from his bike after striking a dividing wall, died, according to the California Highway Patrol. The Tesla struck the motorcycle but not the rider, who had already been thrown clear, the agency said.

The recent crashes suggest Tesla’s system is falling short, according to motorcycle advocates.

Motorcycle safety advocates say they are concerned that the software does not see motorcycles and that it lulls Tesla drivers into complacency and distraction. Advocates say the government’s vehicle safety rules do not adequately protect motorcycle riders and that measures should be taken to better protect them, including testing whether driver assistance systems such as Autopilot can detect motorcycles.

“For a long time motorcyclists were told by reckless drivers who cause accidents: ‘Sorry, I didn’t see you.’ Now we’re hearing, ‘Sorry, my car didn’t see you.’ This is unacceptable,” said Rob Dingman, president and CEO of the American Motorcyclist Association.

“If it can’t see a motorcycle, can it see a pedestrian? Can it see a small child? Can it see an animal?” said Eric Stine, treasurer of the Utah chapter of ABATE, a motorcyclist advocacy group.

NHTSA said Monday that no commercially available vehicles can currently drive themselves and encouraged drivers to use assistive technologies appropriately.

“Some advanced driver assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that do occur, but as with all motor vehicle technology and equipment, drivers must use them correctly and responsibly,” NHTSA said.

Tesla fans came to the automaker’s defense this summer after a prominent critic released a video showing one of its cars, with driver-assistance technology engaged, driving into child-sized mannequins.

Tesla did not respond to a request for comment for this story.

Tesla is not alone in facing the apparent challenge of identifying objects at night.

The Insurance Institute for Highway Safety found that 19 of 23 vehicles tested for pedestrian detection achieved a “superior” or “advanced” rating during the day, but only four received a “superior” rating at night. More than half earned a basic score or no credit at all.

Visibility is a challenge for humans and machines at night, as there is less light reflecting off things on the road. Tesla warns in its vehicle owner’s manuals that many factors can affect Autopilot’s performance, including poor visibility.

“Never depend on [these components] to keep you safe,” Tesla says. “It is the driver’s responsibility to stay alert, drive safely and control the vehicle at all times.”

Tesla has said it relies on cameras to detect objects on the road and to determine whether a driver has their eyes on the road. Tesla competitors General Motors and Ford use infrared cameras in their vehicles, which perform better in low light, making it easier to see the driver’s face and detect distracted driving in dark conditions.

The American Motorcyclist Association says driver assistance technology that reliably detects motorcycles could prevent accidents. The association has asked the National Highway Traffic Safety Administration for years to test motorcycle detection as it evaluates the safety of new vehicles, including their driver assistance technologies. (NHTSA has not commented on why it does not.) European vehicle safety programs already test whether driver assistance systems can identify motorcycles.

The American Motorcyclist Association has warned for years about the dangers of emerging driving technologies failing to properly detect motorcyclists.

“If this problem is not addressed early in the development of automated vehicles,” the association wrote to NHTSA last year, “the consequences for motorcyclists will be disastrous.”