Tesla owners who drive for Uber and Lyft with “Full Self-Driving” enabled operate like gray-market autonomous taxis, as a new report from Reuters details. That’s a problem when one of these not-actually-self-driving cars crashes, as Justin Yoon’s allegedly did.

Reuters broke the story of Yoon’s car allegedly getting into a collision with another car while he was driving for a ride-share service. The accident was ruled to be the other vehicle’s fault, not Yoon’s, but it was Yoon himself who eventually steered to mitigate the impact. Both Yoon and his passenger sustained minor injuries.

Reuters spoke to 11 other drivers who said they used the system while driving for Uber or Lyft. The drivers said it reduced fatigue and stress behind the wheel, allowing them to work more hours and earn more money. I am editorializing here, but I find it far more stressful to monitor an inept robot driver that I’m legally responsible for than to drive myself.

Tesla’s inaccurately named “Full Self-Driving” (FSD) system is a supervised system, meaning that the driver remains fully responsible for any collisions while the system is engaged. That’s an entirely different category of product from the true driverless systems operated by companies like Waymo, which face strict regulations and bear responsibility for collisions caused by their products. Tesla says that drivers must pay attention while using FSD and be ready to intervene.

Uber and Lyft both responded to Reuters’ report. While neither company seemed to directly endorse drivers using Tesla FSD, neither expressed direct disapproval nor banned the practice.

Uber told Reuters: “Drivers are expected to maintain an environment that makes riders feel safe; even if driving practices don’t violate the law.”