Uber’s just-released U.S. Safety Report sets forth in some detail the number of fatal accidents involving its service, and the good news is that the overall rate per mile is about half the national average. But the report makes some puzzling choices about what it includes and excludes.
To create the report, Uber took its internal reports of crashes, generated by drivers, users, or insurance companies, and compared them to the national Fatality Analysis Reporting System, or FARS, a database that tracks all automotive deaths. In this way Uber was able to confirm 97 fatal crashes, with 107 total deaths, in 2017 and 2018 combined.
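Uber hasn’t published its methodology in code, but conceptually the confirmation step amounts to cross-referencing two datasets. A minimal sketch of that idea in Python, with entirely hypothetical record fields:

```python
# Illustrative only: Uber has not released its matching process, and these
# field names are hypothetical. The idea is to keep only internal crash
# reports that are corroborated by an entry in the federal FARS database.
from dataclasses import dataclass

@dataclass(frozen=True)
class Crash:
    date: str         # e.g. "2018-03-14"
    state: str        # e.g. "CA"
    county: str       # e.g. "San Francisco"
    fatalities: int

def confirm_fatal_crashes(internal_reports, fars_records):
    """Return internal reports that also appear in FARS at the same date
    and place (a real pipeline would match far more carefully)."""
    fars_index = {(c.date, c.state, c.county) for c in fars_records}
    return [c for c in internal_reports
            if (c.date, c.state, c.county) in fars_index]
```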
As the company is careful to point out up front, more than 36,000 people died in car crashes in the U.S. in 2018 alone, so the total doesn’t mean much on its own. So Uber (as others do in this field) puts those accidents in the context of miles traveled. After all, 1 crash in 100,000 miles doesn’t sound bad because it’s only one, but that works out to 1,000 crashes per 100 million miles; 10 crashes in a billion miles, which is closer to what Uber saw, is just 1 per 100 million, a far better rate despite the larger raw number. To some this is blindingly obvious, but perhaps not to others.
The actual numbers: in 2017, there were 49 “Uber-related” fatalities over 8.2 billion miles, or approximately 0.59 per 100 million miles traveled; in 2018, there were 58 over 10.2 billion, or about 0.57 per 100 million miles. The national average is more than 1.1 per 100 million, so Uber sees about half as many fatalities per mile overall.
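For the record, the arithmetic is simple; here is the rate calculation in Python, using the rounded mileage figures above (so the results can differ from the report’s in the last decimal place):

```python
def rate_per_100m_miles(fatalities: int, miles: float) -> float:
    """Fatalities per 100 million vehicle miles traveled."""
    return fatalities / (miles / 100_000_000)

print(rate_per_100m_miles(49, 8.2e9))    # 2017: ~0.60 (report says 0.59)
print(rate_per_100m_miles(58, 10.2e9))   # 2018: ~0.57
print(rate_per_100m_miles(1, 100_000))   # toy example: 1,000 per 100M miles
print(rate_per_100m_miles(10, 1e9))      # toy example: 1 per 100M miles
```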
These crashes generally occurred at lower speeds than the national average, and were more likely by far to occur at night, in lighted areas of cities. That makes sense, since rideshare services are heavily weighted towards urban environments and shorter, lower-speed trips.
That’s great, but there are a couple of flies in the ointment.
First, obviously, there is no mention whatsoever of non-fatal accidents. These are more difficult to track and categorize, but it seems odd not to include them at all. If the rates of Ubers getting into fender-benders or serious crashes where someone breaks an arm are lower than the national average, as one might expect from the fatality rates, why not say so?
When I asked about this, an Uber spokesperson said that non-fatal crashes are simply not as well defined or tracked, certainly not to the extent fatal crashes are, which makes reporting them consistently difficult. That makes sense, but it still feels like we’re missing an important piece here. Fatal accidents are comparatively rare and the data corpus on non-fatal accidents may provide other insights.
Second, Uber has its own definition of what constitutes an “Uber-related” crash. Naturally enough, this includes any time a driver is on the way to pick up a rider or has a rider in the car. All the miles and crashes counted above occurred either en route to a pickup or during a ride.
But it’s well known that drivers also spend a non-trivial amount of time “deadheading,” or cruising around waiting to be hailed. Exactly how much time is difficult to estimate, as it would differ widely based on time of day, but I don’t think that Uber’s decision to exclude this time is correct. After all, taxi drivers are still on the clock when they are cruising for fares, and Uber drivers must travel to and from destinations, keep moving to get to hot spots, and so on. Driving without a passenger in the car is inarguably a major part of being an Uber driver.
It’s entirely possible that the time spent deadheading isn’t much, and that the accidents that occurred during that time are few in number. But the opposite is also possible, and I think it’s important for Uber to disclose this data; cities and riders alike are concerned with the effects of ride-hail services on traffic and such, and the cars don’t simply disappear or stop getting into accidents when they’re not hired.
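To see why the omission matters, consider a back-of-the-envelope sensitivity check. The deadheading figures below are invented purely for illustration, since Uber hasn’t disclosed them; the point is only that the reported rate could move meaningfully in either direction once those miles and any crashes during them are counted:

```python
def rate_per_100m_miles(fatalities, miles):
    return fatalities / (miles / 100_000_000)

trip_fatalities, trip_miles = 58, 10.2e9   # 2018 figures from the report

# Assumed (not disclosed) deadheading shares and fatality counts.
for deadhead_share in (0.2, 0.4):
    for extra_fatalities in (0, 10, 20):
        total_miles = trip_miles * (1 + deadhead_share)
        rate = rate_per_100m_miles(trip_fatalities + extra_fatalities,
                                   total_miles)
        print(f"{deadhead_share:.0%} extra miles, {extra_fatalities} "
              f"extra deaths -> {rate:.2f} per 100M miles")
```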
When I asked Uber about this, a spokesperson said that crash data from trips is “more reliable,” since drivers may not report a crash if they’re not driving someone. That doesn’t seem right either, especially for fatal accidents, which would be reported one way or the other. Furthermore, Uber would be able to compare FARS data to its internal record of whether a driver involved in a crash was online or not, so the data should be similarly, if not identically, reliable.
The spokesperson also explained that a driver may be “online” in Uber at a given moment but in fact driving someone around using another rideshare service, like Lyft. If so, and there is an accident, the report would almost certainly go to that other service. That’s understandable, but again it feels like this is a missing piece. At any rate it doesn’t juice the numbers at all, since deadheading miles aren’t included in the totals used above. So “online but not hired” miles will remain a sort of blind spot for now.
Written by Devin Coldewey
This article first appeared on TechCrunch under the title “Uber’s fatal accident tally shows low rates but excludes key numbers”: https://techcrunch.com/2019/12/05/ubers-fatal-accident-tally-shows-low-rates-but-excludes-key-numbers/