
Uber doesn't have the greatest reputation when it comes to self-driving car safety, having been unceremoniously kicked out of San Francisco in 2016, and scaling back its tests in 2017 after a crash in Arizona. In the most recent incident, an Uber test vehicle, in autonomous mode with a safety driver behind the wheel, struck and killed a pedestrian in Tempe, Ariz., marking what seems to be the first known case of that happening. The incident was reported at 10pm, so it occurred at night.

There is not nearly enough information to draw conclusions. But the known facts that the car was in autonomous mode and struck a pedestrian are damaging to Uber. They also don't help the case for self-driving cars in the minds of the public. Uber has halted all of its self-driving tests on public roads in response, and the NTSB is sending a team to investigate. Supporters of deploying autonomous vehicles will point out that mile-for-mile they have actually been relatively safe (at least with a safety driver behind the wheel).

However, that won't stop a frenzy of "I told you so" comments from naysayers. Before we rush to judgment, though, there are some key facts that need to be sorted out. In the case of the man killed while driving his Tesla in autonomous mode, it was months before all the facts were in, and there are still some open questions. Hopefully this incident will be clarified sooner than that, but here are the questions we need answered first:

What About the Vehicle Safety Systems?

There is a great deal of confusion about the role of various automated systems in a car. Several times, including the infamous Tesla fatality in Florida, "autonomous" features have been blamed. However, both the Tesla and, likely, the Uber vehicle in this case are equipped with much more standard collision-avoidance systems that are designed to brake before they hit something. So clearly the Automatic Emergency Braking (AEB) system needs to be looked at in this case.
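To illustrate the kind of logic involved: many collision-avoidance systems are built around a time-to-collision (TTC) estimate, braking automatically when the TTC to a detected object drops below a threshold. The toy sketch below shows that idea only; real AEB systems fuse radar, lidar, and camera data, and every name and threshold here is a hypothetical placeholder, not Uber's or anyone's actual implementation.

```python
def should_brake(distance_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """Toy AEB check: brake if time-to-collision falls below a threshold.

    distance_m: current distance to the detected object, in meters.
    closing_speed_mps: rate at which that distance is shrinking, in m/s.
    """
    if closing_speed_mps <= 0:
        return False  # object is not getting closer; no intervention needed
    ttc = distance_m / closing_speed_mps
    return ttc < ttc_threshold_s

# Pedestrian 20 m ahead, car closing at 15 m/s -> TTC is about 1.33 s
print(should_brake(20.0, 15.0))   # True: below the 1.5 s threshold
print(should_brake(100.0, 15.0))  # False: roughly 6.7 s away
```

The hard part in practice is not this arithmetic but reliably detecting the pedestrian at night in the first place, which is exactly what investigators will want to examine.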

How Did the Safety Driver React?

I really don't envy the job of being a safety driver. You need to sit behind the wheel ready to take over mile after mile, but might spend long periods of time doing nothing. Then you might have just a second or two to suddenly re-engage with the vehicle and take corrective action. This issue is part of why some, like Google and Cruise, have argued that sometimes-automated vehicles aren't really the way to go, and that we need complete Level 5 autonomy (where the car never needs a human) to ensure safety. Of course, it needs to actually work!

Along with how the particular safety driver involved behaved in this situation, it'll be important to look at Uber's policies for training and evaluating its drivers, as well as the rules it provides for safe operation of the vehicle.

What Part Did the Pedestrian Play?

One reason pedestrian fatalities are on the rise appears to be distracted walkers. Smartphones are the primary new culprit, just as they are for drivers. Now, that isn't an excuse for hitting someone, but as anyone who drives knows, if someone suddenly dashes out in front of you, your options are pretty limited. The woman who was killed was also not in the crosswalk, although certainly that's a common-enough practice that an autonomous vehicle should be able to deal with.

Is It Self-Driving or Is It Uber?

Any crash involving a potentially-autonomous vehicle generates a lot of interest and early headlines, along with Twitter feeds full of experts theorizing on what happened. But in this case, like the others, we'll need to get some key questions answered before any real conclusions can be drawn. Some of those conclusions may change the way we regulate the testing and deployment of self-driving vehicles. If, for instance, there are issues in Uber's policies or procedures that contributed to the crash, then they'll likely result in tighter regulation and enforcement. Since self-driving cars are heavily instrumented, including carrying multiple cameras, hopefully there will be enough data to get to the bottom of the incident fairly quickly.