
When it Comes to Autonomous Cars, the Department of Transportation Says ‘Drivers’ Don’t Have to Be Human

The Department of Transportation is getting a little more creative about how it defines “driver,” Secretary Elaine Chao announced Thursday. In the third version of its official guidance on self-driving, the department said it would “adapt the definitions of ‘driver’ and ‘operator’ to recognize that such terms do not refer exclusively to a human, but may in fact include an automated system.” The computers have a ticket to drive now—at least where federal regulations are concerned.

And while this is good news for everyone working on building, and eventually deploying, self-driving vehicles, it’s especially welcome for the automated trucking crowd. Waymo, Daimler, Volvo, Embark Trucks, Kache.ai, Starsky Robotics, Kodiak Robotics, TuSimple, Ike: Automated trucking companies have boomed this year, even after Uber got out of the trucking race. And all these VC-funded outfits would one day like to use their robot vehicles to haul the 50 million tons of freight shipped on American highways each day.

To get there, though, the trucks have to be legal drivers, all by themselves. And they have to be able to drive everywhere freight goes. (So, everywhere.) This new guidance, which is the first to address automated trucks and buses specifically, looks to be the initial step in making that possible.

“We have a much clearer roadmap now,” says Jonny Morris, who heads up public policy at Embark. (Embark is shipping commercial loads with test trucks in California and Arizona, though drivers behind steering wheels monitor the technology during each drive.) “We’re starting from a point where the DOT is acknowledging that what we’re trying to do is generally allowable under existing regulations.”

The guidance is an especially handy thing for truck developers because trucks are much more likely than robotaxis to cross state lines. If there’s a single federal regulation for all highways, it will be much easier for these nascent companies to strike deals to ship goods all over the US. Today, different state rules create a patchwork of self-driving laws, where automated vehicles are welcomed enthusiastically in some states (Florida, Arizona) and require special licenses, permissions, permits, or testing parameters in others (California, Nevada, New York).

The DOT also announced in its guidance an “advance notice of proposed rulemaking” for automated driving systems, on both passenger cars and trucks. Basically, that’s a heads-up that the DOT will very soon start to solicit the public’s opinions on how the technology should be governed.

The goal here, the DOT says, is to guarantee road safety while ensuring that the federal government’s vehicle design standards don’t get in the way of self-driving vehicles. Today, anything on wheels is required by law to have features that won’t do much if the computer is driving the car: steering wheels, gas pedals, rear-view and side mirrors. Manufacturers have to apply for specialized exemptions if their vehicles don’t have those elements, and only a certain number of exemptions are available each year. As automated vehicle developers like Waymo and GM prepare to launch their own robotaxi services, they would love for those requirements to disappear—ASAP. (For now, DOT has pledged to “streamline” this exemption process, though it will need Congress to pass new legislation to hike the number of exemptions available each year.)

“These rulemakings could matter a lot, and the devil will be in the details,” says Bryant Walker Smith, a professor at the University of South Carolina School of Law who studies automated vehicle policy. In other words: A lot might be about to change in the world of vehicle regulation. We just don’t know what yet.

In the interim, though, the DOT on Thursday strongly reasserted its authority to order off the road any technology it finds unsafe.

The DOT also announced today that it will work with the Departments of Labor, Commerce, and Health and Human Services to formally study how automated vehicle tech might affect the workforce—including truckers—and what sorts of skills the workforce will need to excel in a robotic future. A recent study by labor economists concluded that automated vehicles won’t begin to seriously displace workers until the mid-2040s, and that even the losses then will be relatively minimal. Still, the economists warned, now is the time to start preparing the workforce for the disappearance of some trucking jobs. And the federal government has begun to heed the call.

For now, Morris says Embark will wait to see how this new guidance document works in real life, meaning the company won’t yet start to test in places where lawmakers haven’t wanted them. While the feds insist that they have the power to preempt state regulations of automated vehicle technology, expect states to at least have a voice in the testing process moving forward. At this point, everyone still wants to be on friendly terms as they welcome the robots.



Tesla's Autopilot Report Makes Big Safety Claims With Little Context

Tesla has published its first voluntary “Vehicle Safety Report,” and the numbers seem to clearly back CEO Elon Musk’s assertion that drivers who use Tesla’s sort-of-self-driving Autopilot feature are involved in fewer crashes than those who turn it off, and far fewer crashes than the general driving population. But without more detail, the numbers mean little.

In the report, Tesla says that between July and September of this year, it “registered one accident or crash-like event for every 3.34 million miles driven in which drivers had Autopilot engaged.” Tesla drivers not using Autopilot went 1.92 million miles between incidents.

The company equates “crash-like events” with near misses but didn’t respond to a request for more detail on what that means. The report offers no insight into the severity of the crashes, whether anyone involved was injured, what may have caused the crashes, or where and when they happened.

Tesla’s Autopilot cleverly combines adaptive cruise control, which maintains a set distance from the car in front even if it slows down, and steering assistance, which keeps the car between painted lane markings. Tesla stresses both features are intended for use in limited circumstances. “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver,” the Model S’s manual says. Although the system can be engaged anywhere, that means drivers using Autopilot should technically stick to roads like freeways—routes free of intersections, pedestrians, cyclists, and other complicating factors. Drivers not using it might be on crowded city streets or twisty country roads, making the comparison meaningless without that extra information.

Since releasing Autopilot in 2014, Tesla has faced criticism that the system makes drivers overly confident in its abilities, lulling them into a dangerous sense of complacency. At least two people have died in crashes that occurred while Autopilot was engaged. Three Teslas have crashed into stopped fire trucks in 2018 alone (all the drivers survived without serious injury). Musk has tangled with the National Transportation Safety Board over its investigations into Autopilot crashes and attacked critics of the system during a May earnings call.

“It’s really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,” Musk said. “Because people might actually turn it off, and then die.” During that same call, he promised Tesla would start releasing these quarterly reports.

The safety report sets those miles-per-incident figures against data from the National Highway Traffic Safety Administration. It says NHTSA figures show “there is an automobile crash every 492,000 miles.” (Tesla apparently used the NHTSA’s public database to derive this number.) By that math, drivers in other manufacturers’ cars crash nearly seven times as often as drivers using Autopilot, and nearly four times as often as Tesla drivers with it switched off.
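For anyone who wants to check that math, here’s a quick back-of-the-envelope sketch using the figures quoted above (the variable names are ours, not Tesla’s):

```python
# Back-of-the-envelope check of the crash-rate ratios in Tesla's report.
# Crashes per mile are the reciprocals of miles per incident, so comparing
# crash rates means comparing the miles-per-incident figures, inverted.
miles_per_incident_autopilot = 3_340_000     # Tesla, Autopilot engaged, Q3 2018
miles_per_incident_no_autopilot = 1_920_000  # Tesla, Autopilot off
miles_per_crash_nhtsa = 492_000              # Tesla's reading of NHTSA data

print(miles_per_incident_autopilot / miles_per_crash_nhtsa)     # ~6.8: "nearly seven times"
print(miles_per_incident_no_autopilot / miles_per_crash_nhtsa)  # ~3.9: "nearly four times"
```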

But again, a closer look raises questions. A broad comparison of Tesla with everyone else on the road doesn’t account for the type of car or driver demographics, just for starters. A more rigorous statistical analysis could separate daytime versus nighttime crashes, drunk drivers versus sober, clear skies versus snow, new cars versus clunkers, and so on. More context, more insight.
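To make that concrete, here’s a minimal sketch of what a stratified comparison might look like. Every number and stratum below is invented purely for illustration; the point is the method of comparing like with like, not the values:

```python
# Hypothetical stratified crash-rate comparison. All figures are invented.
# Rates are computed within like-for-like buckets (road type, time of day,
# weather) instead of pooling every mile and every crash together.
strata = {
    # stratum: (autopilot_miles, autopilot_crashes, other_miles, other_crashes)
    "freeway / day / clear": (2_500_000, 1, 40_000_000, 60),
    "city street / night":   (  200_000, 1, 25_000_000, 90),
}

for name, (ap_miles, ap_crashes, o_miles, o_crashes) in strata.items():
    ap_rate = ap_crashes / ap_miles    # crashes per mile, Autopilot engaged
    other_rate = o_crashes / o_miles   # crashes per mile, everyone else
    print(f"{name}: Autopilot {ap_rate:.2e} vs. others {other_rate:.2e} crashes per mile")
```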

“It’s silly to call it a vehicle safety report,” says David Friedman, a former NHTSA official who now directs advocacy for Consumer Reports. “It’s a couple of data points which are clearly being released in order to try to back up previous statements, but it’s missing all the context and detail that you need.”

Tesla’s one-page report comes the day after Consumer Reports published its comparison of “semiautonomous” systems that let drivers take their hands off the wheel but require them to keep their eyes on the road. That ranking put Cadillac’s Super Cruise in first place and Autopilot in second, followed by Nissan’s ProPilot Assist and Volvo’s Pilot Assist. It evaluated each system on its driving ability as well as on how it ensures the human keeps monitoring the car.

In response to those criticisms that Autopilot lulls users into trusting it too much, Tesla has recently used over-the-air software updates to ratchet up how often the human must touch the steering wheel to confirm they’re still alive and concentrating. Cadillac’s approach is more sophisticated: It uses an infrared camera to ensure the driver’s head is pointed at the road (instead of down at a phone), allowing for a truly hands-off system. (Audi’s Traffic Jam Pilot uses a gaze-tracking setup that allows a driver to look away in certain conditions, but it isn’t available in the US.)

Other safety-minded groups, including the IIHS and the UK’s Thatcham, are designing their own tests for these increasingly popular features, acknowledging they’re all flawed.

Tesla does have an excellent safety record when it comes to crash testing. In September the NHTSA awarded the Model 3, Tesla’s newest car, five stars in every category. The Model X SUV got the same commendation, and when the Model S sedan was tested in 2013, it proved so strong it broke the test equipment.

And it could be that Autopilot is making highway driving safer, perhaps by reducing driver fatigue or cutting down on rear-end collisions. But this report isn’t enough to show that. Friedman says he was hoping for more. He wants Tesla to give its data to an academic, who could do a rigorous, independent statistical analysis. “If the data shows that Autopilot is delivering a safety benefit, then that’s great.”

Tesla is unique among automakers in releasing this type of data at all, and going forward it could expand on it to make it more useful. The company’s blog post with the latest statistics says it “introduced a completely new telemetry stream for our vehicles to facilitate these reports.”

And the size of its fleet is growing fast, as Tesla ramps up production of the Model 3. Its delivery numbers released on Tuesday show it put 83,500 cars in new driveways in the same quarter its safety figures cover. That means there’s going to be a lot more data to analyze in the future.

Tesla has always moved faster than the mainstream auto industry and deserves credit for accelerating the adoption of electric driving, over-the-air software updates, and self-driving features. But if it wants to be congratulated for making roads safer, it has to cough up more data.

