Uber’s Crash and the Folly of Humans Training Self-Driving Cars

Video shows that the car’s operator, Rafaela Vasquez, wasn’t watching the road in the moments leading up to the crash. She told National Transportation Safety Board investigators she was looking at the system’s interface, which is built into the center console. But police investigations revealed that an episode of the TV show The Voice was streaming on Vasquez’s Hulu account right up to the moment of the crash.

And so, along with the entire notion that robots can be safer drivers than humans, the crash casts doubt on a fundamental tenet of this nascent industry: that the best way to keep everyone safe in these early years is to have humans sitting in the driver’s seat, ready to leap into action.

Maybe Drivers

Dozens of companies are developing autonomous driving technology in the United States. They all rely on human safety drivers as backups. The odd thing about that reliance is that it belies one of the key reasons so many people are working on this technology. We are good drivers when we’re vigilant. But we’re terrible at being vigilant. We get distracted and tired. We drink and do drugs. We kill 40,000 people on US roads every year, and more than a million worldwide. Self-driving cars are supposed to fix that. But if we can’t be trusted to watch the road when we’re actually driving, how did anyone think we’d be good at it when the robot’s doing nearly all the work?

“Of course this was gonna be a problem,” says Missy Cummings, the director of the Humans and Autonomy Laboratory and Duke Robotics at Duke University. “Your brain doesn’t like to sit idle. It is painful.”

In 2015, Cummings and fellow researchers ran their own test. “We put people in a really boring, four-lane-highway driving simulator for four hours, to see how fast people would mentally check out,” she says. On average, people dropped their guard after 20 minutes. In some cases, it took just eight minutes.

Everyone developing self-driving tech knows how bad humans are at focusing on the road. That’s why many automakers have declined to develop semiautonomous tech, where a car drives itself in a simple scenario like highway cruising, but needs a person to supervise and grab the wheel when trouble seems imminent. That kind of system conjures the handoff problem, and as Volvo’s head of safety and driver-assist technologies told WIRED in 2016, “That problem’s just too difficult.”

The problem for the companies eager to skip that icky middle ground and go right for a fully driverless car is that they believe the only way to get there is by training on public roads—the testing ground that offers all the vagaries and oddities these machines must master. And the only reasonable approach—from a pragmatic and political point of view—to testing imperfect tech in two-ton vehicles speeding around other people is to have a human supervisor.

“I think, in good faith, people really thought the safety drivers were going to do a good job,” Cummings says. In a rush to move past the oh-so-fallible human, the people developing truly driverless cars doubled down on, yes, the oh-so-fallible human.

That’s why, before letting them on the road, Uber puts its vehicle operators through a three-week training course at its Pittsburgh R&D center. Trainees spend time in a classroom reviewing the technology and the testing protocols, and on the track learning to spot and avoid trouble. They even get a day at a racetrack, practicing emergency maneuvers at highway speeds. They’re taught to keep their hands an inch or two from the steering wheel, and the right foot over the brake. If they simply have to look at their phones, they’re supposed to take control of the car and put it in park first.

Working alone in eight-hour shifts (in Phoenix they earn about $24 an hour), the babysitters are then set loose into the wild. Each day, they get a briefing from an engineer: Here’s where you’ll be driving, here’s what to look for. Maybe this version of the software is acting a bit funky around cyclists, or taking one particular turn a little fast.

And constantly, they are told: Watch the road. Don’t look at your phone. If you’re tired, stop driving. Uber also audits vehicle logs for traffic violations, and it has a full-time employee who does nothing but investigate potential infractions of the rules. Uber has fired drivers caught (by other operators or by people on the street) looking at their phones.
