

Autonomous Vehicles: There’s No Substitute for an Engaged ‘Safety Operator’

It’s simply not realistic or rational, in my opinion, to expect technology to anticipate every possible scenario encountered on the road.


Josh Cable has 17 years of experience as a writer and editor for newspapers, B2B publications and marketing organizations. His areas of expertise include U.S. manufacturing, lean/Six Sigma and workplace safety and health.



After seven months of driving a 2017 SUV equipped with just about every advanced driver-assistance system (ADAS) you can imagine, I can tell you this: Autonomous vehicles are here.

OK, maybe they’re not here here. But they’re darn close.

I already take my ADAS for granted. It’s become part of my daily routine to turn on the adaptive cruise control as soon as I get on the turnpike. If you’re not familiar with it, adaptive cruise control allows you to lock in your desired speed, and then your car or truck will maintain a set distance between you and the vehicle in front of you – automatically braking as needed.

I don’t fully trust ADAS – and I’m not sure that I ever will – but it’s still fascinating to see the technological building blocks of automated vehicles making their way into everyday cars, trucks and SUVs.

Some have said that before we see widespread adoption or even mass production of self-driving vehicles, the technology will have to be perfected. And we know the technology isn’t perfect. I’ve lost count of the number of times this winter that Mother Nature has knocked my vehicle’s ADAS out of commission.

For me, an offline ADAS feature is a mere inconvenience. But when a vehicle is in autonomous mode – and the driver isn’t paying attention – any system failure or design flaw can have serious consequences.


On March 18, an Uber vehicle in autonomous mode plowed into a jaywalking pedestrian, killing her. Video released by the Tempe Police Department indicates that the Volvo XC90 didn’t detect the woman and didn’t slow down. The video also shows that the driver wasn’t paying attention to the road. His reaction at the point of impact tells me that the vehicle didn’t provide any visual or audible alerts in the seconds before the accident.

How do you assess blame in this type of accident? The woman – 49-year-old Elaine Herzberg – was crossing a busy road about 100 meters from the closest pedestrian crosswalk, according to news reports. The Uber “safety operator,” Javier Vasquez, told police that he had no time to react, and the external-camera view seems to support his statement that Herzberg came out of nowhere.

“The driver said it was like a flash, the person walked out in front of them,” Tempe Police Chief Sylvia Moir said in a San Francisco Chronicle article. “His first alert to the collision was the sound of the collision.”


Based on the video, Moir also asserted that “it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway.”

The investigation might conclude otherwise, but I think Moir makes a reasonable assertion. Still, it’s a setback for Uber and other companies hoping to bring self-driving vehicles to market as soon as humanly possible.

“Herzberg’s death is the first reported incident of a pedestrian killed by a self-driving car, and raises questions about whether such vehicles are ready to operate autonomously on public roads,” Larry Greenemeier wrote in an article for Scientific American. “The vehicle’s cameras and other sensors apparently did not detect the victim and made no attempt to brake or otherwise avoid her.”

They should have. Self-driving vehicles are designed to “yield to a person crossing a road,” even if the pedestrian is jaywalking, a Carnegie Mellon University professor told Greenemeier. The Volvo’s failure to detect Herzberg is puzzling, the professor added.

Before judging Herzberg for jaywalking – a poor decision, to be sure – there’s another angle to consider here: Uber and other companies are testing self-driving vehicles on public roads, which makes the rest of us guinea pigs.


“You have to test autonomous vehicles on public roads with real cyclists, real pedestrians, innocent people that didn’t sign up for this stuff,” Morgan Stanley analyst Adam Jonas said on CNBC’s “Squawk Alley.”

I think that’s a profound statement. More testing of autonomous vehicles on public roads could increase the likelihood of innocent bystanders being injured or killed. The harsh reality is that those bystanders will pay for the development of self-driving vehicles with their blood, as manufacturers learn from accidents and tweak the technology.

Legal and Ethical Hurdles

Morgan Stanley’s Jonas believes that legal and ethical concerns are the biggest hurdles to autonomous vehicles. I think he’s right. Even if the tragedy in Tempe turns out to be a temporary setback, imagine the public outcry if an automated vehicle kills or injures a child. Manufacturers can talk about preventing collisions and saving lives as much as they want. If there are more fatalities during the testing phase, it will only fuel the narrative that companies will stop at nothing to cash in on this technology.

It will be interesting to see how it plays out. My opinion is that it’s going to be at least a decade before we see mass production and adoption of fully autonomous vehicles. I could be wrong. But I think the most viable near-term future for this technology is using it in conjunction with an engaged “safety operator” at the wheel (emphasis on engaged).


If Vasquez actually had been paying attention to the road, could he have prevented this tragedy? I don’t know. But absent any help from the vehicle’s pedestrian-detection sensors, he had no chance of avoiding Herzberg, because he was looking downward (at a smartphone, I’m guessing).

Don’t get me wrong: ADAS is amazing technology. But as wonderful as my blind-spot detection monitors are, I still can’t bring myself to change lanes without an over-the-shoulder glance at the road. Blind-spot detection provides reassurance that there’s nobody in my blind spot, but it’s not enough by itself. Maybe I’m just old-school.

My point is this: Until this technology is perfected (if that’s even possible), drivers need to be alert and engaged – even when the vehicle is in autonomous mode. It’s simply not realistic or rational, in my opinion, to expect technology to anticipate every possible scenario encountered on the road.

Often during my 50-mile commute, I daydream about what it would be like to someday put my car on autopilot and catch a few extra winks before starting the workday. The tragedy in Tempe makes me think it could be a while before that happens.
