Ford Motor Co. has become convinced that full autonomy is the necessary route to safe self-driving cars.

Ford engineers falling asleep during autonomous vehicle test rides has made a convincing argument for fully autonomous vehicles, reports Automotive News.

Engineers had dozed off before during long test rides, and Ford had tried several sound and vibration alerts to keep them awake.

The challenge has been that automated rides can become too relaxing for human drivers. Engineers have felt comfortable enough to fall asleep, removing the human safety backstop.

“These are trained engineers who are there to observe what’s happening,” said Raj Nair, Ford’s product development chief. “But it’s human nature that you start trusting the vehicle more and more and that you feel you don’t need to be paying attention.”

Ford has become even more committed to going the fully autonomous route, which automotive engineers classify as Level 5, while other automakers are taking a more cautious approach. Ford believes in the new technology enough to remove the steering wheel and the brake and accelerator pedals from the self-driving cars it plans to introduce in 2021.

That puts Ford Motor Co. much more in line with Google’s Waymo self-driving car division than with several other automakers. Waymo’s extensive testing produced similar findings about human attention, strengthening the company’s support for fully autonomous vehicles.

Other automakers have chosen Level 3 semi-autonomous features for their next wave of launches. Audi, BMW, and Mercedes-Benz plan to introduce semi-autonomous cars next year that require drivers to take over after receiving a notification, with as little as 10 seconds of warning. General Motors and Toyota have also supported a similar, more cautious route.

Going the semi-autonomous, Level 3 route may cause more problems than expected if drivers become lulled into turning control over to the car completely.

“There’s evidence to suggest that Level 3 may show an increase in traffic crashes,” Nidhi Kalra, co-director of the Rand Center for Decision Making Under Uncertainty, said this week during a U.S. congressional hearing. “I don’t think there’s enough evidence to suggest that it should be prohibited at this time, but it does pose safety concerns.”

Audi considers Level 3 features a more gradual approach that will prepare vehicle owners to eventually adopt fully automated cars. That is the premise behind Audi’s introduction next year of Traffic Jam Pilot, a Level 3 system that will allow hands-free driving at up to 35 miles per hour. Drivers will be given 10 seconds’ notice to take back control if the Audi’s sensors detect a potentially dangerous scenario. If the driver doesn’t respond, the car is programmed to come to a stop in its lane.


Volvo Cars CEO Hakan Samuelsson is leaning more toward the route Ford and Waymo are adopting. No sensor is good enough to save a distracted driver, he said.

“We don’t believe in five seconds, 10 seconds,” Samuelsson said. “It could even be dangerous. If you are doing something else, research shows that it will take two minutes or more before you can come back and take over. And that’s absolutely impossible. That really rules out Level 3.”

Volvo will roll out a self-driving system in 2020 that won’t require human intervention. That approach is being tested with Uber through an automated taxi service using self-driving Volvo XC90 crossover vehicles. The steering wheel can be tucked away in these test vehicles, but drivers can bring it out again to steer the XC90 if they prefer.

One of the major issues causing automakers and safety regulators to take a gradual and cautious approach with autonomous vehicles is legal liability. The question always comes up: If there’s a fatal crash involving a self-driving car, who will be considered responsible?

Volvo has committed to take responsibility for any crashes caused by its self-driving vehicles. Samuelsson has been concerned that Level 3 semi-autonomous vehicles run the risk of creating confusion over which party would be legally liable in a crash.

“It should be black and white,” Samuelsson said. “With responsibility, you cannot tell anybody you are a bit responsible. Either you are responsible or you are not.”
