There’s a new generation of robots ready to make daily tasks easier, and some of them have already arrived.

Twenty years ago, people were lucky to have cruise control in their cars, but the latest models can help you stay centered in your lane, brake faster than humanly possible if the vehicle in front stops abruptly, and override your poor braking technique on an icy surface in favor of their own anti-lock systems.

That’s just the beginning. Autonomous cars are already navigating the open road, and they’ve brought with them one glaring problem engineers are keen to gloss over: Cars without humans are amoral. They have no ability to decide between running over the family dog and running over a group of children in the course of a parking, driving or lane-changing task.

The engineers in charge are well aware of the problem, of course, and are busy trying to develop software that can make sound split-second driving decisions when trouble emerges.

But if your brand-new car is busy parallel parking and a toddler steps into its path to grab a ball that has rolled into the street, will the software respond correctly?


There’s something really arrogant about humans assuming we can properly analyze the extraordinarily complex task of driving and solve the problem with robots that have no ability to make moral judgments. Moreover, with most other vehicles on the road still driven by humans for the foreseeable future, having a fleet of conscienceless cars in the mix introduces the possibility that humans will behave even more aggressively.

Split-second decision-making while driving is extraordinarily complicated for humans. Is it safe to change lanes? Is the dog chasing the car? Do you drive into the ditch to avoid hitting the deer? Your brakes have failed: Do you smash into the car in front of you or veer onto the sidewalk, hoping people will get out of the way in time? Self-driving cars simply don’t have the capacity to weigh the value of a life, or the risks of a maneuver that may be advisable under some circumstances but not others.

The list of decision points required to drive safely is endless, and it takes many people years of practice before they have a reasonable handle on the task.

Reflecting this challenge, automotive researchers now say autonomous cars may need to be worse drivers in order to be safe and effective on the roads. Some of the problems arise for improbable reasons, too. One of Google’s autonomous test vehicles was pulled over by a (human) police officer for driving too slowly: 24 mph in a 35-mph zone.


The problem is symptomatic of the flawed perspective of engineers who seem to believe that, as long as driverless cars err on the side of caution, all will be well.

This ignores the fact that humans aren’t entirely virtuous themselves, particularly when placed behind the wheel of an automobile. When an autonomous car leaves an extra bit of distance between itself and the car in front, human drivers are naturally inclined to take advantage of the cautious approach and slip into the space.

If robot cars are expected to drive slowly, humans can be expected to pass them. Probably on a curve.

It’s already happening. Google cars keep getting rear-ended because they come to a complete stop at stop signs and then wait a few seconds before moving forward. The human drivers behind them aren’t accustomed to such overly judicious driving, and accidents occur.

Autonomous vehicles are evidently here to stay, and every adaptive cruise-control feature and self-parking button is a step in that direction. Without an actual driver at the wheel, however, the ways human drivers compensate are certain to bring unexpected challenges as software engineers work out the bugs of an increasingly complex world.

Until the roads belong exclusively to autonomous cars, this transitional period will remain dangerous for drivers and pedestrians alike. Engineers would do well to consider their own hubris in sending onto the open road a fleet of machines that have no capacity for moral judgment.