The SAE (formerly the Society of Automotive Engineers, now just SAE International) uses the term automated instead of autonomous. One reason is that the word autonomy has implications beyond the electromechanical. A fully autonomous car would be self-aware and capable of making its own choices. For example, you say “drive me to work” but the car decides to take you to the beach instead. A fully automated car, however, would follow orders and then drive itself.
The term self-driving is often used interchangeably with autonomous. However, it’s a slightly different thing. A self-driving car can drive itself in some or even all situations, but a human passenger must always be present and ready to take control. Self-driving cars would fall under Level 3 (conditional driving automation) or Level 4 (high driving automation). They are subject to geofencing, unlike a fully autonomous Level 5 car that could go anywhere.
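To keep these terms straight, the sketch below (in Python, purely for illustration) lists the six SAE J3016 levels and the two distinctions discussed above: whether a human fallback driver is still needed and whether the system is geofenced. The enum and helper functions are illustrative, not taken from any real vehicle software.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 levels of driving automation (summarized)."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed support (e.g., adaptive cruise)
    PARTIAL_AUTOMATION = 2      # steering AND speed support; driver still supervises
    CONDITIONAL_AUTOMATION = 3  # car drives itself, but a human must take over on request
    HIGH_AUTOMATION = 4         # no takeover needed, but only inside a geofenced area
    FULL_AUTOMATION = 5         # can drive anywhere a human driver could

def needs_human_fallback(level: SAELevel) -> bool:
    """Through Level 3, a ready human driver is still part of the system."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

def is_geofenced(level: SAELevel) -> bool:
    """'Self-driving' Levels 3 and 4 operate only in defined areas; Level 5 does not."""
    return level in (SAELevel.CONDITIONAL_AUTOMATION, SAELevel.HIGH_AUTOMATION)

for level in SAELevel:
    print(f"Level {level.value}: fallback driver needed = {needs_human_fallback(level)}, "
          f"geofenced = {is_geofenced(level)}")
```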
How do autonomous cars work?
Autonomous cars rely on sensors, actuators, complex algorithms, machine learning systems, and powerful processors to execute software.
These cars create and maintain a map of their surroundings based on a variety of sensors situated in different parts of the vehicle. Radar sensors monitor the position of nearby vehicles. Video cameras detect traffic lights, read road signs, track other vehicles, and look for pedestrians. Lidar (light detection and ranging) sensors bounce pulses of light off the car’s surroundings to measure distances, detect road edges, and identify lane markings. Ultrasonic sensors in the bumpers detect curbs and other vehicles when parking.
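As a rough picture of how those sensor streams might be combined into a single map of the surroundings, here is a simplified Python sketch. Every class name and field is hypothetical; production systems use probabilistic sensor fusion and far richer data formats.

```python
from dataclasses import dataclass, field

@dataclass
class RadarReading:
    distance_m: float           # range to a nearby vehicle
    relative_speed_mps: float   # closing speed

@dataclass
class LidarPoint:
    x: float                    # metres, in vehicle-centred coordinates
    y: float
    z: float

@dataclass
class CameraDetection:
    label: str                  # e.g., "pedestrian", "traffic_light", "stop_sign"
    confidence: float

@dataclass
class EnvironmentMap:
    """A toy 'map of the surroundings' assembled from the sensor inputs."""
    obstacles: list = field(default_factory=list)    # lidar points
    detections: list = field(default_factory=list)   # camera labels
    nearest_vehicle_m: float = float("inf")          # from radar
    parking_clearance_m: float = float("inf")        # from ultrasonic sensors

    def update(self, radar, lidar, camera, ultrasonic_m):
        # Real systems fuse these probabilistically (e.g., with Kalman filters);
        # here we simply collect each sensor's contribution into one structure.
        self.obstacles = [(p.x, p.y, p.z) for p in lidar]
        self.detections = [(d.label, d.confidence) for d in camera]
        self.nearest_vehicle_m = radar.distance_m
        self.parking_clearance_m = ultrasonic_m

env = EnvironmentMap()
env.update(RadarReading(35.0, -2.0),
           [LidarPoint(10.0, 1.2, 0.0)],
           [CameraDetection("pedestrian", 0.94)],
           ultrasonic_m=0.8)
print(env.detections, env.nearest_vehicle_m)
```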
Sophisticated software then processes all this sensory input, plots a path, and sends instructions to the car’s actuators, which control acceleration, braking and steering. Hard-coded rules, obstacle avoidance algorithms, predictive modeling, and object recognition help the software follow traffic rules and navigate obstacles.
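A stripped-down version of that sense-plan-act cycle could look something like the following Python sketch. The function names, thresholds, and 50 Hz loop rate are assumptions chosen for illustration, not details of any real autonomous-driving stack.

```python
import time

def read_sensors() -> dict:
    """Stand-in for fused sensor input (see the sketch above)."""
    return {"nearest_obstacle_m": 42.0, "lane_center_offset_m": 0.3}

def plan(perception: dict, speed_limit_mps: float = 13.9) -> dict:
    """Plot a simple command: slow down near obstacles, steer back to lane centre."""
    gap = perception["nearest_obstacle_m"]
    target_speed = 0.0 if gap < 5.0 else min(speed_limit_mps, gap / 3.0)
    steering = -0.1 * perception["lane_center_offset_m"]
    return {"target_speed_mps": target_speed, "steering_rad": steering}

def actuate(command: dict) -> None:
    """Stand-in for the actuator interface controlling throttle, brake, and steering."""
    print(f"speed -> {command['target_speed_mps']:.1f} m/s, "
          f"steer -> {command['steering_rad']:+.3f} rad")

def control_loop(cycles: int = 3, hz: float = 50.0) -> None:
    """Sense -> plan -> act, repeated; a real stack runs this continuously."""
    for _ in range(cycles):
        actuate(plan(read_sensors()))
        time.sleep(1.0 / hz)

control_loop()
```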
What are the challenges with autonomous cars?
Fully autonomous (Level 5) cars are undergoing testing in several pockets of the world, but none are yet available to the general public. We’re still years away from that. The challenges range from the technological and legislative to the environmental and philosophical. Here are just some of the unknowns.
Lidar and Radar – Lidar is expensive, and manufacturers are still trying to strike the right balance between range and resolution. If multiple autonomous cars were to drive on the same road, would their lidar signals interfere with one another? And if multiple radio frequencies are available for radar, will the frequency range be enough to support mass production of autonomous cars?
Weather Conditions – What happens when an autonomous car drives in heavy precipitation? If there’s a layer of snow on the road, lane dividers disappear. How will the cameras and sensors track lane markings if the markings are obscured by water, oil, ice, or debris?
Traffic Conditions and Laws – Will autonomous cars have trouble in tunnels or on bridges? How will they do in bumper-to-bumper traffic? Will autonomous cars be relegated to a specific lane? Will they be granted carpool lane access? And what about the fleet of legacy cars still sharing the roadways for the next 20 or 30 years?
State vs. Federal Regulation – The regulatory process in the U.S. has recently shifted from federal guidance to state-by-state mandates for autonomous cars. Some states have even proposed a per-mile tax on autonomous vehicles to prevent the rise of “zombie cars” driving around without passengers. Lawmakers have also written bills proposing that all autonomous cars must be zero-emission vehicles and have a panic button installed. But are the laws going to be different from state to state? Will you be able to cross state lines with an autonomous car?
Accident Liability – Who is liable for accidents caused by an autonomous car? The manufacturer? The human passenger? The latest blueprints suggest that a fully autonomous Level 5 car will not have a dashboard or a steering wheel, so a human passenger would not even have the option to take control of the vehicle in an emergency.
Artificial vs. Emotional Intelligence – Human drivers rely on subtle cues and non-verbal communication, like making eye contact with pedestrians or reading the facial expressions and body language of other drivers, to make split-second judgment calls and predict behaviors. Will autonomous cars be able to replicate this connection? Will they have the same life-saving instincts as human drivers?
What are the benefits of autonomous cars?
The scenarios for convenience and quality-of-life improvements are limitless. The elderly and the physically disabled would have independence. If your kids were at summer camp and forgot their bathing suits and toothbrushes, the car could bring them the missing items. You could even send your dog to a veterinary appointment.
But the real promise of autonomous cars is the potential for dramatically lowering CO2 emissions. In a recent study, experts identified three trends that, if adopted concurrently, would unleash the full potential of autonomous cars: vehicle automation, vehicle electrification, and ridesharing. By 2050, these “three revolutions in urban transportation” could:
Reduce traffic congestion (30% fewer vehicles on the road)
Cut transportation costs by 40% (in terms of vehicles, fuel and infrastructure)
Improve walkability and livability
Free up parking lots for other uses (schools, parks, community centers)
Reduce urban CO2 emissions by 80% worldwide
Right now, it’s a watch-and-wait scenario, as there are many kinks still to be worked out. But don’t be surprised if, within our lifetime, autonomous cars are driving right alongside us.
(Source: Synopsys)