Driverless cars could be ‘RACIST’ when detecting pedestrians on the road, according to new study
Research into models similar to those used in driverless cars found lighter-skinned people were five per cent more likely to be detected by a vehicle than those with darker skin
DRIVERLESS cars could have an issue with racial bias when it comes to identifying pedestrians on the road, according to a new study.
Research conducted by the Georgia Institute of Technology found autonomous car technology could have problems detecting people with darker skin tones.
The results could mean lighter-skinned people have less chance of being hit by a driverless vehicle than those with darker skin.
Researchers analysed how object detection models like those used in autonomous cars identify people from different demographic groups.
Using a range of images of people with different skin tones, the study found the technology was, on average, five per cent less accurate at detecting pedestrians in the darker-skinned group.
And even when variables like the time of day or the partial obstruction of a pedestrian were altered, examples of people with darker skin weren't as easily detected.
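The comparison the researchers describe boils down to measuring detection rates separately for each skin-tone group and looking at the gap. The sketch below illustrates that calculation only; the detection results in it are hypothetical and invented purely for illustration, not the study's data.

```python
# Hypothetical per-pedestrian detection outcomes, grouped by skin tone.
# These numbers are made up to illustrate the calculation, nothing more.
detections = [
    # (skin_tone_group, was_pedestrian_detected)
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", True), ("darker", False), ("darker", False),
]

def detection_rate(results, group):
    """Fraction of pedestrians in `group` that the model detected."""
    outcomes = [hit for g, hit in results if g == group]
    return sum(outcomes) / len(outcomes)

lighter = detection_rate(detections, "lighter")
darker = detection_rate(detections, "darker")
print(f"lighter-skinned group: {lighter:.0%} detected")
print(f"darker-skinned group:  {darker:.0%} detected")
print(f"gap: {lighter - darker:.0%}")
```

In the study's case, that gap averaged around five percentage points in favour of the lighter-skinned group, and it persisted across lighting conditions and partial obstruction.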
The levels of autonomous driving
Level 1: The first level of autonomy means the driver remains in control of the car the entire time. Steering and acceleration can be controlled by the car. These systems are already on sale and include self-parking and lane assist
Level 2: These require drivers to pay attention to their surroundings and be prepared to "take control of the vehicle in specific situations". Drivers have to keep their hands on the wheel just in case, too.
Level 3: These cars can make decisions for themselves without the need for driver inputs in certain situations. The Audi A8 can offer this tech - regulation permitting - with the ability to drive itself up to 37mph.
Level 4: These are true "driverless cars" that can navigate without any driver help and can independently indicate, brake and steer. These won't be on sale until 2021 at the earliest.
Level 5: The end goal is a car that doesn't need a driver at all - there might not even be pedals or a steering wheel.
The technological bias could be the fault of the way models are developed, as systems may be trained using examples of light-skinned pedestrians, meaning they are simply acting on what they have learned to identify.
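If a training set contains far more light-skinned pedestrians, a model simply learns what it sees most often. The sketch below illustrates that kind of imbalance, and one common mitigation of weighting rare groups more heavily during training; the counts are hypothetical, chosen only to show the arithmetic.

```python
from collections import Counter

# Hypothetical training-set composition: a 9:1 split between groups.
training_labels = ["lighter"] * 900 + ["darker"] * 100

counts = Counter(training_labels)
total = len(training_labels)

# One common mitigation: weight each example inversely to its group's
# frequency, so under-represented groups contribute equally to training.
weights = {group: total / (len(counts) * n) for group, n in counts.items()}

print(counts)   # which group dominates the training data
print(weights)  # the rarer group gets the larger weight
```

Reweighting is only one option; researchers also suggest simply collecting more examples of under-represented groups.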
The study also didn't test any detection models actually being used by self-driving cars, as manufacturers haven't made their data available to the public.
Instead, it analysed replica technology available for public access and used by academic researchers.
Potential racial bias isn't the first ethical dilemma faced during the development of autonomous car technology.
A previous social experiment dubbed the Moral Machine asked participants to choose who should be the victim of a driverless car collision, based on age, gender and social status.
And earlier this year, motorists were asked who should be killed if two children ran in front of an autonomous car that was unable to stop.