Study Finds Bias in Facial-Recognition Technology

More and more airports are adding facial-recognition technology, even as recent studies point to rising rates of false identification, mainly among African Americans and Asians.

Why Is This Happening?

Police officers use facial-recognition technology to identify suspects. When the technology misidentifies someone, police may fail to catch the guilty party and may instead arrest an innocent person who had nothing to do with the crime.

What Studies Show

These concerns led the National Institute of Standards and Technology (NIST) to test the technology for bias. The agency found that roughly 10 out of every 100 African-American and Asian faces were falsely identified. White American men generally benefit from higher accuracy rates, while the elderly and children are more likely to be misidentified.

As part of the testing process, NIST evaluated about 18 million photos of people from across the U.S. The results indicate that most facial-recognition systems perform worse depending on a person's age, gender, and race.

Who is Falsely Identified Most?

Asian and African American people are the most likely of any group to be misidentified, and Native Americans are also affected. Depending on the type of search, the results vary heavily among ethnicities, and African American women are falsely matched more often than others in these searches.

What is the Impact?

These findings have drawn controversy and objections from the American Civil Liberties Union (ACLU). Searches like these matter greatly for airports, especially when it comes time for passengers to board. Algorithms developed in the United States showed high error rates in searches for Asians, African Americans, Native Americans, and Pacific Islanders. False identifications also make it easier for impostors to cheat the system.
