Another tool for neo-Nazis

The news is out: [Federal government researchers found evidence of bias against minorities in facial recognition software as its use is set to expand at airport security checkpoints.
The Transportation Security Administration and U.S. Customs and Border Protection have been testing facial recognition technology at airports across the country, expecting it will become the preferred method to verify a passenger’s identity.
The National Institute of Standards and Technology reported this month that facial recognition software showed a higher rate of incorrect matches between two photos for Asian and black people relative to white people.
Researchers studied the performance of 189 algorithms from 99 manufacturers representing most of the industry. Some algorithms performed better than others, they concluded, meaning that it’s likely the industry can correct the problems.
The institute found that U.S.-developed algorithms had the highest rate of incorrect matches, or false positives, for American Indians.
Researchers found a higher rate of false identifications of black women when matching their photos to an FBI mugshot database. Higher rates of mismatches increase the chance that a person could be falsely accused, the institute said.]

This new racist tool can be fixed with a few lawsuits.
