There's a chance you could be arrested for protesting even if you never took part in one -- that's according to a cybersecurity expert.

Police have been using facial recognition technology for years to identify suspects of all kinds, including people taking part in protests. It's an impressive piece of tech, but what happens if the system fails and misidentifies you as a suspect?

According to David Harding, chief technology officer for ImageWare Systems, facial recognition systems in the U.S. were trained primarily on white male faces, which means the likelihood of misidentifying anyone who is not a white man is much higher.

"Anytime you're in an area where they [law enforcement or the government] are using facial recognition, you have to worry about being falsely matched to someone," Harding said. "Or what's even worse, someone being falsely matched to you."

Facial recognition has certainly become a handy tool for American law enforcement. In Minnesota, where protesters flooded the streets demanding justice for George Floyd, local authorities are still using Clearview AI, a facial recognition service that came under fire in early 2020 for scraping people's photos from social media -- a practice that prompted cease-and-desist letters from Twitter and other companies.

Apart from Minnesota, several other states are still using Clearview AI, including New York, while others rely on Amazon's Rekognition software, which has an equally troubled track record.

A peer-reviewed study from the Massachusetts Institute of Technology found that Rekognition performed far worse at classifying female and dark-skinned faces than comparable services. The software misclassified women as men 19% of the time, according to a report by The New York Times, and the error rate climbed once skin color was taken into account: darker-skinned women were labeled as men 31% of the time.

According to Harding, women and people of color are the most vulnerable to misidentification. He emphasizes the huge difference between facial recognition deployed as the sole tool of mass surveillance and facial recognition used by law enforcement in a controlled environment, alongside mug shots, fingerprints, and other evidence, to find a specific suspect. Even so, there are growing concerns that the controversial technology violates human rights, and its use is increasingly being called into question.

Protesters in America are fighting against racial prejudice. Human rights activists argue that subjecting them to state surveillance breaches their human rights and is both discriminatory and authoritarian.