There is no doubt that facial recognition is becoming an important force in this era of technology. After all, China has already begun implementing a nationwide system that uses facial recognition and artificial intelligence to watch its citizens' every move and decision. It doesn't just monitor criminal activity; it also analyzes people's everyday behavior and assigns points accordingly. If you're caught jaywalking on camera, the algorithm automatically deducts points from your social credit account, though you can earn points back by doing positive things in society. It's an episode of Black Mirror made real. Although America is far from following China's path, the technology exists, and people fear its misuse.
One of the public's main concerns is Amazon's Rekognition, an image analysis service. When sent an image, the program can compare it against thousands of mugshots in its database. This seems efficient for police and other law enforcement, but it can be a nightmare for ordinary citizens, especially since the system is less accurate for people of color. Amazon has responded to this concern by saying that law enforcement should only act on an image match at 99 percent confidence, but it has imposed no rules on the service's use. Police departments have countered that they treat matches of 80 percent and above simply as leads for their detectives. The fact remains that although Amazon has acknowledged rules are needed, it has taken no action to implement them.
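The gap between Amazon's recommended 99 percent threshold and the 80 percent threshold some departments reportedly use can be illustrated with a short sketch. The match data and the filtering function below are hypothetical, a simplified stand-in for the similarity scores a service like Rekognition returns, not its actual API:

```python
# Hypothetical face-match results, loosely shaped like the similarity
# scores a face-matching service might return; all values are invented.
matches = [
    {"subject_id": "mugshot-1041", "similarity": 99.2},
    {"subject_id": "mugshot-2203", "similarity": 87.5},
    {"subject_id": "mugshot-3310", "similarity": 81.0},
]

def actionable_matches(matches, threshold):
    """Keep only matches at or above the given similarity threshold."""
    return [m for m in matches if m["similarity"] >= threshold]

# At Amazon's recommended 99 percent threshold, one candidate survives;
# at an 80 percent threshold, all three become "leads" for detectives.
high_confidence = actionable_matches(matches, 99.0)
lead_quality = actionable_matches(matches, 80.0)
print(len(high_confidence))  # 1
print(len(lead_quality))     # 3
```

The point of the sketch is that the threshold is purely a caller-side policy choice: nothing in the service itself stops a department from acting on the much looser 80 percent matches.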
Amazon has a strong interest in government contracts, as it is currently bidding for a Pentagon contract. However, others, such as ACLU senior legislative counsel Neema Singh Guliani, worry that Amazon cannot be trusted, since the company has not taken responsibility for its technology or enforced rules to prevent misuse. As new technologies such as facial recognition and artificial intelligence become ever more essential to our daily lives, it's important to understand both their potential and the dangers of their misuse.