This is where all downloads will be listed, using the Page Add plugin.
File Name | S22-Policy-Stoa-19-AFF-FacialRecognition.docx |
File Size | 81.77 KB |
Date Added | October 25, 2021 |
Category | Policy (Stoa) |
Author | David W. Helton |
Resolved: The United States federal government should substantially reform the use of Artificial Intelligence technology.
Starting in the early 2000s, US law enforcement agencies began using Facial Recognition Technology (FRT) to identify suspects. The trouble is that there are few restrictions on how federal agencies like the IRS or FBI use FRT: no warrants are required, and FRT systems are not tested for bias. As a result, federal agencies can use FRT to identify anyone, anywhere, with very little accountability, and when the FRT they use is biased, they can misidentify individuals, which can lead to false arrests. S.2878 solves these problems by requiring warrants and NIST testing of FRT systems. NIST has extensive experience testing FRT, and its tests are effective at detecting bias. In addition, law enforcement agencies are already familiar with the warrant process, so they should have little trouble applying warrants to FRT searches.