File Name | S22-Policy-Stoa-11-AFF-AlgoAccountability.docx
File Size | 251.11 KB
Date Added | September 27, 2021
Category | Policy (Stoa)
Author | David W. Helton
Resolved: The United States federal government should substantially reform the use of Artificial Intelligence technology.
Case summary: Facebook, Amazon, and Google are some of the most well-known examples of companies that use artificial intelligence (AI) algorithms to hire workers, target ads, and mine data. Other companies and industries also use AI algorithms to make decisions: banks use them to detect fraudulent transfers, social media platforms use them to filter inappropriate or unwanted content, and hospitals use them to analyze data and diagnose illnesses. But what happens when Amazon’s hiring algorithm discriminates against women? Or when Facebook uses a housing-ad algorithm that discriminates against minorities and veterans? Sure, you can sue the company, but that’s expensive and usually ineffective.

That’s where H.R. 2231, the Algorithmic Accountability Act of 2019, comes in. The act targets large companies that hold many people’s data: any company with more than $50 million in revenue, or with data on more than one million people, must conduct mandatory impact assessments of its “high-risk” automated decision systems, which include machine learning, data processing, and other AI systems. These impact assessments evaluate the algorithms’ accuracy, fairness, bias, privacy, and security, including how personal data is used, stored, and protected. Keep in mind that these assessments may be carried out by third-party auditors, meaning companies can hire algorithmic auditing firms to conduct them.

With more and more companies seeking to use AI in hiring and in data processing, there must be a way to ensure AI algorithms are used properly and without bias. Some companies currently audit themselves without federal intervention, but the majority are more concerned about profits than about bias. We need a way to hold companies accountable, and the best way to do that is to pass the Algorithmic Accountability Act.
Section 3(d)(2)(B) of H.R. 2231 specifies that enforcement is handled by the Federal Trade Commission under the existing rules contained in 15 U.S. Code § 41.