Three Democratic lawmakers on Wednesday introduced a bill in Congress to bring government oversight to the algorithms companies use for a wide variety of purposes.
The bill, called the Algorithmic Accountability Act of 2019, would empower the Federal Trade Commission to scrutinize the use of consumer-facing “automated decision systems” in an effort to identify bias, privacy and security risks.
The bill comes after some technology firms faced questioning over their use of algorithm-based systems that decide which users see content such as housing and job advertisements.
Facebook has faced particular scrutiny in recent months. The U.S. Department of Housing and Urban Development sued the social media giant over claims that the company allowed ads to be targeted to users by age, race and gender. A recent study also found that Facebook’s system steered certain housing and job ads toward particular groups even when advertisers had not asked it to.
Last year, Reuters reported on an automated hiring system at Amazon that had a glaring flaw: it favored male applicants over female ones.
The bill, introduced by Sen. Ron Wyden, D-Ore., Sen. Cory Booker, D-N.J., and Rep. Yvette D. Clarke, D-N.Y., adds to growing political attention directed at algorithmic bias.
In May 2018, New York City Mayor Bill de Blasio trumpeted the city’s first algorithm watchdog group, assembling a panel of experts to examine how computer programs are used in municipal decision-making, particularly law enforcement. The Automated Decision Systems Task Force is believed to be the first such group in the nation.
Andrew Ferguson, a law professor at the University of the District of Columbia and the author of “The Rise of Big Data Policing,” told NBC News that a federal bill like this is “overdue.”
“This type of federal effort will be critical for maintaining control over companies using personal data for commercial gain,” Ferguson said. “Equally importantly, I hope the bill sparks a national conversation about how to best set up local, state and additional federal laws to police algorithmic decision-making. Underneath the data is democracy, and our democratic leaders need to act.”
The new federal legislation would empower the FTC to establish its own rules mandating “impact assessments,” which the bill defines as studies that scrutinize an automated system’s “design and training data” and weigh fairness, bias, discrimination and other considerations.

In a statement, Booker said that his parents were subjected to “real estate steering” when they attempted to buy a house in his home state 50 years ago.
“However, the discrimination that my family faced in 1969 can be significantly harder to detect in 2019: houses that you never know are for sale, job opportunities that never present themselves, and financing that you never become aware of — all due to biased algorithms,” he said. “This bill requires companies to regularly evaluate their tools for accuracy, fairness, bias and discrimination. It’s a key step toward ensuring more accountability from the entities using software to make decisions that can change lives.”
If it is enacted into law, not all companies would be equally affected: the bill would cover only companies that are already under the FTC’s purview and that make more than $50 million annually. The bill has support from the Center on Privacy and Technology at Georgetown Law, among other advocacy groups.
“Algorithms control various aspects of a digital economy,” Francella Ochillo, the general counsel for the National Hispanic Media Coalition, said. “They determine which candidates will be interviewed and how much they will be paid; who will be targeted for or excluded from advertisements; and how much consumers will pay for goods and shipping online.”
The bill is expected to first be heard by the Senate Commerce Committee in the coming months.