Big Data Exacerbates Traditional Civil Rights Issues, New America Foundation Hears
Policy and legal questions can obscure the underlying civil rights issues that big data exacerbates, experts said at a New America Foundation event. “Whether we use the language of big data or privacy or civil rights, the questions are the same,” said Corinne Yu, managing policy director of the Leadership Conference on Civil and Human Rights. “We're talking about criminal justice, we're talking about jobs, we're talking about financial inclusion, we're talking about what kind of society we want to be in.” Big data has accelerated established discriminatory tactics such as profiling -- whether by police officers, the government or retailers, said American Civil Liberties Union Legislative Director Chris Calabrese. The intersection of big data analytics and such discriminatory practices “is a deep and fertile terrain,” he said.
In late February, the Leadership Conference joined 13 other civil society groups to issue “Civil Rights Principles for the Era of Big Data” (http://bit.ly/1kcGFrs). Five of the groups were represented on Friday’s panel. The principles call for eliminating discriminatory profiling, ensuring that automated decisions are accurate, and empowering individuals to access and correct information collected about them.
The principles refer to modern-day profiling as “high-tech profiling” -- or profiling “on steroids,” said Yu. “High-tech profiling is done in a way it’s never been done before,” Yu said. “Now it’s cheap because of technology.” Government authorities have set up cameras in mosque parking lots to record the license plates of all cars parked there and then track those cars, said Yu. Chicago police maintain a 400-person “heat list” based on people’s affiliations -- perhaps their Facebook friends -- and other factors unrelated to their criminal history, said Calabrese. “If you are on that heat list, the police may come to your door and say, ‘We're watching you,'” he said. “A really good predictor if someone is going to be arrested for a crime is if the police are watching them.”
Government verification programs -- automated and reliant on large databases -- also put minority groups at a disadvantage, said Jason Lagria, senior staff attorney with Asian Americans Advancing Justice. Look at the E-Verify system, which checks people’s authorization to work in the U.S., he said. Naturalized citizens are 30 times more likely than natural-born citizens to get a false positive indicating they can’t work in the U.S., said Lagria. Minority names are more likely to be confused, reversed or misspelled in the database, which can also lag in being updated, he said. Thirteen percent of people who receive false positives spend more than $100 trying to fix the error -- a significant sum for low- or no-income individuals, Calabrese said.
Difficulties in correcting mistakes implicate the constitutional right to due process, said Calabrese. A man inadvertently placed on the no-fly list -- possibly due in part to his religion, Islam -- was offered a chance to be removed from the list if he became an FBI informant (http://bit.ly/Oqz2Qq), said Calabrese. That case is pending before the U.S. District Court in Portland, Oregon, according to the ACLU (http://bit.ly/1fG1Qzg). Without a clear right to dispute conclusions the government has made about a person, “you can see how these pernicious practices undermine our Constitution,” he said.
Correction also requires control, said Hazeen Ashby, legislative director for research and policy at the National Urban League. Outside of the Fair Credit Reporting Act, which lets people access and correct information in their credit reports, individuals have little control over their data, she said. “I want that right to see that data and see what attributes are being ascribed to me.”
What happens when the problem isn’t that the data are inaccurate? asked Calabrese. He returned to Chicago’s “heat list.” Perhaps statistically, if one has three friends on Facebook involved in shooting incidents, one is more likely to engage in violent activity, he said. “Maybe that’s an accurate result,” he said. “Is that a fair one, though?” From a retailer’s perspective, Target might be able to use people’s income level to target them with coupons of varying dollar amounts, said Calabrese. “That a computer algorithm spit it out may give it a gloss of fairness,” he said. Is it actually fair? “That is a very difficult question we have to grapple with,” said Calabrese.