Congress doesn’t have to wait to apply different protective thresholds to sensitive information in light of the Supreme Court’s Carpenter decision (see 2005130056), said Privacy and Civil Liberties Oversight Board Chairman Adam Klein Wednesday during a PCLOB virtual forum. Carpenter held that the government’s collection of at least seven days of cell-site location information is a Fourth Amendment-protected search, meaning police must obtain warrants. Klein agreed with Elizabeth Goitein, co-director of the Brennan Center for Justice’s Liberty & National Security Program, who said Carpenter indicates Congress can apply that standard to other information collected under Section 215 of the Patriot Act. For instance, authorities can collect web browsing data without a warrant using Section 215; that’s one area Congress could legislate, Goitein said. The Senate recently rejected such a proposal (see 2005130056). PCLOB members Edward Felten, Jane Nitze, Travis LeBlanc and Aditya Bamzai asked how the board can better fulfill its mission and what authorities it should be scrutinizing. It could explore Section 215 to determine what intelligence value it's yielding, said University of Texas School of Law Associate Dean-Academic Affairs Robert Chesney. Goitein agreed Section 215 needs attention but disagreed the PCLOB should weigh intelligence value: The board’s mission is privacy and civil liberties, so any Section 215 review should focus there, she added. Nitze asked whether the existence of the Foreign Intelligence Surveillance Act court lessens the incentive for congressional oversight of intelligence agencies. Goitein agreed but said removing the FISA court isn’t the answer. Bamzai asked what existing proposals the board should analyze. Georgetown University Law Center visiting professor Mary McCord backed the Senate-passed proposal requiring FISA court judges to appoint an amicus curiae in certain cases.
The House requested conference on FISA reauthorization (see 2005280023). Davis Polk's Kenneth Wainstein warned against allowing amici to be too involved in court proceedings, saying it could impact national security. He recommended Congress pass the reauthorization and then consider a separate FISA revamp package based on DOJ inspector general findings detailing abuse (see 2003310068).
Give voice providers more call blocking authority and more liability protection, industry asked the FCC in comments posted through Monday in docket 20-93. Don't limit a safe harbor to one-ring scams because doing so wouldn't "provide the certainty against all illegal robocalls as new scams arise," USTelecom said. "T-Mobile and other carriers will be hesitant to take advantage of opt-out call blocking without a safe harbor," it said. The safe harbor should also "protect providers from liability due to inadvertent mislabeling or misidentification of a call’s level of trust," said CTIA. Avoid "prescriptive requirements governing how providers communicate with their subscribers -- for instance, call labeling requirements or a requirement to notify subscribers dialing international toll-generating numbers of the cost before connecting the call, the latter of which could be a complex and expensive undertaking," said NCTA. Incompas said a new rule for international gateway providers to verify "the nature and purpose of foreign originators would be unnecessary and overly burdensome." AT&T said the FCC should require, as "baseline best practices for robocall mitigation," that providers have "the capability to monitor traffic patterns that would flag suspect calling campaigns, and then robust application of the provider’s terms of service to eliminate problem customers."
The FCC Hospital Robocall Protection Group will introduce its committee chair and vice chair and establish working groups when the advisory group holds its first meeting, virtually, July 27 at 10 a.m. EDT, says Monday's Federal Register. HRPG stems from the Traced Act (see 2003250054).
Forty percent of U.S. broadband households were sheltering in place in May, even without mandates, said a Parks Associates survey fielded May 14-28. Increased demand on home networks resulting from sheltering in place raised the importance of data and privacy protections, said Parks Thursday. “Online attacks to the home network can now disrupt work and education along with entertainment and shopping activities,” said analyst Brad Russell. Nearly 80% of those surveyed said they're concerned about the possibility of a data security or privacy breach, he said.
Defund “police surveillance technology” used to “spy” on minorities and protesters, more than 100 advocacy groups wrote House leaders Wednesday. The American Civil Liberties Union, Color of Change, Free Press and Center for Democracy & Technology signed, urging barring federal funding of “unwarranted mass-surveillance programs” like those enabled by the Patriot Act: “Congress has failed to take sufficient action to prevent increased surveillance.”
Voxx hired a banker to help evaluate “strategic alternatives” for EyeLock, its iris-authentication subsidiary, said Voxx CEO Pat Lavelle on a fiscal Q4 call Tuesday. “This could be a spinoff, a financing partner, a joint venture or an outright sale.” The segment never was profitable and generated sales of $100,000 for the year. COVID-19 has created much “inbound interest” in EyeLock, said Lavelle. “With everyone wearing masks and gloves, iris is quickly becoming the preferred choice for authentication.” The result is “renewed interest in EyeLock’s technology and in the company,” he said. Competing facial-recognition technology, he said, is facing “additional backlash, given the events of recent weeks” (see 2006110059). It’s “challenging to forecast” the consumer tech business during normal times, and with COVID-19 “that remains even more so,” Lavelle said. The company is “anticipating a slow start” to fiscal 2021, expecting growth to recover in the year’s second half, he said. Q4 ended Feb. 29.
Apple and Google should bar third-party contact tracing apps from using data for targeted advertising, New York Attorney General Letitia James (D) wrote the companies Monday (see 2004170060). She also recommended the apps be prohibited from using data to identify anonymous users and be required to delete data “on a rolling, 14-day basis.” The measures will help protect consumer data and ensure appropriate collection, she said. The companies didn’t comment.
Decisions by Amazon, IBM and Microsoft to suspend facial recognition deployment for police (see 2006110059) highlight the “lack of federal guardrails,” FTC Commissioner Christine Wilson tweeted Friday. She asked when Congress will act on facial recognition and privacy measures: “The same lack of guardrails applies to a much broader swath of tech impacting #privacy (and civil liberties). When will Congress act?”
Microsoft won’t sell facial recognition technology to U.S. police departments until a national law is in place, President Brad Smith said Thursday, following the lead of IBM and Amazon. IBM “no longer offers general purpose IBM facial recognition or analysis software,” CEO Arvind Krishna wrote Congress Monday, “outlining detailed policy proposals to advance racial equality.” Amazon implemented a “one-year moratorium on police use of Amazon’s facial recognition technology” Wednesday, though it will continue allowing use from organizations like Thorn, the International Center for Missing and Exploited Children and Marinus Analytics. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” Amazon said. Sen. Ed Markey, D-Mass., welcomed a “pause” on police use of the technology: “What Amazon should really do is a complete about-face and get out of the business of dangerous surveillance altogether.” It took two years, but the American Civil Liberties Union is “glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly,” said ACLU Northern California Technology and Civil Liberties Director Nicole Ozer. The group's Civil Liberties Attorney Matt Cagle urged Microsoft to halt “its current efforts to advance legislation that would legitimize and expand the police use of facial recognition in multiple states.” Electronic Frontier Foundation Policy Analyst Matthew Guariglia called Microsoft’s decision a good step, saying it “must permanently end its sale of this dangerous technology to police departments.”
COVID-19 response technology must be “non-discriminatory, effective, voluntary, secure, accountable, and used exclusively for public health purposes,” more than 80 advocacy groups said Thursday. The Leadership Conference on Civil and Human Rights, Lawyers’ Committee for Civil Rights Under Law and New America’s Open Technology Institute signed a set of principles to “guide employers, policymakers, businesses, and public health authorities” while reopening society. Decision-makers should “be mindful of the risks of overreach and unintended consequences, especially to marginalized communities already suffering disproportionately from the virus and economic hardships,” the groups wrote.