Zoom “falsely and repeatedly” claimed it used end-to-end encryption to protect users’ videoconferencing communications, Consumer Watchdog alleged in a lawsuit announced Tuesday. Zoom made false promises throughout its website even though the company has always had access to data on the platform, the lawsuit said. Filed in D.C. Superior Court, the lawsuit alleges violations of the District of Columbia Consumer Protection Procedures Act. "We take privacy and security extremely seriously and are committed to continuous enhancements, including the timely beta testing and implementation of end-to-end encryption," a company spokesperson said.
U.S. and EU officials started talks for a potential “enhanced EU-U.S. Privacy Shield framework,” Commerce Secretary Wilbur Ross and European Commissioner for Justice Didier Reynders announced Monday (see 2007240031): “The European Union and the United States recognize the vital importance of data protection and the significance of cross-border data transfers to our citizens and economies.” The two sides are committed to privacy and rule of law and have collaborated for decades, they said.
The FTC should investigate whether the digital ad industry is improperly tracking consumers’ phones, computers and TVs, a bipartisan, bicameral group wrote the agency Friday with support from FCC Commissioner Geoffrey Starks: Industry groups are putting consumer data up for auction with real-time bids for the right to curate digital ads. Hundreds of participants collect sensitive information like location, gender, race and age without obtaining consent, they wrote. Sens. Ron Wyden, D-Ore.; Bill Cassidy, R-La.; Maria Cantwell, D-Wash.; Sherrod Brown, D-Ohio; Elizabeth Warren, D-Mass.; and Ed Markey, D-Mass., signed the letter with Reps. Anna Eshoo, Zoe Lofgren and Ro Khanna, all D-Calif., and Yvette Clarke, D-N.Y. Starks issued a statement with Clarke, saying communities of color and protesters are at particular risk: The industry has amassed and is exploiting “massive dossiers on Americans” regarding where “they exercise their rights to worship and protest.” The FTC confirmed Friday it received the legislators' letter.
U.S. and European regulators should “swiftly begin negotiations” to replace the Privacy Shield (see 2007240031) and allow an enforcement moratorium, 17 trade associations wrote officials Thursday. The Information Technology Industry Council, ACT|The App Association, Computer & Communications Industry Association, U.S. Chamber of Commerce and Software & Information Industry Association signed the letter to European Justice Commissioner Didier Reynders, U.S. Commerce Secretary Wilbur Ross and European Data Protection Board Chair Andrea Jelinek. Negotiators should begin work immediately on a “solid legal framework to avoid trade disruptions to EU-U.S. data flows,” they wrote. EU data protection authorities should provide guidance for companies that relied on Privacy Shield and should allow a “reasonable enforcement moratorium,” they wrote. A Commerce Department spokesperson cited a previous statement from Ross. The European Data Protection Board didn’t comment.
TikTok will disclose algorithms and content moderation policies in real time, CEO Kevin Mayer blogged Wednesday. This is meant to “drive deeper conversations around algorithms, transparency, and content moderation, and to develop stricter rules of the road,” he wrote. He encouraged other tech companies to do the same: “All companies should disclose their algorithms, moderation policies, and data flows to regulators. We will not wait for regulation to come.”
Decisions by Amazon, IBM and Microsoft to suspend facial recognition deployment for police (see 2006110059) highlight the “lack of federal guardrails,” FTC Commissioner Christine Wilson tweeted Friday. She asked when Congress will act on facial recognition and privacy measures: “The same lack of guardrails applies to a much broader swath of tech impacting #privacy (and civil liberties). When will Congress act?”
Microsoft won’t sell facial recognition technology to U.S. police departments until a national law is in place, President Brad Smith said Thursday, following the lead of IBM and Amazon. IBM “no longer offers general purpose IBM facial recognition or analysis software,” CEO Arvind Krishna wrote Congress Monday, “outlining detailed policy proposals to advance racial equality.” Amazon implemented a “one-year moratorium on police use of Amazon’s facial recognition technology” Wednesday, though it will continue allowing use by organizations like Thorn, the International Center for Missing and Exploited Children and Marinus Analytics. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested,” Amazon said. Sen. Ed Markey, D-Mass., welcomed a “pause” on police use of the technology: “What Amazon should really do is a complete about-face and get out of the business of dangerous surveillance altogether.” It took two years, but the American Civil Liberties Union is “glad the company is finally recognizing the dangers face recognition poses to Black and Brown communities and civil rights more broadly,” said ACLU Northern California Technology and Civil Liberties Director Nicole Ozer. The group's Civil Liberties Attorney Matt Cagle urged Microsoft to halt “its current efforts to advance legislation that would legitimize and expand the police use of facial recognition in multiple states.” Electronic Frontier Foundation Policy Analyst Matthew Guariglia called Microsoft’s decision a good step, saying the company “must permanently end its sale of this dangerous technology to police departments.”
California might start enforcing its privacy law three months before final regulations from Attorney General Xavier Becerra (D) take effect, said privacy attorney Christina Gagnier on a Carlton Fields webinar Thursday. The AG hasn't announced timing for California Consumer Privacy Act rules, but “it’s been communicated that the regulations might not be out until October,” even though Becerra hasn’t budged on starting enforcement July 1, she said. COVID-19 has moved many things back, but it has also brought “a heightened awareness of privacy,” Gagnier said. “The AG’s office is basically balancing those two things.” The final rules probably won't deviate much from proposed regulations as revised a few months ago (see 2004020043), unless the legislature this summer passes major changes like what’s proposed in AB-3119 by Assemblymember Buffy Wicks (D), the lawyer said. Wiley heard the same, attorney Joan Stewart emailed us. "While the AG hasn’t provided guidance yet on how enforcement would work in a world without implementing regulations -- we anticipate that initially enforcement could be focused on the requirements of the statute, rather than compliance specifics tied to the regulations." Expect the AG to "go after businesses that have made no effort to comply rather than businesses that have made a good faith effort but fell short." The International Association of Privacy Professionals blogged Monday about the possible delay to CCPA rules. "For regulations to become effective July 1, they must be filed with the Office of Administrative Law by May 31," but they haven't been submitted, IAPP said. If the AG doesn't meet that deadline, "their effective date will likely slip until Oct. 1." Becerra is "committed to enforcing the law starting July 1," a spokesperson emailed. "We encourage businesses to be particularly mindful of data security in this time of emergency."
House Commerce Committee Republicans questioned Thursday whether TikTok violates children’s privacy law and whether it shares information with the Chinese government. Congress has “significant concerns” over company data practices, including whether it “continues to violate the Children’s Online Privacy Protection Act and whether TikTok shares American users’ information with the Chinese Communist Party,” wrote ranking member Greg Walden, R-Ore., and House Consumer Protection Subcommittee ranking member Cathy McMorris Rodgers, R-Wash. "While we think the concerns are unfounded, we appreciate them and continue to further strengthen our safeguards while increasing our dialogue with lawmakers," the company emailed. "We've received the letter and will respond as we look forward to bringing greater clarity to our policies, practices, and operations."
A Canadian company falsely claimed its smart locks were secure and failed to follow industry best practices to protect data, the FTC alleged Wednesday in a settlement. Contrary to representations to consumers, Tapplock “failed to take reasonable precautions or follow industry best practices to protect the consumer data it collected through its app,” the agency said. With 5-0 commissioner approval, the settlement requires the company to “implement a comprehensive security program and obtain independent biennial assessments.”