Meta violated EU privacy law by enabling automated "data scraping" of personal information, an Irish Data Protection Commission (DPC) investigation found. The inquiry was launched in 2021 after media reports that a collated dataset of Facebook users' personal data had been found on the internet. The DPC examined the Facebook search, Facebook Messenger contact importer and Instagram contact importer tools, focusing on processing Meta carried out between May 2018 and September 2019. The main issue was whether the company complied with the EU general data protection regulation's requirement for data protection by design and default, said a Monday news release. The decision, backed by all other EU data protection supervisory authorities, requires Meta to bring its personal data processing into compliance and to pay a $275 million (265 million euro) fine. A Meta spokesperson stressed the DPC didn't say the incident constituted a personal data breach, hack or security failing. Meta is cooperating fully and "made changes to our systems during the time in question, including removing the ability to scrape our features in this way using phone numbers," he said. The company is "reviewing this decision carefully."
Soon-to-be Democratic “quadfectas” in Maryland, Michigan, Minnesota and Massachusetts -- where the governor, both legislative chambers and attorney general will be Democrats this January -- could boost privacy bills in those states, Husch Blackwell attorney David Stauss wrote Friday. Democrats in all four states proposed privacy bills in previous sessions. Massachusetts Gov.-elect Maura Healey (D) took on privacy issues when she was the state’s AG, Stauss said. AG support significantly boosted the Colorado and Connecticut privacy bills that became law, though Democrats’ quadruple control in Washington state and Republicans’ complete control in Florida haven’t resulted in privacy bills becoming law in those states despite several attempts in recent years, the lawyer said. Some expect more internet regulation by states after Democratic wins in this year's elections (see 2211230062).
Conspiring with Google to “hijack” consumers’ smartphones without their knowledge or consent isn't a tool the Massachusetts Department of Public Health (DPH) “may lawfully employ in its efforts to combat COVID-19,” alleged a Nov. 14 privacy complaint (docket 3:22-cv-11936) in U.S. District Court for Massachusetts in Springfield. “Such brazen disregard for civil liberties” violates the U.S. and Massachusetts constitutions, “and it must stop now,” said the complaint. The DPH developed a COVID-19 contact-tracing software app for Android devices using a Google application programming interface, it said. An initial version of the app was made available in April 2021, “but few Massachusetts residents voluntarily installed that version,” it said. To increase adoption, the DPH worked with Google starting in June 2021 to “secretly install” the app on more than a million Android devices in Massachusetts, it said. The app causes the device to constantly connect and exchange information with other nearby devices via Bluetooth and creates a record of such other connections, it said. “If a user opts in and reports being infected with COVID-19, an exposure notification is sent to other individuals on the infected user’s connection record,” it said. Even if a user does not opt into the notification system, the app “still causes the mobile device to broadcast and receive Bluetooth signals,” it said. “In sum, DPH installed spyware that deliberately tracks and records movement and personal contacts onto over a million mobile devices without their owners’ permission and awareness,” said the complaint. “On knowledge and belief, that spyware still exists on the overwhelming majority of the devices on which it was installed.” The complaint seeks an injunction barring continued installations of the app, plus an order requiring DPH to work with Google on uninstalling the app from existing devices. Google and DPH didn’t comment.
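The complaint’s description of the app follows the general pattern of Bluetooth exposure-notification systems: each device broadcasts rotating identifiers, records identifiers it hears nearby, and only when a user opts in to report an infection are that user’s identifiers published so other devices can check their local records. The sketch below is a hypothetical, simplified illustration of that flow; the class and method names are invented for the example and do not reflect Google’s exposure notification API or the DPH app’s actual implementation.

```python
import os
import hashlib
from datetime import datetime, timezone

class ExposureDevice:
    """One phone in the simplified model: it broadcasts rotating random
    identifiers and keeps a local record of identifiers heard nearby.
    (Hypothetical illustration only, not the actual DPH/Google code.)"""

    def __init__(self):
        self.own_ids = []   # identifiers this device has broadcast
        self.heard = []     # (identifier, timestamp) pairs seen from nearby devices

    def next_broadcast_id(self) -> str:
        # A rotating random identifier, not tied directly to the user's identity.
        rolling_id = hashlib.sha256(os.urandom(16)).hexdigest()[:16]
        self.own_ids.append(rolling_id)
        return rolling_id

    def record_nearby(self, rolling_id: str) -> None:
        # Called whenever a Bluetooth advertisement from another device is received;
        # this is the "connection record" the complaint describes.
        self.heard.append((rolling_id, datetime.now(timezone.utc)))

    def check_exposure(self, published_infected_ids: set[str]) -> bool:
        # A notification fires only if an ID in the local record matches one
        # voluntarily published by an infected user who opted in to report.
        return any(rid in published_infected_ids for rid, _ in self.heard)

# Usage: two devices exchange identifiers; one user later opts in to report.
alice, bob = ExposureDevice(), ExposureDevice()
bob.record_nearby(alice.next_broadcast_id())   # Bob's phone hears Alice's beacon
published = set(alice.own_ids)                 # Alice opts in and reports infection
print(bob.check_exposure(published))           # True: Bob is notified of exposure
```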
The Oct. 21 class action alleging Google, through Google Assistant, “surreptitiously” collects, uses and stores voiceprints of each individual who speaks to an enabled device, in violation of the Illinois Biometric Information Privacy Act (see 2210260071), was reassigned randomly to U.S. District Judge Edward Davila for Northern California in San Jose, said a clerk’s order Thursday (docket 5:22-cv-06398). Davila is presiding over the FTC’s lawsuit to block Meta’s Within Unlimited buy on antitrust grounds; on Tuesday, he rejected Meta’s attempt to disqualify FTC Chair Lina Khan from that proceeding over her alleged bias against the company (see 2211030068). Ryan Segal, the plaintiff in the Google class action, declined Thursday to have his case tried before a magistrate judge, said a court document.
The California Privacy Protection Agency sought comment on proposed rules to implement the 2020 California Privacy Rights Act. Comments are due Nov. 21 by 8 a.m. PST, CPPA said Thursday. The CPPA board approved the revised draft rules Saturday (see 2210310074).
The California Privacy Protection Agency board directed staff “to take all steps necessary to prepare and notice modifications to the text of the proposed regulatory amendments for an additional 15-day comment period” on rules to implement the 2020 California Privacy Rights Act, the CPPA tweeted Saturday. The board finished weighing proposed edits Saturday, after starting to consider them Friday. “It was readily apparent during the meeting that the Board wants the regulations finalized as soon as possible,” blogged Husch Blackwell attorney David Stauss. Rules could be finalized by the end of January, CPPA General Counsel Philip Laird said at Friday’s meeting (see 2210280055).
“Unbeknownst to users,” Google, through Google Assistant, “surreptitiously collects, uses, and stores" voiceprints of each individual who speaks to an enabled device, in violation of the Illinois Biometric Information Privacy Act, alleged a class action Friday (docket 5:22-cv-06398) in U.S. District Court for Northern California in San Jose. “Google does not disclose its biometric data collection to its users, nor does it ask users to acknowledge, let alone consent to, these practices,” said the complaint. Google’s misconduct “also was deceptive, unjust, and unlawful” because it deceived consumers “into providing valuable biometric information, which Google used for its own benefit, without consent or compensation, in violation of California law,” said the complaint. “Given Google’s ongoing deception and concealment of this practice,” plaintiff Ryan Segal believes “additional information supporting his claim will be revealed after a reasonable opportunity for discovery,” it said. Google didn’t comment Wednesday.
Academics make up the entire roster of panelists for the FTC’s Nov. 1 PrivacyCon, the agency announced with its final agenda Tuesday. FTC Chair Lina Khan and Chief Technology Officer Stephanie Nguyen will give opening remarks. Academic panels will follow from 9:25 a.m. to 5 p.m. on consumer surveillance, automated decision-making, children’s privacy, devices, augmented reality, dark patterns and adtech.
Consumer Watchdog raised concerns with recent changes to draft California privacy rules required by the 2020 California Privacy Rights Act. The California Privacy Protection Agency (CPPA) board is scheduled to weigh proposed rules at meetings Friday, Saturday and Nov. 4 (see 2210240068). The agency “needs to get these rules right and these last-minute proposals would weaken otherwise tough rules in favor of California privacy rights,” Consumer Watchdog’s Justin Kloczko said Tuesday. One concerning change removes a proposed requirement that a business display whether it processed a consumer’s opt-out, said the privacy advocate’s report. “But this simple notification will protect consumers from going through additional opt-out steps if they are unsure their rights have been honored” and will help them “flag websites for enforcement by the CPPA if those rights are not honored.” The consumer group is also concerned about the board’s proposal to delete a requirement that businesses identify third parties that collect personal information. “Consumers deserve to know who exactly will be handling their personal information when exercising their rights,” Kloczko said.
The California Privacy Protection Agency board adjusted its schedule for considering draft changes to state privacy rules required by the 2020 California Privacy Rights Act. The CPPA said Monday it will meet Nov. 4 at noon PDT. The agency also plans to meet this Friday and Saturday. It had planned to meet last Friday and Saturday, but that meeting was canceled.