The EU on Tuesday named the Big Tech firms subject to stricter rules under the Digital Services Act (DSA), and the U.K. government floated legislation aimed at cracking down on their market dominance in Britain. The DSA governs providers of intermediary services such as social media; online marketplaces; very large online platforms (those with at least 45 million monthly active users in the EU); and very large search engines. Designated very large platforms and search engines must offer users a content recommendation system that's not based on profiling, and must analyze the systemic risks they create for the dissemination of illegal content or harmful effects on fundamental rights (see 2210040001). The list of 17 very large online platforms includes Amazon Store, Google (Play, Maps and Shopping), Facebook, Instagram, Twitter, Wikipedia and YouTube. The two very large online search engines are Bing and Google Search. All have four months to comply with the DSA. Meanwhile, the U.K. government introduced a bill Tuesday establishing new powers to boost competition in "digital markets currently dominated by a small number of firms," clamp down on subscription traps to make it easier for consumers to opt out, and tackle fake reviews that cheat consumers via bogus ratings. The measure would give a Digital Markets Unit (DMU) within the Competition and Markets Authority new powers to go after large tech companies whose market dominance has "stifled innovation and growth across the economy, holding back start-ups and small firms from accessing markets and consumers." The DMU could set tailored rules for businesses deemed to have strategic market status in key digital areas, with the biggest firms potentially required to give customers more choice and transparency. Noncompliance could mean fines of up to 10% of a company's global revenue. The measure needs parliamentary approval.
Enforcers are committed to protecting consumers against bias and discrimination in artificial intelligence and automated systems, FTC Chair Lina Khan said in a joint statement Tuesday with Consumer Financial Protection Bureau Director Rohit Chopra, DOJ Civil Rights Division Chief Kristen Clarke and Equal Employment Opportunity Commission Chair Charlotte Burrows. “Private and public entities use these systems to make critical decisions that impact individuals’ rights and opportunities, including fair and equal access to a job, housing, credit opportunities, and other goods and services,” they said. “Although many of these tools offer the promise of advancement, their use also has the potential to perpetuate unlawful bias, automate unlawful discrimination, and produce other harmful outcomes.” Enforcers will “vigorously use” the agencies’ “collective authorities” to protect individuals’ rights, they said.
NTIA is seeking public comment on how policies should be designed to ensure artificial intelligence technology can be trusted, the agency announced Tuesday. The public has until June 10 to comment. A request for comment, scheduled for Federal Register publication Thursday, said the agency is focused on “self-regulatory, regulatory, and other measures and policies that are designed to provide reliable evidence to external stakeholders – that is, to provide assurance – that AI systems are legal, effective, ethical, safe, and otherwise trustworthy.” NTIA will deliver a report on AI “accountability policy development.”
Small businesses support the Biden administration’s “activist” antitrust enforcement approach at the FTC and DOJ, FTC Commissioner Alvaro Bedoya said Monday at a University of Utah antitrust event. Bedoya addressed claims the FTC’s approach under Chair Lina Khan has been “bad for business.” Bedoya said he has met with small-business leaders in agriculture, pharmaceuticals and consumer goods in red states like West Virginia, Iowa, South Dakota and Utah. “They’re not saying, ‘Please enforce less because it hurts us,’” he said. “They’re saying, ‘What took you so long?’ They’re saying, ‘We don’t have a level playing field. And for us to do our communities right, we need more enforcement.’” They’re in favor of this “radical idea” that the law should be enforced rigorously, said Bedoya. Facts and law dictate when enforcers bring antitrust cases, DOJ Antitrust Division Chief Jonathan Kanter said on a separate panel. Enforcers need to recognize that in tech markets, competition might not come in traditional forms like brick-and-mortar rivals but instead from a disruptive technology or service that “disintermediates” a platform, he said. When a merger substantially lessens competition, DOJ is mandated to take action under Section 7 of the Clayton Act, he said. DOJ has to enforce the law as written, said Deputy Assistant Attorney General-Antitrust Manish Kumar. Whether it’s good or bad for business, DOJ reserves authority under Sherman Act Section 1 for prosecution of the “most egregious” antitrust violations where the conduct is “irredeemable,” he said: “I don’t think any reasonable person can argue that engaging in this type of conduct is somehow good for business. I think it’s quite the opposite.” He and Kanter noted four district courts have sided with the antitrust division in motions to dismiss under this administration. Restraining a company’s monopoly power is never a “bad thing,” said University of Utah economics instructor Hal Singer.
Arizona Attorney General Kris Mayes (D) banned TikTok Wednesday on all devices owned by her office due to security risks. “We cannot risk the potential exposure of our data to foreign entities,” she said. “Banning TikTok on state-owned devices is a necessary measure to protect our operations, and I urge other state agencies to take the same proactive steps to safeguard their data.” She noted TikTok CEO Shou Zi Chew couldn’t “definitively state that the Chinese government cannot access data collected from U.S. users” when testifying before Congress (see 2303290048).
Proposed EU rules to fight child sexual abuse need changes, tech organizations said Tuesday. The organizations "are deeply committed to making the digital space safe for everyone, and, in particular, to protecting children online," but some provisions of the regulation need "further reflection" to achieve their objective, said groups including the Computer & Communications Industry Association Europe, ACT|The App Association, Cloud Infrastructure Services Providers in Europe and the Information Technology Industry Council. They recommended the proposal's scope be narrowed to focus on the service providers best placed to take effective mitigation and enforcement actions, such as providers that present a high risk of online child abuse. Risk mitigation efforts should be broader and include voluntary measures that the industry carries out proactively, they said. The tech sector has been active in defining child safety online, and under the current voluntary system developed technology to help prevent, detect, report and remove the increasing amount of child sexual abuse worldwide, the groups said. Providers are also concerned there's no operational plan to transition from the current ePrivacy law, which allows voluntary scanning, to the proposed regulation, which would allow scanning only with a detection order that could be issued only after a long process of checks and balances. They also recommended the regulation explicitly protect encryption: By requiring service providers that employ end-to-end encryption to filter and scan for child sexual abuse material and grooming, the measure risks weakening or breaking encryption. The legislation calls for the creation of an EU Center, but the organizations noted there's already a framework for reporting child sexual abuse. Since this is a global issue, they added, there should be more cooperation with existing entities, and the role of the EU body in the system should be clarified.
President Joe Biden signed an executive order barring federal agencies from using commercial spyware that “poses risks to national security” or has been used by foreign actors to enable human rights abuse, the White House announced Monday. The EO applies to spyware tools created by foreign and domestic companies, the White House said. The EO sets out a list of factors for determining risk, including whether the entity has used the spyware to gain access to U.S. government devices, whether it has misused data, and whether espionage concerns are present.
The FTC issued a proposed rule Thursday requiring companies to “make it as easy to cancel” subscriptions as “it was to sign up.” The proposed “click to cancel provision” would help consumers avoid the “seemingly never-ending struggles to cancel unwanted subscription payment plans for everything from cosmetics to newspapers to gym memberships,” the agency said. The commission voted 3-1 along party lines to issue a notice of proposed rulemaking that would modify the FTC’s Negative Option Rule. That rule regulates how companies may interpret a consumer’s failure to act as the acceptance of an offer. States have been moving forward with laws targeting subscription renewal policies due to the rise in streaming and online services (see 2112200060). “Some businesses too often trick consumers into paying for subscriptions they no longer want or didn’t sign up for in the first place,” said FTC Chair Lina Khan. “The proposal would save consumers time and money, and businesses that continued to use subscription tricks and traps would be subject to stiff penalties.” The agency will gather public comment once the notice is published in the Federal Register. The proposal calls for companies to have a “simple cancellation mechanism.” A consumer would be able to cancel on the same website where they subscribed, in the same number of steps, the FTC said. Companies would have to ask consumers for consent before making additional offers when a user tries to cancel, the agency said: “In other words, a seller must take ‘no’ for an answer and upon hearing ‘no’ must immediately implement the cancellation process.” The rule also contemplates requiring companies to provide annual reminders “to consumers enrolled in negative option programs involving anything other than physical goods, before they are automatically renewed.” The proposed changes would apply to “all forms of negative option marketing in all media,” including phone, internet, print media and in-person transactions.
Dissenting FTC Commissioner Christine Wilson said the proposal attempts an “end-run around” the Supreme Court’s decision in AMG (see 2104260065) to “confer de novo redress and civil penalty authority on the Commission for Section 5 violations unrelated to deceptive or unfair negative option practices.” Manipulative tactics involving subscription services have gotten worse over time, Khan said in a joint statement with Commissioners Rebecca Kelly Slaughter and Alvaro Bedoya: The rule creates a “powerful deterrent by introducing the risk of civil penalties.”
A new survey by Atlas VPN found 42% of respondents in 17 countries believe AI will replace jobs in their field of work. “AI models like ChatGPT can already write articles on specific topics or help copywriters write text for a company website,” said a Wednesday release: “Other AI tools help automate tasks, collect and analyze data, create graphic designs, or handle basic customer queries.” Almost as many respondents, 39%, disagreed, saying they believe AI can't replace them. The survey found 67% of respondents remain optimistic about “the benefits AI can bring to society” and 60% are excited about AI technology. Conducted in September and October, the survey had 17,193 respondents.
The FTC requested public comment on cloud computing business practices and their impact on competition and data security. The agency’s request for information is open for comment until May 22. The agency is seeking input about economic reliance on a “small handful of cloud service providers,” cloud customers’ ability to negotiate, competition for storage services, artificial intelligence products and data security disclosure practices. “The RFI is aimed at better understanding the impact of this reliance, the broader competitive dynamics in cloud computing, and potential security risks in the use of cloud,” said FTC Chief Technology Officer Stephanie Nguyen Wednesday.