The White House Office of the National Cyber Director is seeking public input on “liability” regimes for holding software companies accountable when they sell technology lacking proper cyber protections, National Cyber Director Harry Coker said Wednesday during ITI’s Intersect event. The Biden administration’s national cybersecurity strategy calls for new liability when software companies “rush insecure code to market,” Coker said. He said his office is working with academic and legal experts exploring “different liability regimes.” The strategy calls for minimizing compliance burdens on companies, so the office is working with other agencies to harmonize requirements through public feedback (see 2311030046). Coker said that in the coming weeks, his office will release a paper addressing memory safety and software measurability.
Commerce Secretary Gina Raimondo on Wednesday announced leaders of the AI Safety Institute. The National Institute of Standards and Technology established the AISI as directed by President Joe Biden in his AI executive order. Elizabeth Kelly, White House National Economic Council special assistant to the president for economic policy, will serve as AISI's director. Elham Tabassi, a senior research scientist at NIST, will serve as chief technology officer. They “will provide the direction and expertise we need to mitigate the risks that come with the development of this generation-defining technology, so that we can harness its potential,” Raimondo said. Raimondo is scheduled to announce members of the AI Safety Institute Consortium on Thursday. Composed of AI creators, AI users, academics, researchers and civil society organizations, AISIC will support “development and deployment of safe and trustworthy artificial intelligence,” the department said.
The U.K.’s new online speech law is one of the “most fundamental” tools allowing its telecommunications regulator to shed light on how platforms handle misinformation, Jessica Zucker, Office of Communications (Ofcom) director-online safety policy, said Monday. Speaking at a Silicon Flatirons event, Zucker said the U.K.'s Online Safety Act of 2023 is intended to help tackle online malfeasance at scale with systemic improvements across the internet. The law established a duty of care requiring companies to implement policies for removal of illegal content, like child sex abuse material, and legal but “harmful” content. Zucker stressed the law doesn’t require Ofcom to instruct tech companies to remove individual pieces of content or investigate individual complaints.
The FTC is “pleased” Amazon and iRobot abandoned their $1.4 billion deal, the agency said Wednesday in a statement (see 2401290044). The FTC’s investigation of the agreement focused on “Amazon’s ability and incentive to favor its own products and disfavor rivals’, and associated effects on innovation, entry barriers, and consumer privacy,” said Associate Director-Merger Analysis Nathan Soderstrom. “The Commission’s investigation revealed significant concerns about the transaction’s potential competitive effects.” The companies cited European enforcers’ opposition to the deal in their announcement.
The Texas Cable Association wants the 5th U.S. Circuit Court of Appeals to hold the FCC’s Nov. 20 digital discrimination order unlawful and to set it aside, said its petition for review (docket 24-60048), filed Tuesday and posted Thursday. It shares the docket number with the petition that the U.S. Chamber of Commerce also filed Tuesday along with the Texas Association of Business and the Longview, Texas, Chamber of Commerce (see 2401300053).
Minnesota should ban dark patterns from social media platform design, Attorney General Keith Ellison (D) recommended in a report released Thursday. Minnesota’s Legislature in 2023 directed Ellison to deliver the report by February. Examples of dark patterns include features that optimize the amount of time users spend on platforms, auto-load content and notifications meant to maximize engagement, said Ellison. He highlighted the impact on teens and adolescents: “I will continue to use all the tools at my disposal to prevent ruthless corporations from preying on our children.”
With the FCC’s Nov. 20 order adopting a definition of “digital discrimination of access” to broadband internet service published Jan. 22 in the Federal Register, the U.S. Chamber of Commerce, the Texas Association of Business and the Longview, Texas, Chamber of Commerce filed a petition for review Tuesday at the 5th U.S. Circuit Court of Appeals seeking to have the order vacated. Tuesday’s petition differs little from the "protective" petition the groups filed Jan. 19 “out of an abundance of caution” in case the 5th Circuit determined that the date of public notice was the date of public release rather than the date of FR publication (see 2401230004). The U.S. Chamber “supports expanded access of fast, affordable, and reliable internet” that “private sector innovation” drives, Neil Bradley, executive vice president-chief policy officer and head-strategic advocacy, said in a statement Tuesday. The FCC's new rule “will hinder efforts to bridge the digital divide -- hurting the very people it aims to help,” Bradley argued. It empowers the FCC to “micromanage the internet” and “subverts” the badly needed broadband investments included in the Infrastructure Investment and Jobs Act, he said.
The FTC is extending the deadline to decide on the video game industry’s request to use face-scanning technology to determine user ages, the agency announced Monday. The Entertainment Software Rating Board, Yoti and SuperAwesome filed an application in June seeking FTC approval for the age-estimation technology, which uses facial geometry to determine whether a user is an adult. In September, the agency extended the original October deadline to January, and now it's extending the deadline to March 29. The agency solicited public comment on the application in July, as required under the Children’s Online Privacy Protection Act.
Competition policies like those contemplated in South Korea unfairly target American tech companies and harm consumers, the U.S. Chamber of Commerce said in a statement Monday opposing Korea’s Online Platform Competition Promotion Act. The chamber asked Korean officials to provide the full text of the platform legislation and ample time to respond. Opponents say the bill is a European-style law that runs counter to international trade norms. Proposals like this “trample on competition that clearly benefits consumers, ignore good regulatory practices fundamental to sound regulatory models, and place governments in a position of violating their trade commitments by arbitrarily targeting foreign firms,” said Charles Freeman, senior vice president-Asia.
Congress should approve legislation protecting children on social media, Fairplay said Monday, launching an initiative led by families of children who have died in social media-related suicides and accidents. Fairplay and David’s Legacy Foundation launched Parents for Safe Online Spaces, a parent-led initiative whose goal is raising awareness of social media's dangers for children. Members plan to attend the Senate Judiciary Committee’s hearing Wednesday when the CEOs of Meta, X, TikTok, Discord and Snap are scheduled to testify (see 2311290072). The advocacy groups called for passage of the Kids Online Safety Act (see 2312040058). KOSA is a “needed corrective to social media platforms’ toxic business model, which relies on maximizing engagement by any means necessary, including sending kids down deadly rabbit holes and implementing features that make young people vulnerable to exploitation and abuse,” said Josh Golin, Fairplay's executive director.