Export Compliance Daily is a Warren News publication.

Complying With EU's New AI Act Will Be 'Serious Undertaking,' Lawyers Say

Upcoming European rules governing artificial intelligence products will have significant supply chain implications for developers and suppliers, lawyers said last week. They also suggested one provision in the law could lead to a surge in EU AI imports before the restrictions take effect.


The AI regulations, approved by Parliament last month (see 2306140038), will now undergo negotiations between the Council of the European Union and the European Parliament, and could place new due diligence requirements on importers and exporters, along with a host of others with ties to the industry (see 2305160001). Simmons & Simmons lawyer William Dunning said the law is “likely” to impose an “outright” ban on certain AI systems that pose an “unacceptable risk,” including systems used for biometric identification, biometric categorization, emotion recognition, social scoring and “subliminal manipulation.”

But the more challenging compliance hurdles will surround certain “high-risk” AI systems that won’t be banned but will instead face “detailed regulatory requirements.” Those could include AI systems used in a host of applications, including medical devices and machinery, biometric identification, critical infrastructure projects and law enforcement.

“If you're a provider of a high-risk AI system,” Dunning said during a webinar hosted by the law firm last week, “complying with these obligations is going to be a serious undertaking.”

Minesh Tanna, also a lawyer with Simmons & Simmons, said the European Parliament recently revised the rule to extend compliance requirements to supply chain actors that weren’t originally covered by the restrictions. The rule now could set restrictions on “parties that supply services or processes or components that are integrated into high-risk AI systems,” such as hardware developers and graphics processing unit suppliers, and on parties that help build certain high-risk AI systems “by providing datasets, data labeling services, and so on.”

Under the rules, those parties will “have to provide information, capabilities or other assistance -- which is obviously quite wide -- to enable the provider” of the AI to comply with the EU’s law. “For the first time, what we're seeing is an extension of the supply chain further back,” Tanna said. “So that's potentially quite a burdensome obligation on the third party.”

He added that the law will “have a significant impact on contracts, particularly contracts for the supply” of these AI systems. He expects AI companies and providers to begin working on changes to contracts to “insulate themselves against some of the most onerous provisions of the act.”

Tanna also pointed to what he said is an overlooked provision in the law that says the rules won’t apply to high-risk AI systems that enter the EU market before the legislation takes effect, “unless they're subject to significant changes in their design or intended purpose.” He said he’s “slightly surprised” that provision hasn’t received more attention, adding that it could lead to a spike in AI imports.

“Will developers try and rush to get their high-risk AI models onto the market before the act comes into force? Will customers rush to procure high-risk AI systems before the act comes into force?” he said. “It’ll be interesting to see how it plays out.”