Export Compliance Daily is a service of Warren Communications News.

US Tech Firm Suggests Real-Time Export Control Monitoring for AI

A U.S. software company is asking the Commerce Department to rethink the way export controls are imposed and enforced over AI technologies, arguing for an automated approach that it said can prevent AI systems from being used in ways that violate U.S. licensing rules.


Docusign, a cloud-based platform that offers electronic signature services, said Commerce should “formally recognize” the “trust layer” -- a set of security and data protection frameworks for AI -- and incorporate it into export controls. It said this could allow for the “revocation of transaction authority if end-users violate usage policies or sanctions” related to an AI system or technology.

“This capability transforms export control from a static ‘license check’ at the border into a dynamic, revocable privilege managed through software governance,” Docusign said in public comments to Commerce this month on the agency’s upcoming AI exports program (see 2512150024, 2510220008 and 2507240019).

The company said export compliance can be “enforced at the Trust Layer to ensure the integrity of the exported US technology.” This would allow exporters to remotely verify end users and control how they access a specific technology so that “only approved end-users access and interact with the sensitive AI system or its resulting data/documents,” Docusign said.

In some cases, the AI system can feature a “soft pause” function, which would temporarily freeze the AI's task and revoke permissions to “buy time for human review and debugging without permanent cessation.” In other cases, the framework can be used to block specific “high-risk tools, systems, or data” while “allowing the benign parts of the system to remain operational.”

Docusign also said the “trust layer” can be used to incorporate real-time sanctions and Entity List screening for AI systems. And if a particular end-user “acts contrary to U.S. foreign policy interests, their ability to generate legally binding signatures or execute agreements can be revoked instantly,” it said.

The company argued that current export control regulations may be ineffective at regulating AI.

“Traditional export controls face a ‘point-in-time’ limitation: once hardware or model weights are physically delivered to a foreign entity, revocation is operationally difficult,” it said. “We propose that the Trust Layer serve as a continuous, real-time enforcement mechanism capable of implementing a framework of graduated controls and mitigation that is capable of prioritizing interruption, recovery and scope limitation to protect U.S. interests.”

U.S. AI companies, along with some technology policy analysts, have criticized other proposals that would allow exported U.S. technologies to be tracked or deactivated if found to violate a license (see 2508060020 and 2507170040).