A wide-ranging executive order focused on artificial intelligence has called for large computing clusters to be registered with the government.

The EO, signed by President Joe Biden this week, also calls for artificial intelligence developers to disclose the existence of large models.


The order asks "companies, individuals, or other organizations or entities that acquire, develop, or possess a potential large-scale computing cluster to report any such acquisition, development, or possession, including the existence and location of these clusters and the amount of total computing power available in each cluster."

Specifically, the government wants details on "any model that was trained using a quantity of computing power greater than 10^26 integer or floating-point operations, or using primarily biological sequence data and using a quantity of computing power greater than 10^23 integer or floating-point operations."
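For a sense of scale, the sketch below checks a hypothetical model against the two reporting thresholds. It assumes the widely used rule of thumb of roughly six operations per parameter per training token for dense transformers, which is an estimate on our part and not part of the order.

```python
# Back-of-the-envelope check against the EO's reporting thresholds.
# The ~6 * parameters * tokens approximation for training compute is an
# assumption, not something specified in the executive order.

GENERAL_THRESHOLD = 1e26   # operations, general-purpose models
BIO_THRESHOLD = 1e23       # operations, models trained primarily on biological sequence data

def training_ops(parameters: float, tokens: float) -> float:
    """Rough estimate of total training operations for a dense transformer."""
    return 6 * parameters * tokens

def must_report(parameters: float, tokens: float, bio_data: bool = False) -> bool:
    threshold = BIO_THRESHOLD if bio_data else GENERAL_THRESHOLD
    return training_ops(parameters, tokens) >= threshold

# Hypothetical example: a 1-trillion-parameter model trained on 15 trillion
# tokens lands at ~9e25 operations, just under the general threshold.
print(training_ops(1e12, 15e12))   # 9e+25
print(must_report(1e12, 15e12))    # False
print(must_report(1e12, 20e12))    # True (~1.2e26 operations)
```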

It also wants information on "any computing cluster that has a set of machines physically co-located in a single data center, transitively connected by data center networking of over 100 Gbit/s, and having a theoretical maximum computing capacity of 10^20 integer or floating-point operations per second [Flops] for training AI."

However, it is not clear at what precision the government expects companies to benchmark their Flops: FP64, FP32, FP16, or even FP8. The lower the precision, the higher the Flops figure.

AI companies usually report performance at 16-bit precision, making it the most likely metric the White House has in mind.
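As an illustration of how much the choice of precision matters, here is a minimal sketch that totals a cluster's theoretical peak at several precisions and compares it to the order's 10^20 Flops line. The per-accelerator figures and cluster size are hypothetical placeholders, not vendor specifications.

```python
# Minimal sketch: the same cluster can fall above or below the 10^20 Flops
# reporting threshold depending on which precision is used for "theoretical
# maximum computing capacity." All hardware figures below are hypothetical.

THRESHOLD_FLOPS = 1e20  # theoretical maximum capacity, per the executive order

# Hypothetical peak Flops per accelerator at different precisions
peak_per_chip = {
    "FP64": 60e12,     # 60 TFlops
    "FP32": 120e12,
    "FP16": 1000e12,   # 1 PFlops
    "FP8": 2000e12,
}

num_chips = 50_000  # a hypothetical co-located cluster

for precision, flops in peak_per_chip.items():
    cluster_flops = flops * num_chips
    reportable = cluster_flops >= THRESHOLD_FLOPS
    print(f"{precision}: {cluster_flops:.1e} Flops -> reportable: {reportable}")
```

Under these made-up numbers, the same cluster would stay below the threshold when measured at FP16 but cross it at FP8, which is why the unspecified precision matters.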

Companies whose models are covered by the mandate must perform safety tests and share the results with the government before releasing those models more widely.

Developers will also have to provide updates to the federal government before and after deployment, showing testing that is "robust, reliable, repeatable, and standardized."

The National Institute of Standards and Technology will develop standards for red-team testing of these models by August 2024.

In addition, US infrastructure-as-a-service (IaaS) providers will have to ensure that foreign resellers verify the identity of any foreign person that obtains an IaaS account.

The Department of Homeland Security will work to make it easier for experts in AI and other critical and emerging technologies to move to the US. It is also tasked with mitigating the risk of critical infrastructure disruptions due to AI.

Separately, every US federal agency will have to designate a Chief AI Officer within 60 days, with an interagency AI Council to coordinate federal action. The Department of Commerce will develop standards for how to detect and label AI-generated content.