Tags on chips and a global registry of their locations can reduce the risks of AI espionage by hostile nations and criminal gangs, experts say.
The proposals were made in a new report on AI safety, which calls for a greater focus on the regulation of hardware.
Three Cambridge University institutes co-led the report, alongside OpenAI and the research group GovAI. The authors fear that governments are overlooking the risks posed by compute, an oversight that could have disastrous consequences.
Without stronger protections, they warn that AI could turbocharge mass surveillance, large-scale influence operations, international instability, and even human extinction.
Governments are well aware of these dangers, but their safeguards largely concentrate on software. The Cambridge team advocates a change of priorities.
Haydn Belfield, a co-lead author of the report, noted that data and algorithms “are intangible and difficult to control.” Hardware, by contrast, is detectable, excludable, and quantifiable. It’s also produced via an extremely concentrated supply chain.
These characteristics make compute a good lever for regulation.
“Computing hardware is visible, quantifiable, and its physical nature means restrictions can be imposed in a way that might soon be nearly impossible with more virtual elements of AI,” Belfield said in a statement.
New guardrails for AI chips
Governments have attempted to curb the AI power of rival states by imposing export controls on semiconductors.
These measures have received a mixed response. Supporters say they're effective in the short term, but critics argue that they cause economic harm and ultimately accelerate rivals' progress.
The report presents several alternative restrictions. One is adding a unique identifier to each chip, which would mitigate espionage and chip smuggling.
To reinforce these tags, an international registry could track the flow of chips destined for AI supercomputers.
All chip producers and sellers would be required to report every transfer and provide precise data on the compute controlled by each state and corporation. Regular audits would ensure that the records remain accurate.
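To make the idea concrete, here is a minimal sketch of how such a registry might be modelled in code. The report does not prescribe any implementation; the class and field names below (ChipRecord, TransferReport, ChipRegistry) are hypothetical and serve only to illustrate the "unique ID plus transfer log plus audit" scheme described above.

```python
# Hypothetical sketch of a chip registry data model; none of these names
# come from the report. They only illustrate the idea of pairing each
# chip's unique identifier with a reported chain of custody.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TransferReport:
    """A single reported change of custody for one chip."""
    chip_id: str           # the chip's unique identifier
    seller: str            # reporting producer or reseller
    buyer: str             # receiving state or corporation
    reported_at: datetime  # when the transfer was filed with the registry

@dataclass
class ChipRecord:
    """Registry entry: one chip and its full transfer history."""
    chip_id: str
    transfers: list[TransferReport] = field(default_factory=list)

    def current_holder(self) -> str | None:
        """The most recently reported buyer, i.e. the presumed current owner."""
        return self.transfers[-1].buyer if self.transfers else None

class ChipRegistry:
    """In-memory stand-in for the international registry described above."""
    def __init__(self) -> None:
        self._records: dict[str, ChipRecord] = {}

    def report_transfer(self, report: TransferReport) -> None:
        """Producers and sellers would file each transfer here."""
        record = self._records.setdefault(report.chip_id, ChipRecord(report.chip_id))
        record.transfers.append(report)

    def compute_by_holder(self) -> dict[str, int]:
        """Chips per current holder: the kind of aggregate an auditor
        might reconcile against producers' filings."""
        totals: dict[str, int] = {}
        for record in self._records.values():
            holder = record.current_holder()
            if holder:
                totals[holder] = totals.get(holder, 0) + 1
        return totals
```

The structure simply mirrors the proposal: a unique chip ID keys each record, every reported transfer is appended to that record, and a per-holder aggregate gives auditors something to check against reported figures.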
“Governments already track many economic transactions, so it makes sense to increase monitoring of a commodity as rare and powerful as an advanced AI chip,” Belfield said.
Alongside the tags and registry, the report suggests “compute caps” to limit the capabilities of AI chips and a “smart switch” to remotely terminate dangerous uses.
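Again purely as a hedged illustration rather than anything specified in the report, the two mechanisms amount to a policy gate in front of large training runs; the cap value, flag, and function below are invented for this sketch.

```python
# Hypothetical illustration of the "compute cap" and "smart switch" ideas;
# the threshold, flag, and logic are invented for this sketch.
ALLOCATED_CHIP_CAP = 10_000   # assumed per-operator cap on registered chips
kill_switch_engaged = False   # assumed regulator-controlled flag

def may_start_training(requested_chips: int) -> bool:
    """Permit a training run only if it stays under the cap and the
    regulator's switch has not been thrown."""
    if kill_switch_engaged:
        return False
    return requested_chips <= ALLOCATED_CHIP_CAP
```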
These proposals arrive amid a boom in the chip market. In the last few days alone, Nvidia surpassed Amazon in market capitalisation, while shares in semiconductor designer Arm soared by over 50%.
Any radical safety measures may, therefore, prove a harder sell to companies than to governments.