OpenAI CEO Sam Altman, Microsoft chief Satya Nadella and Alphabet CEO Sundar Pichai are joining the government’s Artificial Intelligence Safety and Security Board, according to The Wall Street Journal. They’re joined by Nvidia’s Jensen Huang, Northrop Grumman’s Kathy Warden and Delta’s Ed Bastian, along with other leaders in the tech and AI industry. The board will work with and advise the Department of Homeland Security on how AI can be safely deployed within the country’s critical infrastructure. Its members are also tasked with developing recommendations for power grid operators, transportation service providers and manufacturing plants on how they can protect their systems against potential threats that could be brought about by advances in the technology.
The Biden administration ordered the creation of an AI safety board last year as part of a sweeping executive order focused on regulating AI development. On its website, the Department of Homeland Security says the board “includes AI experts from the private sector and government that advise the Secretary and the critical infrastructure community.” Homeland Security secretary Alejandro Mayorkas told the Journal that the use of AI in critical infrastructure can greatly improve services — it can, for instance, speed up illness diagnoses or quickly detect anomalies in power plants — but it also carries significant risks, which the agency hopes to minimize with the board’s help.
That said, one can’t help but question whether these AI tech leaders can provide guidance that isn’t meant primarily to serve themselves and their companies. Their work centers on advancing AI technologies and promoting their use, after all, while the board is meant to ensure that critical infrastructure systems use AI responsibly. Mayorkas seems confident that they’ll do their jobs properly, though, telling the Journal that the tech leaders “understand the mission of this board,” and that it’s “not a mission that is about business development.”