It would be irresponsible for the EU to cast aside regulation of European foundation model developers. To support its SMEs and ensure AI works for people and society, the EU must create rules for these companies in the AI Act, writes Connor Dunlop.
Connor Dunlop is the EU Public Policy Lead at the Ada Lovelace Institute.
The European Union has a long history of regulating technologies that pose serious risks to public safety and health. Whether it’s automobiles, planes, food safety, medical devices or drugs, the EU has established product safety laws that create clear rules for companies to follow.
These rules keep people safe, protect their fundamental rights, and ensure the public trusts these technologies enough to use them. Without regulation, essential public and commercial services are more likely to malfunction or be misused, potentially causing considerable harm to people and society.
AI technologies, which are becoming increasingly integrated into our daily lives, are no exception.
This is the lens through which to view the current debate in the EU over the AI Act, which seeks to establish harmonised product safety rules for AI. This includes foundation models, which pose significant risks given their potential to form the AI infrastructure on which downstream SMEs build.
That is why EU legislators have proposed guardrails for foundation model providers, including independent auditing, safety and cybersecurity testing, risk assessments and mitigation.
Given the range and severity of risks that foundation models raise, these proposals are reasonable steps for ensuring public safety and trust – and for ensuring that the SMEs using these products can be confident they are safe.
But last week, France, Germany and Italy rejected these requirements and proposed that foundation models should be exempt from any regulatory obligations.
This position has now raised the prospect of indefinitely delaying the entire EU AI Act – which covers all kinds of AI systems, from biometrics technologies to systems that impact our electoral processes.