Workshop
What Third-Party AI System Access Provisions, If Any, Should Be Mandated by Governments, and How? (Workshop co-hosted with the Oxford Martin School)
Wednesday, 27 November 2024, 12:00-15:00 (CET)
While there is broad agreement among civil society, governments, and industry on the need for measures to enable external scrutiny of AI systems, current practices rely largely on voluntary access granted by AI developers. This approach may fall short in effectively addressing potential risks, prompting the critical question: should governments mandate such access, and if so, how can this be implemented effectively?
To tackle these pressing concerns, Lisa co-hosted a workshop with the Oxford Martin School, convening policymakers, technical experts, and representatives from industry, civil society, and research institutions. The goal was to explore the feasibility and implications of mandating third-party access to AI systems for evaluation, auditing, and research.
The workshop examined key considerations, including:
- Technical and procedural requirements: What infrastructure, security protocols, and eligibility criteria are necessary to facilitate responsible and effective access?
- Implementation scenarios: From minimal pre-deployment evaluations to tailored post-deployment access for specific research or compliance needs, what frameworks offer the right balance of oversight and practicality?
By engaging experts across diverse sectors, the workshop sought to inform governance decisions, including those tied to mandates under the EU AI Act. The discussions aimed to shape actionable recommendations for policymakers, laying the groundwork for robust and effective mechanisms to ensure external scrutiny in the evolving AI landscape.