SURVEY

Policy prototyping - AI Act

18.03.2022

As a Knowledge Centre, we closely monitor relevant EU policy developments and initiatives related to data and AI. An important example is the proposed AI Act ("AIA"), published in April 2021. This comprehensive policy proposal may have a far-reaching impact on actors in the AI ecosystem, and we want to be able to assess that impact better. We are therefore collecting feedback from different stakeholders by means of a policy prototyping exercise. We plan to use the results of this exercise to enrich the Flemish, Belgian and European debate on the AIA.


What is policy prototyping?

Policy prototyping (PP) is an innovative approach to policymaking, comparable to product or beta testing. In PP, draft or prototype rules are tested in practice before they are finalised and enacted. More information on PP can be found here and here.

Why are we doing a PP exercise on the scope of the proposed AI Act?

Flanders invests in AI, and the AIA draws some red lines that AI applications will have to respect. For instance, an AI application can be classified as 'prohibited' or 'high-risk', which has consequences in terms of the obligations to be met. However, if the scope and definitions of the AIA are too broad or too narrow, now is the time to adjust them by providing timely feedback.

The Knowledge Centre therefore uses a PP exercise to review the definition of an AI system and Articles 5 (prohibited AI practices) and 6 (high-risk AI systems) of the AIA. In the context of this exercise, the AIA in its current wording is the prototype. To test these provisions and simulate how organisations would comply with them, we have prepared two surveys.

How do we proceed?

To test the scope of the AIA, we made two surveys:

The first (checklist) survey helps you evaluate whether your AI application would fall under the scope of the regulation. If it does, you can find out whether the application falls in the prohibited or high-risk category, and let us know whether you agree with that classification. To complete this survey, it is useful to have a list of definitions at hand; this list can be found here.

The second survey assesses the clarity and usefulness of the concepts of the AIA and can be used to provide feedback if, for example, the definitions are unclear or inconsistent.

Where possible, we will also conduct some in-depth interviews with respondents to get a more accurate picture of their feedback.

Why participate?

The AIA is coming. This exercise offers a chance to help shape its final content and to see how future-proof your AI application would be if the regulation were adopted unchanged. In short, by participating, you can:

  • Find out whether your AI application would be considered prohibited or high-risk under the AIA, so you can assess whether your product or application is future-proof.
  • Have an impact on the debate by pointing out things that are unclear, too broad or too narrowly defined.

What happens to the results?

The results of the first survey can give us an insight into how many of the respondents would indeed fall under the scope of the AIA and to what extent they use prohibited or high-risk AI applications.

The results of the second survey will give us an understanding of how comprehensible and useful the concepts used by the AIA are to the respondents.

The Knowledge Centre Data & Society will use the aggregated results to inform and guide the policy debate at the Flemish, Belgian and European levels. We will not make individual results public, unless approved by the party involved.

Proposed steps

You have an AI system or application

You can start this policy prototyping exercise with a concrete AI application in mind. If you want to know which category that application falls into, use the checklist survey.

If certain definitions and concepts prove unclear during that survey, you can also go through the second survey first, as it frames and asks about the relevant terms and concepts.

You do not have an AI system or application

You can start the second survey right away and give feedback on the clarity and usefulness of the concepts that are central to the AI Regulation.