policy monitor

European Commission – Liability Rules for Artificial Intelligence

The European Commission recently published two Proposals to adapt liability rules to the digital age, the circular economy and the impact of global value chains: one that revises the Product Liability Directive (‘revised PLD’) and another that introduces an extra-contractual civil liability regime for AI systems (‘AI Liability Directive’). The revised PLD substantially modifies the current regime by including software in its scope, integrating new circumstances to assess a product’s defectiveness, and introducing presumptions of defectiveness and causation. The AI Liability Directive contains rules on the disclosure of information and the alleviation of the burden of proof in relation to damage caused by (high-risk) AI systems.

What: Legislative proposal

Impact score: 1

For whom: policy makers, AI-related companies, sector organisations, citizens and legal professionals

URL: https://ec.europa.eu/commission/presscorner/detail/nl/ip_22_5807

Summary

The European Commission recently published two Proposals which aim to adapt (tort) liability rules to the digital age, the circular economy and the impact of global value chains: one that introduces an extra-contractual civil liability regime for AI systems (‘AI Liability Directive’) and one that revises the Product Liability Directive (‘revised PLD’).

AI Liability

The purpose of the AI Liability Directive is to improve the functioning of the internal market by laying down uniform requirements for certain aspects of non-contractual civil liability for damage caused with the involvement of an AI system.

Article 3 of the Directive deals with the disclosure of evidence. A court may, for instance, order the disclosure of relevant evidence about specific high-risk AI systems that are suspected of having caused damage. Requests for evidence should be addressed to the parties listed in the Directive (e.g. a provider or a user of a high-risk AI system). They should be supported by facts and evidence sufficient to establish the plausibility of the contemplated claim for damages, and the requested evidence should be at the defendant’s disposal. Several elements are included to safeguard the position of defendants (e.g. providers and users of high-risk AI systems). The claimant should, for instance, first undertake all proportionate attempts at gathering the relevant evidence from the defendant before the national court can order its disclosure. In any case, where a defendant fails to comply with an order by a national court to disclose or to preserve evidence at its disposal, the national court is allowed to presume the defendant’s non-compliance with a relevant duty of care (and thus a fault).

Article 4 introduces a rebuttable presumption of a causal link in the case of fault. The presumption of causality only applies when several conditions are met: (i) the defendant’s fault has been proven (consisting of non-compliance with a duty of care), (ii) it can be considered reasonably likely that the fault influenced the output produced by the AI system (or the AI system’s failure to produce an output), and (iii) damage was incurred. The defendant, however, has the right to rebut the presumption of causality. Additional provisions apply for providers and users of high-risk AI systems violating certain provisions of the AI Act, as well as for low-risk AI systems and in case a defendant uses the AI system in the course of a personal, non-professional activity.

Product Liability

The revised PLD aims to address some of the implementation challenges raised by new technologies, such as AI and connected products, and the increased role software plays in products. It contains common rules on the liability of economic operators for damage suffered by natural persons caused by defective products. Its scope of application is thus broader than that of the currently applicable PLD. A novelty is that the revised PLD explicitly refers to software and digital manufacturing files as products (Article 2). The explicit inclusion of software may come as a surprise, yet it is an essential change.

Article 7 lists the types of ‘economic operators’ that can be held liable for defective products. This list is broader than the previous one, as it also includes manufacturers of a defective component, fulfilment service providers and providers of online platforms allowing consumers to conclude distance contracts.

Article 6 defines the notion of defect, which is somewhat similar to the currently applicable PLD. A product is deemed defective if it fails to ‘provide the safety which the public at large is entitled to expect, taking all circumstances into account’. However, the non-exhaustive list of circumstances used to assess a product’s defectiveness is expanded, now including the effect on the product of any ability to continue to learn after deployment.

Article 8 is a key provision with respect to (AI) products. It allows Member States’ courts to require the defendant to disclose to the injured person relevant evidence that is at its disposal. The claimant should, however, present facts and evidence sufficient to support the plausibility of the claim for compensation. Similar to the AI Liability Directive, other mitigating factors are included as well to safeguard the defendant’s position (e.g. courts need to limit the disclosure of evidence to what is necessary and proportionate).

Article 9 is another key provision, as it contains a presumption of defectiveness and causality under certain conditions. Similar to the existing PLD, Article 10 contains defences that allow economic operators to escape liability. An economic operator will, for instance, not be held liable when it is probable that the defect that caused the damage did not exist when the product was placed on the market or put into service, or that its defectiveness came into being after that moment. An economic operator, however, will not be exempted from liability where the defectiveness of the product is due to any of the following, provided that it is within the manufacturer’s control: (a) a related service; (b) software, including software updates or upgrades; or (c) the lack of software updates or upgrades necessary to maintain safety.

Conclusion

Both proposals will now need to be discussed by the European Parliament and the Council of the EU. Many discussions will likely arise, and answers will need to be formulated to remedy the remaining legal challenges and uncertainties.