
Freshfields Risk & Compliance


The EU Product Liability Directive: Key Implications for Software and AI

The European Union’s updated Product Liability Directive (PLD), adopted by the European Council today and expected to come into effect in a few weeks, marks a significant shift in how civil liability for defective products is addressed in the digital age. The PLD replaces the EU’s product liability framework that has been in place since 1985 and introduces groundbreaking new provisions for digital products, particularly software, artificial intelligence (AI), and Internet of Things (IoT) devices. For a general overview of the new rules, please see our previous blog post here.

No-fault liability and the digital landscape

The concept of no-fault liability is no longer restricted to tangible products but is expanded to digital products and services. Software is no longer seen as a mere accessory or service but as a product subject to the same rigorous liability rules as physical goods. This extension also applies to interconnected devices, such as those forming part of the IoT, reflecting the evolving complexity of digital ecosystems. Producers of software, AI systems, and IoT devices can now, for the first time, be held liable for damage caused by their defective products regardless of fault.

The new PLD also expands the scope of possible defendants for damage claims: in addition to the manufacturer and the EU importer, it, inter alia, addresses authorised representatives of the manufacturer as bearing their own liability – not surprising given the importance of EU legal representatives under the AI Act and other pieces of legislation on tech regulation. Natural or legal persons that substantially modify a product outside the manufacturer’s control and thereafter make it available on the market or put it into service are also considered a manufacturer of the product and assume liability. While this largely corresponds with the wording of the AI Act for (high-risk) AI systems, it might raise substantial problems with regard to the fine-tuning of GPAI models, for which the AI Act does not explicitly provide for the concept of “substantial modification”.

Defining defectiveness of digital products

The PLD considerably expands the criteria for determining product defectiveness. According to Article 7(1), a product is considered defective if it fails to meet the safety requirements set out by Union law, including key regulations such as the AI Act. For AI systems, this means that non-compliance with these legal frameworks suffices to establish defectiveness within the meaning of the PLD.

Article 7(2) goes further to address unique aspects of digital products, specifically software and AI. The capacity of AI to ‘learn’ or evolve after deployment introduces a new layer of complexity to assessing product safety and liability. The PLD ensures that liability encompasses changes in the product over time, especially where such changes compromise safety or performance. For instance, an AI system might be deemed defective not only based on its initial performance but also on how it behaves after receiving software updates or interacting with other interconnected devices. The previously recognised ‘factory gate principle’, which refers to the legal test for determining a product’s defectiveness at the moment it leaves the manufacturer’s control, is being extended for digital products. Liability remains in place as long as the manufacturer retains control over the product (e.g. through updates).

Private data protection

The PLD acknowledges the critical importance of protecting personal data, a key issue in the age of digital products. According to Article 6, damage to private data - such as its destruction or corruption - can be grounds for a product liability claim.

The PLD’s emphasis on data protection is topical, considering the increasing cybersecurity risks posed by interconnected devices. This means that not only is the physical safety of users considered but also their “digital well-being”, which is equally exposed to risks emanating from defective digital products.

Restricted exemptions from liability for software and AI

Under the old regime, the liability framework was primarily designed for physical goods and included several exemptions from liability. The PLD adopts these exemptions but explicitly states in Article 11(2) that liability is not excluded for a related service or software (including updates), even if it is probable that the defectiveness did not exist at the time the product was placed on the market. This departure from the approach taken for physical goods continues the logic of the test for defectiveness, under which a manufacturer’s liability persists after the product has been placed on the market for as long as the manufacturer retains control over it. This provision might effectively create an obligation to update products to the latest safety standards (where such an obligation does not already exist under provisions of the EU’s tech regulation framework).

Disclosure of source code?

A major concern for software and AI developers under the new PLD is the risk of having to disclose sensitive information, such as source code, in the course of product liability litigation. Article 9 introduces provisions for the disclosure of necessary evidence in legal proceedings, which could compel businesses to reveal proprietary technologies and confidential details central to their operations, such as source code, individual algorithms, and training and validation data. The PLD also provides for important protective measures aimed at mitigating some of these concerns. For example, it stipulates that any disclosure of sensitive documents must be ‘necessary and proportionate’. Courts are also required to take steps to prevent misuse of disclosed information. Additionally, the directive emphasises the importance of confidentiality and trade secret protection throughout the disclosure process. Given that most Continental European jurisdictions do not have sophisticated disclosure regimes, it remains to be seen how this balance will be struck in practice.

Presumptions of defectiveness and causal link

The PLD introduces important changes to the burden of proof, making it easier for claimants to successfully pursue claims. Traditionally, claimants had to fully prove that a product was defective and that this defect caused the damage. Article 10 of the PLD now provides for several presumptions of defectiveness and/or the causal link. In particular, defectiveness and/or the causal link are presumed where the disclosure obligation has not been (fully) met or where proving the defect is excessively difficult for the claimant due to the technical or scientific complexity of the product. For AI and software, which often involve highly complex technologies, this shift is especially significant.

Conclusion: A new era for digital product liability

The PLD marks a transformative shift in the legal landscape surrounding digital products. By extending no-fault liability to software, AI, and interconnected devices, expanding the definition of defectiveness, and ensuring protection for personal data, the directive reflects the increasing role of digital products in everyday life. As the introduction of the disclosure obligation and the new burden of proof rules will make it easier for consumers to assert their claims, companies involved in developing digital products must now take even greater care to ensure their products are safe, secure, and compliant with evolving legal standards.

Manufacturers of digital products should not be distracted from the PLD by the AI Liability Directive (AILD), whose name suggests a more specific liability regime for AI. The planned but currently stalled AILD would provide procedural alleviations for claimants under fault-based non-contractual liability and thus carries fewer risks for manufacturers than the PLD. For more information on the AILD, please see our blog post here.
