Digital & Technology
Artificial intelligence (AI) has become an area of strategic importance and a key driver of economic and social development across the world. This emerging technology can bring solutions to many societal challenges, from treating diseases to minimising the environmental impact of farming. However, the related socio-economic, legal and ethical impacts have to be carefully addressed, and a balance must be drawn between, on the one hand, the rights of every individual and, on the other, the many benefits this technology can offer.
In recognition of the importance of staying at the forefront of this technological revolution, the European Commission has put forward a European approach to Artificial Intelligence and Robotics. Within this context, the Commission has published a “Report on Liability for Artificial Intelligence and Other Emerging Digital Technologies”, which details the findings of a Group of Experts on the liability rules specifically applicable to damage and loss resulting from the use of emerging digital technologies such as AI.
Assessment of existing liability regimes across the European Union
The Group examined whether and to what extent existing liability schemes are adapted to the emerging market realities following the development of new technologies such as artificial intelligence, advanced robotics and the internet of things, as well as the cyber security issues that accompany them. In particular, it was asked to examine whether the current liability regimes across the Member States are ‘adequate to facilitate the uptake of new technologies’ and to assess their suitability for dealing with damage and loss resulting from the use of such technologies.
In its assessment, the New Technologies Formation of the Expert Group concluded that the liability regimes currently in force in the Member States ensure only a basic level of protection for victims whose damage is caused by the operation of such new technologies. More specifically, it found that, while the laws of the Member States do ensure basic protection of rights, primarily through claims for damages in tort and contract, these laws are not specifically tailored to this dynamic, complex and fast-developing area. The specific characteristics of these technologies and their applications – including complexity, modification through updates or self-learning during operation, limited predictability, and vulnerability to cybersecurity threats – may make it more difficult to offer individuals a claim for compensation in all cases where this seems justified. In some cases, the allocation of liability may also be unfair or inefficient.
An example of technology given by the Group to highlight the complexities in this field is a smart home system, which comprises a number of interacting devices and programmes. Someone who has suffered damage as a result of a failure of this system would have a number of financial and technical obstacles to overcome in order to prove causation, i.e. that the software design or algorithm caused the failure of the device or system. The more systems involved and interacting, the more costly and complex this becomes. According to the Report, in order to close this gap, certain adjustments need to be made to existing EU and national liability regimes.
Key findings of the Group
What follows is a summary of the key findings of the Group on how liability regimes should be designed and, where necessary, changed to adapt to this evolving area of digital technology:
- A person operating a permissible technology that nevertheless carries an increased risk of harm to others, for example AI-driven robots in public spaces, should be subject to strict liability for damage resulting from its operation.
- In situations where a service provider ensuring the necessary technical framework has a higher degree of control than the owner or user of an actual product or service equipped with AI, this should be taken into account in determining who primarily operates the technology.
- A person using a technology that does not pose an increased risk of harm to others should still be required to abide by duties to properly select, operate, monitor and maintain the technology in use and – failing that – should be liable for breach of these duties if at fault.
- A person using a technology which has a certain degree of autonomy should not be less accountable for ensuing harm than if the said harm had been caused by a human auxiliary.
- Manufacturers of products or digital content incorporating emerging digital technology should be liable for damage caused by defects in their products, even if the defect was caused by changes made to the product under the producer’s control after it had been placed on the market.
- For situations exposing third parties to an increased risk of harm, compulsory liability insurance could give victims better access to compensation and protect potential tortfeasors against the risk of liability.
- Where a particular technology increases the difficulties of proving the existence of an element of liability beyond what can be reasonably expected, victims should be entitled to facilitation of proof.
- Emerging digital technologies should come with logging features, where appropriate in the circumstances, and failure to log, or to provide reasonable access to logged data, should result in a reversal of the burden of proof so that it does not operate to the detriment of the victim.
- The destruction of the victim’s data should be regarded as damage, compensable under specific conditions.
- It is not necessary to give devices or autonomous systems a legal personality, as the harm these may cause can and should be attributable to existing persons or bodies.
What should manufacturers and operators of such technologies do for now?
Given the identified ambiguities, it is advisable for manufacturers and operators of such technologies to make it clear, through warnings, instructions, marketing and otherwise, that emerging technologies are being used, and to obtain the consent of members of the public before the technologies interact with them. In addition, AI- and technology-driven decisions should always be reviewed by individuals or experts, and a cost-benefit analysis and risk assessment should always be carried out before incorporating and applying such technologies.
Concluding remarks
The law of tort of the EU Member States is largely non-harmonised, with the exception of product liability law under Directive 85/374/EEC, some aspects of liability for infringing data protection law (Article 82 of the General Data Protection Regulation (GDPR)) and liability for infringing competition law (Directive 2014/104/EU). There is also a well-established regime governing liability insurance with regard to damage caused by the use of motor vehicles (Directive 2009/103/EC), although it does not touch upon liability for such accidents itself. EU law also provides for a conflict of tort laws framework, in the form of the Rome II Regulation. On a national level, it can generally be observed that the laws of the Member States do not as yet contain liability rules specifically applicable to damage resulting from the use of emerging digital technologies such as AI.
While it is possible to apply existing liability regimes to emerging digital technology, these technological developments are constantly evolving, and steps should be taken now to consider how to implement the recommendations of the Group. The EU’s first dedicated AI legislation is expected to be published very soon, and it will be interesting to see whether any of the above issues are addressed in that draft in order to mitigate the risks arising from the identified lacuna in the law.
Our firm works at the convergence of business and technology to help our clients understand, deploy and integrate emerging, transformative and disruptive technology to their best business advantage. For further information on our digital & technology legal services, please click here or contact us.