Abstract

On 19 July 2018, the Automated and Electric Vehicles Act 2018 (AEVA) received Royal Assent. As motor vehicles are becoming increasingly technologically based, with driving aids having taken over many of the more mundane (and possibly dangerous) aspects of driving from the driver, it is imperative that legislation keeps pace to determine the responsibilities of the parties. Motor insurance provides protection for those involved with vehicles and who may suffer harm, injury, and loss due to the negligence of the actors. This is most frequently driver error, but may also include manufacturing defects, which result in deaths and less serious injuries. It is also here where the intersection between torts and insurance laws needs careful management. It would be particularly unfair to ask drivers or third-party victims of motor vehicle accidents to seek redress from a manufacturer for losses incurred through the actions of an autonomous vehicle. Consumer law has historically removed this burden from affected consumers and it is entirely sensible for the law to protect individuals in an emerging field—and perhaps even more so given the trajectory of vehicles with driver-enabled qualities and the numbers of vehicles currently featuring such innovations. Yet, the AEVA contains aspects which are troubling in respect of the motor insurance industry’s dominance of this market, the application of compulsory insurance, and exclusions and limitations to responsibility which expose policy holders and victims to EU-breaching levels of risk.

1. INTRODUCTION

The Government, in its paper ‘Industrial Strategy: Building a Britain Fit for the Future’, identified four ‘grand challenges’ it faced for the effective future of the country. One of the challenges was to position Britain to become a world leader in the way people, goods, and services move.1 This ambition was highlighted by the Law Commission in its consultation document regarding the position automated vehicles held in achieving this goal. The Government has begun a process where it aims to be at the forefront of enabling the transition to the use of autonomous vehicles. While at a relatively early stage, the Government has committed over £200 million for research into their use,2 has published Codes of Practice3 relating to the testing of autonomous vehicles, has established the Centre for Connected and Autonomous Vehicles (CCAV), and has begun the transition to regulating and legislating for their use through the insurance of risk via the AEVA.

The AEVA is an Act to extend compulsory motor vehicle insurance to the use of automated vehicles that are operating in an autonomous mode, for the protection of victims injured by such a vehicle. The aim is to simplify protection by making the insurer liable to pay compensation awarded to any victim (in the first instance); following this, the insurer has a mechanism to recover costs as appropriate under existing common law and product liability laws. The Act’s full implementation will be effective through statutory instruments in the coming years. As part of the creation of the AEVA, one of a (necessary) series of laws dealing with an emerging system of travel which encompasses many facets of law—insurance, product liability, criminal responsibility, and so on—the Law Commission, in November 2018, embarked on a three-year review of national driving laws with regard to the use of automated vehicles.
The first part of this review involved a public consultation exercise where interested parties could provide submissions about their views of the AEVA and the future direction of regulation. This resulted in 178 responses from private individuals, academics, insurers, representatives from the police and local government, and car manufacturers (among others). The nature of this initial exercise was to enable the Commission to draw conclusions as to the efficacy of the current legal framework and to consider its role in the wider public and private transport network system. The United Kingdom is one of the first countries to undertake the task of regulating and legislating on this developing technological paradigm. It also demonstrates, as do the inclusions in the AEVA, the seriousness of the Government in ensuring its readiness for the influx of a new suite of self-driving cars and vehicles with ever greater automation present. This necessitates an effective system of risk awareness, risk allocation, individual and collective responsibility, and legal and policy frameworks which facilitate the deployment of automated vehicles into the public arena. The review is an important step in the United Kingdom’s readiness for the evolution to the influx of autonomous vehicles on Britain’s roads.

The movement to autonomous vehicles is expected to generate a number of advantages for road users. An obvious advantage of autonomous vehicles is the increased mobility they will provide to those for whom even the heavily modified cars currently available are inaccessible. This group will be afforded the ability to achieve a level of independence otherwise denied to them, with ramifications for their capacity to maintain social contacts, employment, and so on. It is also envisioned that their use will reduce deaths and serious injuries as a result of motor vehicle accidents (as the main cause of such accidents is human error), improving road safety,4 and lowering insurance premiums. There are downsides to the introduction of the technology. Driverless vehicles will have to transition into a transport system predominated by human drivers, whose behavior is anticipated to deteriorate following their introduction into the mainstream. Drivers and pedestrians may take more risks if they feel a computer is in charge of vehicles whose on-board systems contain micro-chips with powerful processing power and multi-directional cameras to avoid danger. The reality is likely to lie somewhere in-between these states. However, taking the positive perspective first, autonomous vehicles may reduce the number and consequences of motor vehicle accidents on the road.

2. MOTOR VEHICLE ACCIDENTS IN THE UNITED KINGDOM

The latest available statistics from the Office for National Statistics on road traffic accidents in Great Britain were published in September 2018.5 The number of fatalities had seen a marked reduction in the previous 10 years. In 2007, 2946 people died as a result of a road accident. The figure reduced to 1901 deaths in 2011 and then plateaued in 2016 and 2017 at 1792 deaths (a 39% reduction between 2007 and 2017). In 2017, there were 24,831 serious injuries reported to the police due to road traffic accidents and 170,993 casualties of all severities reported in the same year (a figure 6% lower than that reported in the previous year and indeed the lowest ever recorded). However, levels of motor traffic increased by 1.1% between 2016 and 2017.
It is important to note that these statistics are only those available to the police through official reporting procedures and do not represent the full scale of accidents and injuries occurring on public roads in Great Britain. Furthermore, and importantly for motor vehicle insurance and developments in the law at EU level, the statistics also omit data away from the public highway—thus accidents occurring on private land and those in ‘[an]other public place’ environments (for the purposes of section 145 of the Road Traffic Act 1988—car parks, private driveways, etc) are not included.

When looking at the detail of those reported accidents, car occupants accounted for 44% (n = 787) of road deaths, pedestrians for 26% (n = 470), motorcyclists for 19% (n = 349), and pedal cyclists for 6% (n = 101) of deaths on the road. These last three categories are identified as vulnerable road users and typically have a much higher casualty rate per mile traveled than other road user groups. Of car occupants, 68% of victims were the drivers with 32% being passengers. Given that figures for March 2019, as presented by the Department for Transport,6 found that cars accounted for 82.5% of road traffic and motorcycles for only 3.3%, motorcyclists seem much more likely, as a proportion, to suffer injuries and death on the roads than do the occupants of a car.

The location of accidents is also relevant for the implications of autonomous vehicle use. There were 99 fatalities and 7759 total casualties in 2017 on motorways (across 69 billion vehicle miles). For urban roads, there were 626 fatalities and 107,347 total casualties across 117 billion vehicle miles. Finally, there were 1068 fatalities and 55,876 total casualties across 145 billion vehicle miles on rural roads. Thus, motorways carry approximately 21% of traffic yet account for only 6% of fatalities. It is unclear how the use of autonomous vehicles will impact on these numbers, as the numbers of such vehicles remain small and reporting and trend analysis is not yet possible. However, those accidents reported in the use of autonomous vehicles tend to identify issues where the on-board computer is confused by road markings or unusual occurrences. These are likely to be less prevalent on motorways given the largely predictable nature of road surfaces and travel, and equally, rural roads may cause the largest problem to autonomous vehicles (as they may do to human drivers too).

However, attempting to find trends in road accidents and fatalities is problematic. A number of factors influence road casualties. These may include the distances traveled by users; the mode of transport used; the behavior of users (and their appreciation of danger); and the composition of the users (the elderly, the young, new/inexperienced drivers, etc). External factors can also play a role in casualties—for example the weather7 and changes in the risk on roads. Globally, 1.25 million people die each year as a result of road traffic accidents, between 20 and 50 million people suffer non-fatal injuries, and road traffic accidents are the leading cause of death among people aged 15–29 years.8 It is envisioned that the roll-out of autonomous vehicles will help to reduce the number of accidents and thereby the numbers of injured persons.
While this is to be welcomed, and such technology will likely be available in more expensive, new vehicles in the first instance before cascading down to more affordable vehicles and the markets where such vehicles are sold, the same Parliamentary briefing report which presented the above statistics9 highlighted that more than 90% of road traffic deaths occur in low- to middle-income countries. The highest rates are found in the African region. The AEVA and its mechanism for the regulation and operation of vehicles, their insurance and the management and maintenance of appropriate infrastructure to facilitate the use of automated and electric vehicles may act as a benchmark or plan for other countries to use when the viability of adopting this technology is realized.

3. AUTONOMOUS VEHICLES—THE FUTURE?

According to the Bank of England,10 by 2030 some 30% of new sales of personal automobiles, and 50% of new sales of commercial automobiles, will be autonomous vehicles (the prediction being that commercial adoption will be higher than that for personal vehicles). The report11 on which the article is based further predicts that this innovation will lead to a reduction in accidents of up to two thirds, a 60% reduction in fraud and a 50% reduction in vehicle thefts. There is also predicted to be a marginal increase in software failures. The report continues that currently 46% of insurance claims from accidents involving driven motor vehicles are based on bodily injury, and this is estimated to rise to 60% with the use of autonomous vehicles. However, while accidental damage presently accounts for 23% of the claims profile experienced by the insurance industry, this will drop to 16% with autonomous vehicles.

In the United Kingdom, the car remains the dominant form of daily travel. As reported by the RAC, ‘In 2017, 61 per cent of trips were made by car, either as a driver or as a passenger. The car is also the most common mode for distance traveled, accounting for 78 per cent of the total distance traveled in 2017’.12 Society also seems to expect technological advancement in driving. Motorists are used to features that a decade ago would have been prohibitively expensive and expect many driving aids as standard in their new cars. This, especially in respect of safety features, is increasingly becoming part of the legislative framework for new vehicles.13 However, humans’ understanding of what technology should do, and what it actually can do (and consistently), is often misaligned. There is a widespread expectation that the era of autonomous vehicles will bring with it a panacea of perfect roads with vehicles driving in harmony together (not unlike a cleverly choreographed performance), removing the hazardous event that driving presently is. The reality of the dangers of driving by humans, and presumably the acceptance of this danger, is not matched when humans think about what driving will be like when technology takes over.

Google has been testing cars on roads in the United States, as has Tesla, and Uber has vehicles already taking passengers on journeys; indeed, the company was the first to be prosecuted in the United States for the death of a pedestrian struck by one of its autonomous vehicles. Essentially, autonomous vehicles must be used on the road to generate the data necessary to improve their functionality and safety, but so doing involves acceptance of the risk of relying on technology which will inevitably fail and lead to injury and loss of life for third-party victims.
2016 was the year of the first reported case of a death caused by a driverless car. This involved the ‘driver’ of a Tesla motor car which, while in ‘autopilot’ mode, struck the side of a tractor trailer, due to the bright sky at the time and the trailer being white in colour. It appears that the cameras of the vehicle were unable to correctly detect the trailer and apply the brake in time. Tesla’s arguments against liability included a disclaimer on the system which instructs users that the driver must remain in control of the vehicle when using the autopilot mode. In the United Kingdom, a Tesla driver was prosecuted and disqualified from driving when he was caught sitting in the passenger seat of his vehicle while the car was driving along the M1 in autopilot mode.

The first reported incident of a pedestrian’s death due to an automated vehicle was in March 2018, where a woman in Arizona, United States, was killed by an Uber vehicle. The victim was walking across the US equivalent of a pelican crossing when the car, in autonomous mode and traveling at 40 mph, failed to recognize her and stop. In the aftermath, it was asserted by John Simpson, a project director with advocacy group Consumer Watchdog, that ‘…robot cars cannot accurately predict human behavior, and the real problem comes in the interaction between humans and the robot vehicles’. This, he concluded, should lead to a moratorium on their use.14

While it may seem there is a lack of appetite from the public for errors in the operation of autonomous vehicles which lead to injuries and deaths on the road, in the United Kingdom, as noted above, driver error is the most common reason for such events. Data taken from the Department for Transport’s own sources led to the following ‘top ten’ causes of road traffic accidents: at number one, the driver failed to look properly;15 the driver failed to judge the other person’s path or speed;16 the driver was careless, reckless or in a hurry;17 the driver made a poor turn or poor maneuver;18 the driver lost control of the vehicle;19 the pedestrian failed to look properly;20 the accident occurred due to a slippery road surface;21 the driver was traveling too fast for the conditions;22 the driver was following too close to the vehicle in front of it;23 and, at number 10, the driver was exceeding the speed limit.24

It will be noted that 8 of the top 10 reasons for accidents are a direct result of driver error. One is due to the driver not taking sufficient care of the surroundings and weather conditions to appreciate the risk posed by the particular surface they were traveling on. Only one involved pedestrians failing to notice the vehicles traveling on the road—which could also be attributed in some ways to the driver not taking into account the dangers posed to pedestrians (many of whom may be vulnerable, have mobility issues or may simply not have the awareness to appreciate the risks posed by vehicles on a road). Further, 4% of all road traffic accidents (n = 6070) involved at least one driver or rider being over the legal alcohol limit.25 By their nature, these driver errors and willful actions which inhibit the driver/rider’s reaction times would be removed with the use of fully autonomous vehicles.

4. LEVELS OF AUTOMATION

Technological advances are a feature of automobiles. From driver assistance systems, including aided braking and steering, to parking sensors, cameras and hands-free parking, cars have featured ever greater levels of automation.
The Department for Transport identifies a fully autonomous vehicle as one ‘…in which a driver is not necessary’. While this is certainly a possible future of road travel, there are various levels of automation, from full driver input to no driver input in controlling the vehicle. The Society of Automotive Engineers (SAE) International Standard J3016 identifies the following levels of automation. At Level 0, there is no automation and the driver controls all aspects of driving. Level 1 includes steering and acceleration/deceleration assistance systems aiding the driver (who assumes all remaining functions). Level 2 is a designation given to partial automation, where the assistance system provides steering and acceleration/deceleration using information from the driving environment, but the driver performs all other aspects of driving. At Level 3, the driver is expected to intervene and respond when requested, but all other aspects of the driving are taken by the automated system. Level 4 is high automation, where the automated system takes all aspects of driving, including in situations where the driver fails to respond or intervene when requested. Finally, Level 5 is used to describe an automated driving system where all tasks, in all roadway and environmental conditions, are taken by the system. This is referred to as a full automation system. (A schematic restatement of these levels appears in the sketch below.) While this is an international standard, the SAE classification has been criticized, for example by the House of Lords in the debate on the passage of the Bill that led to the AEVA. It is considered by some to be a rather imprecise, generalized and broad standard, and one which lacks robustness and clarity.26

5. THE AUTOMATED AND ELECTRIC VEHICLES ACT 2018

The AEVA received Royal Assent on 19 July 2018 and became law, subject to commencement orders under which its sections will be brought into force through statutory instruments made by the Secretary of State. Its objectives were, according to Jesse Norman, the Roads Minister, to ensure that both the United Kingdom’s infrastructure and its insurance system are ready ‘…for the biggest transport revolution in a century’. The AEVA is divided into two main parts. Part 1 sets out the various responsibilities of the Secretary of State, motor-vehicle insurers and owners in relation to automated vehicles within England, Wales and Scotland.27 Part 2 covers liabilities for electric vehicle charging points across Great Britain.28 Thus, the introduction of a single Act of Parliament demonstrates the view of the Government that automated and electric vehicles and their associated technologies are directly related. Each of these Parts will be outlined in turn.

(A) Part 1: Automated Vehicles

Section 1 of the Act obliges the Secretary of State to maintain and publish a current list of motor vehicles that are ‘capable…of safely driving themselves’29 and may be lawfully driven in such a manner ‘on roads or other public places’ across Great Britain.30 The Act leaves the details required for producing such a list to the Secretary of State’s discretion, envisaging how vehicles may be listed according to type,31 registration documentation,32 or ‘in some other way’.33 The rest of the obligations set out in Part 1 apply to these listed ‘automated vehicles’.34 The Act has not provided instruction as to who will have to pay for the collation and maintenance of the list, such as whether this will be levied through a charge to insurers and/or manufacturers.
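Before turning to the liability provisions, it may help to restate the SAE J3016 levels described in section 4 in schematic form. The sketch below is a minimal, purely illustrative summary, using Python merely as a notation; the constant names and the fallback test are shorthand for the descriptions given above rather than terms drawn from the standard or from the Act.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Shorthand for the SAE J3016 levels of driving automation as described above."""
    NO_AUTOMATION = 0           # the driver controls all aspects of driving
    DRIVER_ASSISTANCE = 1       # steering and acceleration/deceleration assistance aids the driver
    PARTIAL_AUTOMATION = 2      # the system steers and accelerates/decelerates; the driver does everything else
    CONDITIONAL_AUTOMATION = 3  # the system drives, but the driver must intervene when requested
    HIGH_AUTOMATION = 4         # the system drives even where the driver fails to respond when requested
    FULL_AUTOMATION = 5         # the system performs all tasks in all roadway and environmental conditions

def system_copes_without_driver(level: SAELevel) -> bool:
    """At Levels 4-5 the automated system handles the driving task even without a responsive human;
    at Levels 0-3 a human driver must perform, monitor or be ready to take over the driving task."""
    return level >= SAELevel.HIGH_AUTOMATION
```

As the discussion below makes clear, the AEVA effectively engages only the top of this scale, and it is there that much of the difficulty lies.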
Section 2 of the Act covers the liabilities for any accident caused, either wholly or partially,35 by an automated vehicle which is ‘driving itself on a road or other public place in Great Britain’.36 The Act elucidates how a vehicle will be deemed to be ‘driving itself’ if it is in operation without an individual’s control and ‘does not need to be monitored’ by them.37 If the vehicle is insured, and an insured person or anyone else suffers resulting damage, the insurer will be liable.38 Insurers are allowed to either exclude or limit their liability if the damage is caused either: as a ‘direct result’ of any ‘software alterations’ that have been made to the vehicle either by the insured person or with their knowledge,39 or due to a failure to install ‘safety-critical software updates’ that the insured person either knew, or should have reasonably known, to be ‘safety-critical’.40 In either of those instances, the Act specifically enables the insurer to recover the monies paid out from the insured person to the extent that the policy reserves the right for them to do so.41 In the event that the vehicle is not insured, the owner will be responsible.42

Whether or not the vehicle is insured, the reference to damage includes death or personal injury, and any damage to property43 except for: the automated vehicle itself;44 any goods that it was carrying which were either for ‘hire or reward’;45 and any property ‘in the custody, or under the control, of [either] the insured person (if the vehicle is insured)46 or otherwise the person in charge of the automated vehicle at the time of the accident’.47 There is a cap on liability for property damage caused per accident, which currently stands at £1,200,000.48 The Act also enables the insurer or the automated vehicle owner to recover any settlement payment from any other person49 to the extent that that person is liable for the accident.50 If the insurer recovers more than the amount it paid out, it is liable to pay the difference to the injured party.51 The Act specifically provides for the defence of contributory negligence where the injured party is at least partially to blame for their injuries.52 Section 3(2) also makes it clear that the insurer or owner of the automated vehicle will not be liable where the accident ‘wholly’ results from the negligence of the person in charge of the automated vehicle.
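The first-instance allocation of liability just described can be drawn together in a short schematic sketch. This is a deliberately simplified illustration of sections 2 and 3 as summarised above, not a restatement of the statutory text: the field names are invented for the purpose, and the property-damage cap, contributory negligence and the rights of recovery against manufacturers and others are reduced to a comment.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """A stylised claim arising from an accident involving a listed automated vehicle."""
    vehicle_listed: bool                  # on the Secretary of State's section 1 list
    driving_itself: bool                  # not being controlled, and not needing to be monitored, by an individual
    on_road_or_public_place: bool         # the geographic trigger used by the Act
    vehicle_insured: bool
    claimant_is_person_in_charge: bool
    wholly_due_to_person_in_charge: bool  # negligently allowed self-driving when it was not appropriate (section 3(2))

def first_instance_payer(c: Claim) -> str:
    """Who meets the claim in the first instance on this simplified reading of Part 1."""
    if not (c.vehicle_listed and c.driving_itself and c.on_road_or_public_place):
        # Outside the AEVA scheme: ordinary RTA88 and common law negligence rules continue to apply.
        return "ordinary motor insurance / negligence rules"
    if c.claimant_is_person_in_charge and c.wholly_due_to_person_in_charge:
        # Section 3(2): no liability to the person in charge where the accident was wholly due to
        # their negligence in allowing the vehicle to begin driving itself when inappropriate.
        return "no AEVA liability to the person in charge"
    payer = "insurer (section 2(1))" if c.vehicle_insured else "owner (section 2(2))"
    # The amount recoverable is then shaped by the rules summarised above: the excluded heads of
    # damage (the automated vehicle itself, goods carried for hire or reward, property in the
    # custody or control of the insured or person in charge), the per-accident cap on property
    # damage, any reduction for contributory negligence, and the payer's right to recover from
    # any other person liable (for example, the manufacturer).
    return payer
```

Even in this reduced form, the sketch shows that everything turns on whether the vehicle is ‘listed’ and ‘driving itself’, which, as discussed below, is precisely where the position of vehicles with lower levels of automation is left unresolved.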
In terms of monitoring the effectiveness of these new statutory requirements, the Secretary of State must produce a report for Parliament within two years of publishing the first list of automated vehicles.53 This assessment must cover both ‘the impact and effectiveness’ of the listing requirements,54 and ‘the extent to which’ the remaining provisions of Part 1 of the Act ‘ensure that appropriate insurance or other arrangements are made’ regarding such vehicles.55

(B) Part 2: Electric Vehicle Charging Points

Section 10 of the Act provides for regulations to be imposed on those who operate publicly accessible points for charging or refuelling either electric or hydrogen-powered vehicles respectively56 (‘points’) where such vehicles are ‘intended or adapted for use on roads’.57 The Act envisages how such regulations could cover: payment methods and access58 (for example, requiring operator-collaboration in ‘sharing facilities’),59 the ‘performance, maintenance and availability’ of such points60 and the components which facilitate the connection between the point and the vehicle.61 It is also anticipated that ‘large fuel retailers’ and ‘service area operators’ (such terms to be defined by the regulations)62 will be subject to general requirements to provide both public charging or refuelling points63 (at specific times),64 and associated ‘services or facilities’.65

If the particular area’s mayor makes a request for such regulations to be made, the Secretary of State is under a statutory duty to consider it provided that ‘prescribed requirements’66 and certain conditions have been complied with. Essentially, the mayor must have published proposals for such regulations,67 and consulted with: relevant local authorities,68 those ‘who are likely to be subject to the requirements of the regulations’,69 and anyone else that the mayor, in their discretion, deems appropriate.70 The mayor is also required to have passed consultation responses on to the Secretary of State.71 If the Secretary of State decides against making such regulations, they must provide a reasoned account of their decision to the mayor.72

This aspect of the Act was one of the various amendments insisted upon by the House of Lords during the Bill’s passage through Parliament. It was the Lords’ intention that the entire infrastructure of the modernized automated and electric vehicle renaissance be managed and integrated to avoid it slipping into obscurity and adversely affecting consumers. Hence the empowerment of locally elected mayors to determine where the electric vehicle infrastructure was located in their cities, and the imposition on the Secretary of State of a duty to report on the effectiveness of the refuelling infrastructure in conjunction with the Government’s clean air targets. Furthermore, the very use of the word ‘refuelling’ rather than ‘charging’ was a Lords’ amendment to provide for the longevity of electric vehicles and to ensure the AEVA supported the development of hydrogen fuel cells beyond the use of battery cells, presently the more common form of power.
Section 13 of the Act also enables the Secretary of State to make regulations that oblige operators of public charging or refuelling points to provide ‘useful’ information ‘to users or potential users’.73 The Act cites examples of such information, including the location of points and hours of operation,74 costs of access,75 and methods for connection.76 The Act also envisages the Secretary of State making regulations to cover the ‘transmission’ of any data collected from the points; for example, relating to ‘energy consumption and geographical information’.77 In terms of the points themselves, section 15 enables their hire, loan, gift or sale,78 and installation,79 to be regulated and envisages how such requirements could cover their technical capacity to accept and handle,80 respond to81 and transmit82 information, ‘monitor and record energy consumption’,83 and ‘achieve energy efficiency’.84 In enforcing any of these Part 2 regulations relating to the charging of electric vehicles, section 16 provides illustrative examples of areas for additional regulations to be made, including procedures for determining breaches,85 levying of fines,86 and making appeals.87

Finally, Part 2 sets out similar monitoring provisions as Part 1 in terms of the ‘need for’ any regulations made88 and their ‘impact and effectiveness’.89 The Secretary of State must prepare a report for Parliament within the first two years of the Act,90 and every year after that, which assesses these aspects.91 The Lords insisted on the obligation being imposed on the Secretary of State to report on the effectiveness of the measures introduced. This included not only the production of a list of automated vehicles and the effectiveness of section 1 relating to the classification of the vehicles in scope, but also how effectively the insurance provisions were operating to ensure the safety of the use of automated vehicles. It is in the context of this latter aspect that we concentrate our discussion of the issues still present in the effective regulation and operation of motor insurance law as between national and EU provisions.

6. FULLY THOUGHT THROUGH? CONCERNS WITHIN THE ACT

This article is concerned with a review of the main aspects of Part 1 of the AEVA—those focusing on autonomous vehicles. The Act is to be welcomed as a first step in ensuring developments in the automotive industry are being reflected and countenanced by insurers. At section 1 AEVA, the Secretary of State is obliged to maintain a list of automated vehicles and, at section 1(4), it is this list which will identify those ‘automated vehicles’ subject to the legislation. This is important because the Act itself does not designate the levels of automation as provided for by the SAE, and this is vital given what those levels provide (from very limited/partial driver aids to full automation, with little to no user input at the most ‘automated’ level). This matter led the Law Commission, in a consultation published in July 2019, to enquire whether it would be appropriate in future legislation/statutory instruments for a ‘driver’ to be available in instances of the use of autonomous vehicles and, if so, in which circumstances it would be appropriate for them to take over control of the vehicle.
This issue of designating a ‘person-in-charge’ of the vehicle has yet to be determined and requires attention given the complexity of the vehicles and what an individual may or may not be expected to do behind the wheel (assuming that fully automated vehicles in the future still come equipped with a steering wheel). It would be expected that a person-in-charge of the vehicle will be required at Levels 1–4, but at Level 5 it may be expected that actual human intervention in the driving process is not anticipated and thus rendered unnecessary. It will require careful examination of the vehicles, the manufacturers’ description of their level of automation, independent testing of and agreement as to the level of automation, and approval by the Government before the Secretary of State includes a vehicle on the list. It is this process which should establish the extent of, and limits to, potential liability, rather than any inclusion of a person-in-charge who is ultimately responsible for the vehicle in use.

It is also a concern that the Act chose to separate the levels of automated vehicles so harshly (effectively concerning only vehicles at Levels 4–5) rather than accepting one position covering all automated vehicles. For instance, the wording at section 1 provides for a list of vehicles that: ‘(a) are…designed or adapted to be capable, in at least some circumstances or situations, of safely driving themselves…’ (authors’ emphasis). This differentiates between automated vehicles (Levels 1–5) and thereby will provide an opportunity for insurers to avoid insurance liability in a similar vein to that seen with contemporary insurance allowed under the Road Traffic Act 1988 (RTA88), which is, in many respects, in breach of EU law.

The existing model of motor insurance is based on the fault liability of the user, and fully automated vehicles would clearly remove this element of its business model. This continues with the AEVA, which could have moved away from such a mechanism of underwriting the consequences of negligent driving to one of simple coverage through product liability. However, what is left for the insurance industry is the provision of compulsory cover in what is likely to be a very low-risk venture, given insurers’ ability to seek recovery of the payment of awards from those at fault and from the efforts of manufacturers who will not wish to be exposed to risk when providing their automated vehicles to the public. What is currently missing, and is needed more immediately, is a statutory mechanism for covering automated vehicles currently available and at significantly lower levels of sophistication (Levels 1 and 2) than the Level 5 vehicles envisioned (which are perhaps 10 years away from ‘mainstream’92 availability). While the Government’s Code of Practice on the use and testing of autonomous vehicles does not impose specific requirements at present (beyond the general requirements for compulsory insurance for vehicles used on a road or other public place), autonomous cars at Level 5 will require specific permission and communications with CCAV to be tested on the road. Level 5 autonomous vehicles are essentially prototypes and largely untested. It is not uncommon for motor vehicles on the road at present to have a host of driver assistance features which might affect the liability of the user depending on the level of autonomy (adaptive cruise control, emergency braking systems, (remote control) automatic parking systems, lane-centering steering, etc).
Ultimately, the driver/user’s responsibility is going to change in these different circumstances. This in turn will also cause problems for insurers in how to determine risk in the way the current system operates. A driver’s age, driving record, prosecutions and health will no longer be relevant factors in actuarial data determining risk at Level 5 automation. Rather, the model of vehicle, manufacturer, software used and security systems (including reliability data) will be more relevant. This may reduce situations of discrimination against individuals, but will require much greater reliance on data generated from the use of vehicles and communications with the manufacturers. Indeed, through deliberations it has been asserted that this may evolve to real-time data being used to make insurance applicable to each specific journey based on the risks involved.93 Section 1 continues, in explaining the requirement of the Secretary of State to maintain a list of automated vehicles, that such vehicles are those ‘which may lawfully be used when driving themselves’. At section 8(1)(a), ‘driving itself’ means ‘…operating in a mode which is not being controlled and does not need to be monitored by an individual’. Typically motor vehicle insurance has been based on the driver/registered keeper maintaining ultimate responsibility to keep the vehicle safe, be responsible for its use, and to be appropriately covered by insurance. Where the vehicle is automated and takes over functions from the driver, this causes issues relating to the responsibility of the interested parties. Is the manufacturer, software developer, service provider, or driver responsible to victims of accidents occurring when the vehicle causes damage or loss? The AEVA fails to differentiate between the levels of automation of vehicles on the road and the progression to Level 5 (where monitoring and intervention are not needed by an individual). It will be many years before fully automated vehicles are available generally and there may even be, as suggested by Baroness Sugg, restrictions on which roads truly autonomous vehicles may initially be used (for example motorways with dedicated AV lanes).94 Until then, how does the AEVA, as the principal legislative instrument for the insurance of automated vehicles, guide users and insurers as to their responsibilities for other ‘connected’ vehicles which are increasingly seeing ‘driver assistance features’ as standard? This may be even more confusing where vehicles are currently available which are sold as being ready for full autonomous driving. The company Tesla sells its Model S vehicle with, among others, an ‘autopilot’ mode which is designed to assist the driver and to ‘take away the most burdensome parts of driving’. These allow the driver to remotely summon the vehicle, for it to ‘autopark’ (park itself at the touch of a button) and ‘auto lane change’ (where the vehicle automatically changes lane for the driver when driving on motorways). The Model S vehicle is an expensive car and probably beyond the financial scope of most drivers. This has been, undoubtedly, due in part to the research and technology available in the vehicle. Tesla is also developing a cheaper Model 3. 
Perhaps most significantly, under a heading of ‘Full Self-Driving Hardware’ the company claims ‘Every new Model 3 comes standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future—through software updates designed to improve functionality over time’.95 Thus, some cars on the road in the United Kingdom today are ready, following a software update, to be fully autonomous vehicles. The AEVA will be applicable to the Tesla vehicles when they become Level 4–5 autonomous, yet until then the Act is not applicable to the car as it presently operates. Tesla also intends for its cars to be self-learning. A self-learning car will generate data from its own surroundings and will likely share this with a central server to be used to better understand road conditions and the actions of road users, and to give and receive data in the form of communications and software updates; this will lead to issues of privacy (as discussed later).

Further, at section 1(1)(b) AEVA, the Secretary of State must prepare and keep up to date a list of all automated vehicles which: ‘may lawfully be used when driving themselves, in at least some circumstances or situations, on roads or other public places in Great Britain’ (authors’ emphasis). This replicates RTA88 section 145 but has been, since 2014, incorrect regarding the geographical scope of the requirement to hold compulsory motor vehicle insurance. The case law of the Court of Justice of the European Union (CJEU) in Vnuk v. Zavarovalnica Triglav dd96 extended the geographic scope of compulsory motor vehicle insurance insofar as it applies to private land and is not restricted to ‘a road or other public place’. Given that, as we have argued elsewhere,97 the current restrictive definition in the RTA88 must now be considered bad law, its replication in the AEVA must equally be considered as such.

Section 2 of the AEVA does provide protection for third-party and innocent victims of accidents involving autonomous vehicles. Given the prospect of accidents occurring as a result of an autonomous vehicle, and the further removal of any blame being attributed to the driver (as is the current situation in accidents involving motor vehicles), rather than face the unenviable position of having to bring legal proceedings against a manufacturer as the responsible party for the consequences of the accident, the victim has a direct right of action against the vehicle’s insurer. It would then be open to the insurer to start recovery proceedings against the manufacturer if it can establish that a defect in the vehicle which led to the accident was the manufacturer’s fault.

(A) Contributory Negligence

Sections 3(1) and 6(3) of the AEVA relate to the application of contributory negligence to the driver/user and follow the Law Reform (Contributory Negligence) Act 1945 where strict liability applies. While there are parallels in the laws (such as ‘…the amount of the liability is subject to whatever reduction under the Law Reform (Contributory Negligence) Act 1945 would apply to a claim.’),98 differences exist. For instance, the Consumer Protection Act 1987 provides for liability of the producer99/supplier100 where the damage is ‘wholly or partly caused by a defect in the product’.101 The AEVA is not clear on damage caused ‘partly’ by the vehicle—it only uses the word ‘wholly’ in section 3(2).
This tension arises from section 8(3)(b), where ‘a reference to an accident caused by an automated vehicle includes a reference to an accident that is partly caused by an automated vehicle’. The differences between sections 3(2) and 8(3)(b) of the AEVA should be addressed to identify the application of contributory negligence and product liability due to defects which partly contribute to damage. In the following section of this article, we discuss the potential for an insurer to limit or exclude liability under the policy where the vehicle has either had its software tampered with by the user or where the user has failed to install safety-critical software updates.

Section 3(2) AEVA removes the liability of the insurer to the person ‘in charge of the vehicle’ where the ‘accident that it caused was wholly due to the person’s negligence in allowing the vehicle to begin driving itself when it was not appropriate to do so’. It is here where the Act provides for the contributory negligence of the driver/user. Significantly, this does not prevent third-party victims of accidents from making a direct claim against the insurer; instead, this section of the Act is designed to hold the driver/user responsible where they have contributed to their own misfortune through this element of negligence. The AEVA lacks instruction as to two main elements of this section. The first is the interpretation of the word ‘appropriate’ and whether this will refer to road conditions, parts of the road which have been designated as safe or applicable for autonomous vehicle use, or indeed the condition of the driver/user—will drivers/users be required to maintain sobriety when using an autonomous vehicle or be, as the Law Commission has included as part of its recent consultation exercise, a ‘person-in-charge’, ready to take over control of the vehicle as appropriate? The second element is what allowing the vehicle to ‘drive itself’ means in relation to an individual’s negligence. Would this be determined on the basis of software updates, the environment in which the vehicle is being used, the user having checked that the camera and proximity sensor systems adopted in the vehicle to ensure its spatial awareness are working adequately, and so on?

The questions just raised are likely to be addressed in future guidance, and the parameters of liability and of driver safety will be established. Yet at present, section 3(2) AEVA does provide insurers with quite a broad spectrum of argument to limit the extent of their liability, particularly in the event of a single vehicle accident. Where software failures or glitches lead to a spate of accidents, the issue of contributory negligence is unlikely to be raised. With single accidents affecting only the victim and the vehicle, the entire basis of the AEVA, that it is designed to prevent protracted legal arguments between users and manufacturers, will fail.

The imposition of strict liability through AEVA section 2 also appears to be unfair to the victims of accidents as it discriminates between vehicles and their level of automation. Level 5 vehicles are covered by the Act and therefore the victim has a right of action against the insurer (which is strictly liable for the consequences of the accident). Yet, such strict liability is not so imposed on insurers of vehicles (due to the nature and coverage of vehicles in the AEVA) at Levels 1–4 SAE.
It is quite conceivable that these vehicles are of greater danger to the innocent victims of accidents involving motor vehicles than those at Level 5, with the removal of all human interaction. This issue is not, however, the main concern with the strict liability imposed through section 2. The interaction between the RTA88 and the AEVA also creates a potential for avoidance of the liability of insurers. Section 2(1)(b) AEVA applies where the accident is caused by an automated vehicle which ‘is insured at the time of the accident’. Section 151 RTA88 imposes the duty on insurers to satisfy judgments against persons insured or those secured against third-party risks. Section 152 RTA88 provides insurers a right to apply for a declaration voiding the policy of insurance where the policy was induced through fraud.102 The result of a successful application under section 152 RTA88 is that the policy of insurance is treated as though it never existed. Where no policy of insurance is present, the insurer may argue that no strict liability applies because of section 2(1)(b). The individual whose policy of insurance is voided may attempt an argument under AEVA section 2(6), where ‘…liability under this section may not be limited or excluded by a term of an insurance policy or in any other way’. There would in those circumstances be no policy of insurance, but the latter aspect of the section, ‘…in any other way’, does read as sufficiently vague to give the insurer an argument as to the meaning of the term if a court interpreted it as preventing the voiding of the policy for this part of the AEVA.

Of course, EU law has already clarified this point. In relation to the policy itself and its application to the policy holder, in Fidelidade-Companhia de Seguros SA v. Caisse Suisse de Compensation and Others103 the CJEU, in 2017, ruled that it is contrary to EU law for an insurer to avoid paying a claim by avoiding the insurance policy, even where the Motor Insurers’ Bureau (MIB—the national guarantee fund to satisfy claims where no insurance policy exists) would still settle an award. This point was conceded by the Secretary of State in R (Roadpeace) v. Secretary of State for Transport & MIB,104 yet as recently as 2019, at the High Court in Colley v. Shuker,105 O’Farrell J held that an insurer could rely on its section 152 RTA88 declaration, despite the incompatibility with EU law. Even when the Motor Vehicles (Compulsory Insurance) (Miscellaneous Amendments) Regulations 2019 come into force on 1 November 2019, repealing section 152(2) RTA88 insofar as it applies to declarations avoiding the insurance policy, the change applies only to declarations obtained after the accident. Those declarations granted before the accident (leading to the claim for compensation) will continue to apply and exempt an insurer from any liability to make payment under section 151 RTA88.

(B) Permitted Exclusions of Liability

The rights of insurers to exclude liability to policy holders and third-party victims through application of the RTA88 have been problematic when compared with the requirements in the Motor Vehicle Insurance Directive (MVID).106 These conflicts have been demonstrated in cases such as Byrne v. Motor Insurers’ Bureau and Delaney v. Pickett,107 with successful claims against the United Kingdom under state liability. Section 148 RTA88 permits insurers to exclude their liability to satisfy claims—exclusions which transgress the MVID and the jurisprudence of the CJEU.
In the AEVA, section 4 outlines the parameters for insurers to exclude or limit their liability under the policy on one of two bases: (i) where an accident occurs as a direct result of software alterations made by the insured person, or with their knowledge, which are prohibited under the policy; or (ii) where there has been a failure to install safety-critical software updates that the insured person knows, or reasonably ought to know, are safety-critical. In many ways, this appears to be a sensible approach. Given that autonomous vehicles will be, at Level 5 at least, completely reliant on software to operate the vehicle, any individual tampering with the software or using it contrary to its intended purpose is exposing themselves and other road users to a risk which the policy should not necessarily cover. Likewise, as roads and satellite navigation systems need to be updated to reflect changes to assist the vehicle in navigating to the intended destination, and this will occur without the user’s interaction, software updates, and particularly those dealing with safety issues, must be maintained to ensure the vehicle and its use are as safe as possible. It is likely that manufacturers and their software developers will provide very regular updates, and systems should be in place to prevent the use of the vehicle without the safety checks having been verified and performed.

Unfortunately, at this stage, there is very little guidance as to what the owner or driver must do in these circumstances. There undoubtedly will be early adopters who will have an autonomous vehicle and be particularly interested in the technology used. Such people may be mindful of the need for regular software updates and would be vigilant to ensure that these are satisfied. Conversely, there will be a segment of the autonomous vehicle user base which is not technologically savvy and indeed will have purchased such a vehicle on the basis that they do not have to operate the car and it will take care of all of the updates itself. Guidance has not yet been issued relating to the knowledge required of the driver in respect of software alterations or updates. There is no guidance as to whether safety systems will be hardwired into the vehicle to prevent its operation unless the software updates as required have been performed. There is no guidance as to the use of warning systems where an individual user has not updated the vehicle yet intends to take it onto the road. There is as yet no guidance on what the term ‘safety critical’ means in practice or in law. There is no guidance as to whose responsibility it will be to issue such ‘safety-critical’ information—the manufacturer or the software developer/supplier? Finally, there is no guidance or instruction at present regarding whether safety-critical updates will be provided over the air or will require the manufacturer to install these at the dealer’s garage. On this last point, were the vehicle to be involved in an accident en route to the garage to have safety-critical software updated, would this exclusion clause take effect, or would the user continue to be covered by the policy as they would be able to prove, at least insofar as the data may be extracted from the vehicle’s blackbox or other such hard drive, that they were going directly to have the update completed?
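To make the uncertainty concrete, the two section 4 gateways can be sketched as the kind of simple test an insurer might seek to apply. The sketch is illustrative only; the input names are invented, and several of them (what counts as ‘safety-critical’, what knowledge is attributed to the insured, and when an accident is a ‘direct result’ of an alteration) correspond exactly to the terms on which guidance is presently lacking.

```python
from dataclasses import dataclass

@dataclass
class PolicyFacts:
    """Stylised facts an insurer might rely on when invoking section 4 of the AEVA."""
    alteration_prohibited_by_policy: bool        # software alteration prohibited under the policy
    alteration_by_or_known_to_insured: bool
    accident_direct_result_of_alteration: bool   # 'direct result' is itself left undefined
    update_was_safety_critical: bool             # no statutory definition or designated decision-maker
    update_installed: bool
    insured_knew_or_ought_to_have_known: bool    # the knowledge standard the Act leaves open

def insurer_may_exclude_or_limit(f: PolicyFacts) -> bool:
    """The two bases for excluding or limiting liability under the policy, as summarised above."""
    software_alteration_gateway = (
        f.alteration_prohibited_by_policy
        and f.alteration_by_or_known_to_insured
        and f.accident_direct_result_of_alteration
    )
    missed_update_gateway = (
        f.update_was_safety_critical
        and not f.update_installed
        and f.insured_knew_or_ought_to_have_known
    )
    return software_alteration_gateway or missed_update_gateway
```

The en-route-to-the-garage example above illustrates the point: on those facts each element of the second gateway is arguably satisfied, yet exclusion seems an unattractive outcome, and nothing in the Act or current guidance resolves it.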
Guidance notes, statutory instruments and/or further legislation will be needed to address these issues unless it will be left to the courts to determine the parameters of exclusion clauses in this area. (C) Product Liability, Autonomous Cars, and the Apportioning of Blame The positioning of the RTA88 and the AEVA is contested when it comes to the apportioning of blame in the event of motor vehicle accidents. The current system of motor vehicle insurance is based on the driver being personally liable/responsible for accidents involving the use of the vehicle. The driver has the responsibility to check the general safety of the vehicle before traveling to ensure it is safe to be used on the road (the depth of the tyre tread; that the windscreen, windows and mirrors are clean; all the lights work; the brakes work, etc). Indeed, it is in RTA88 section 145 where the personal liability of the driver is required to be covered by insurance. However, with the advent of autonomous vehicles, it is more likely that issues relating to the reliability of the technology itself will be in question rather than the (in)actions of the driver. This has not only been a concern of the insurers and drivers but also the government has had cause to consider this new dimension to motor liability. The Government Actuary’s Department has acknowledged the complications regarding establishing responsibility and the transformation from personal liability to product liability.108 It must be accepted that motor vehicles will continue to experience mechanical failures and where these result in an accident liability will, presumably, remain the responsibility of the registered owner/driver. The software that runs the autonomous vehicle will be created by humans, with the possible failings/glitches in coding and algorithms. The software running the vehicle is also designed to be self-learning and to use its ‘experience’ of driving to better judge how to respond to the unpredictability of events on the road. When the technology (hardware or software) fails, will this fall under the responsibility of the insured (and thereby be required to be covered through compulsory third-party insurance)?109 It may be more likely to fall under the remit of the Consumer Protection Act 1987 and thereby the manufacturer. However, the software manufacturer may also be the reason for the failure, or perhaps the owner failed to update the vehicle’s software or its software was subject to hacking.110 The result is, at this stage at least, uncertainty in determining the responsible party in an accident involving an autonomous vehicle and consequently protracted litigation. However, as is usual with motor vehicle insurance law, reference to the MVID and the jurisprudence of the CJEU provides instruction and guidance so often disregarded at the national level. Article 3 of the MVID instructs each member state to ‘take all appropriate measures to ensure that civil liability in respect of the use of vehicles normally based in its territory is covered by insurance’ (authors’ emphasis). It continues ‘The insurance…shall cover compulsorily both damage to property and personal injuries’. The Article does not seek nor does it permit member states to differentiate between mechanical or software defects in the requirement for the compulsory insurance of vehicles. It was in June 2019 where the CJEU in Linea Directa Aseguradora SA v. 
Segurcaixa Sociedad Anonima de Seguros y Reaseguros111 held that fire damage caused to a building resulting from a spontaneous fire due to an electrical fault in a stationary vehicle parked in a private garage fell within the concept of use of vehicles for Article 3. For the CJEU, the parked status of the vehicle was an integral aspect of its ‘normal’ function or use and Article 3 did not require the identification of the specific fault or component failure to establish liability. This stands in stark contrast to the Supreme Court’s decision in Pilling v. UK Insurance Ltd112 where the United Kingdom, through section 145 RTA88 and its causation approach to liability in the ‘use’ of the vehicle on a road or other public place restricted the apportioning of liability. Similar to Linea above, a motor vehicle caught fire within premises which caused substantial damage to the property. In the present case, however, the fire which caused the damage was started by a mechanic who had begun welding activities on the underside of the vehicle, left the vicinity to answer a phone call and returned to find the sparks from the welding had ignited the vehicle. It was held by the court at first instance and the appeal courts that the insurance policy covering the vehicle did extend to accidents occurring on private land. Differences existed in relation to the interpretation of the ‘use’ of the vehicle. Judge Waksman QC at the court at first instance considered the car being repaired was not in ‘use’ at the time of the accident. The Court of Appeal disagreed, holding that the judge had erred in principle as to the interpretation of RTA88 section 145. Finally, the Supreme Court determined that the policy of insurance and section 145 limited cover to ‘use’ of a vehicle on ‘a road or other public place’. Therefore, the claim failed. Its interpretation of section 145 meant ‘“use” is on a road or other public place’ and not ‘any use…consistent with the normal functions of that vehicle’. Continuing this theme, the Supreme Court, for the purposes of section 145(3), held that a relevant use of a vehicle occurs only where a person ‘… uses or has the use of a vehicle on a road or public place, including where he or she parks an immobilized vehicle in such a place (as English case law requires), and the relevant damage has arisen out of that use’.113 The result is the Supreme Court having been caught in a national law paradigm, unable or unwilling to look to the jurisprudence of the CJEU and adopt an accordingly compliant purposive interpretation of the MVID. The decision does not bode well for an EU-compliant interpretation of the AEVA, even with the application of the European Union (Withdrawal) Act 2018 section 6(3). However, all is not lost given the most recent decision, in June 2019, of the Court of Appeal in MIB v. Lewis.114 Here the Court provided instruction which not only affects the restrictive geographic scope of RTA88 section 145, but also the status of the MIB as an emanation of the State and the direct effect of Articles 3 and 10 of the MVID. Where the operation of RTA88 section 145 (and its restrictive geographic scope to compulsory insurance), or the AEVA (relating to victims injured due to a vehicle’s mechanical or software defect) frustrate a victim’s action for compensation through the insurer or the MIB, the direct effect of Articles 3 and 10 now give a direct cause of action by the claimant against the MIB. 7. 
BROADER ISSUES AND PRESSING NEEDS: BEYOND THE ACT

The AEVA is the first legislative instrument to deal with insurance issues for this new form of motor vehicle, one which is likely to revolutionize travel, affecting road users and pedestrians alike. It is also intentionally very narrow in scope and therefore omissions are to be expected. That these are present within the Act obviously does not mean that the Government and interested parties have overlooked the following issues. They are raised here to generate awareness of matters which will have to be legislated for in the near term. The main concern, which does relate directly to the AEVA, is that of vehicles in a semi-automated mode and the lack of direction as to when the driver takes responsibility for an accident and when the manufacturer does. Ultimately, the driver will remain responsible for an accident given the application of RTA88 section 145, but will they have a claim on the basis of the AEVA against the manufacturer or software developer/programmer? It is not beyond contemplation that class action/collective redress claims may be made by driver groups (and perhaps even the MIB) against the manufacturer for accidents which the driver/user blames on the software used in the vehicle.

There may be freedom of information requests made to determine how the programmer of the software instructed the car to react in the event of predicted and unpredicted events. For example, did the car react in the way that a human would have reacted? What algorithm was being used to determine whether accepting an initial accident would have been in the interests of the driver and other road users, rather than the vehicle attempting to avoid that accident but, in so doing, causing much greater injury and loss? By way of a hypothetical example, imagine that the autonomous vehicle chooses to drive into (and kill) a dog running unexpectedly in front of its route rather than attempting to avoid this crash by veering into on-coming traffic on the other side of the road or mounting the pavement, which pedestrians could have been using. Practical, ethical and legal issues come into play in such scenarios, and comparisons between the reactions of humans and autonomous vehicles, their reaction times, and the appropriateness of decision-making in light of unpredicted events will all need to be assessed and critiqued as autonomous vehicles become a common feature on public roads.

The AEVA does not make any provision for the changes necessary in the driving test and the competencies that will be needed when using automated vehicles—even those partial and driver assistance features currently available. While disclaimers appear on the dashboard/screens when operating many of these features (such as those which automatically reverse and park a vehicle for the driver), these basic features are very likely to be superseded by more complex operations which require appropriate levels of instruction and competency assurance. The Highway Code will require revision; details on public safety for users of the vehicles and for pedestrians regarding the new technology and its limitations will have to be issued; and, perhaps, all drivers new to these vehicles (including existing driving license holders) will have to be assessed and pass a new driving test as a matter of public safety.
A particularly significant issue not covered in this legislation, but one which surely will be legislated for in the coming years, is data privacy and the implications of users having a 'connected device'. That such information could be tracked, where that information will be stored, and how it will be used raises a number of ethical and privacy issues which are also likely to involve consideration of human rights laws. The information generated will also enable greater tracking of accidents and, possibly as the technology becomes ubiquitous, will help insurance companies to determine issues of culpability. The data are likely to inform premiums, could be used to discriminate in areas of risk, and may be used to determine which software systems are most reliable or have a propensity to be involved in accidents. The Investigatory Powers Act (IPA) 2016 was enacted to make provision about the interception of communications, equipment interference, and the acquisition and retention of communications data, bulk personal datasets, and other information. It regulates the use of data and allows for the generation of data for the purposes of its retention. It was introduced when Theresa May was Home Secretary and was widely referred to as the United Kingdom's 'snooper's charter' given the broad hacking and bulk data collection powers inherent in the Act. There was further concern when the legislation was introduced, as 76% of Britons sampled were completely unaware of the legislation in question.115 It is beyond the scope of this article to adequately explore the possible issues and concerns related to the generation, retention, and use of data created by autonomous vehicles. However, given the volume of data that will be available from these computerized vehicles and the national laws which allow communication data and metadata116 to be held and used, this area will attract much commentary when the implications become visible. Even in 2004, the Information Commissioner warned of the United Kingdom sleepwalking into a surveillance society through an increase in the recording and monitoring of people's behavior. The IPA 2016 has been described as the 'most intrusive surveillance law of any democracy in history'117 and, if one compares the information available through the use of mobile telephones, such as the time, duration, and location of a communication, including the phone numbers and/or email addresses of the sending and receiving parties,118 the extension to autonomous vehicles is concerning. Commentators including Schneier119 have described how computers constantly produce data through their input and output. Whether this is simply using a word processor or accessing the world wide web (recording keystrokes, terms searched, websites visited, etc.), the computer and the network it is attached to are generating data. The data further extend to the websites visited, which can identify individual users through the type of computer being used, the software version installed, the features enabled, and unique device identifiers.
Returning to the information generated by mobile phones, such devices, especially following the advent of the smart phone, use GPS receivers and apps, similar to those which will undoubtedly control autonomous cars, to become what Almehmadi calls 'the spy in your pocket'.120 Almehmadi continues that smart phones not only contain tools available to collect data from the user; they are mechanisms for harvesting data from their built-in sensors and apps (which, although controlled by permissions, many users do not fully appreciate and are often inclined simply to accept without question). Similarly, modern cars presently generate an estimated 25 gigabytes of data per hour of use. This, coupled with in-built sensors permanently monitoring speed, acceleration, braking, tyre pressures, and engine temperature, and the collection of a variety of other data including the driver's eye movements, hand position on the steering wheel, and data from radar sensors, diagnostic systems, in-dash navigation systems, and built-in cellular connections, gives just a flavor of the data that autonomous vehicles will produce.121 In the same article, Fleutiaux predicts that automated vehicles will produce 3600 gigabytes of data per hour. The modern car is no longer a car with technological features; it is a computer with wheels and an engine, and it is subject to legislation which enables the retention of the data produced and which may compromise users' privacy and human rights. 8. CONCLUSIONS The UK Government's aim to be at the forefront of the development and roll-out of automated and electric vehicles is to be commended. It has legislated to provide certainty to insurers and manufacturers as to the expectations imposed on them when such vehicles, and the underlying infrastructure, become viable and mainstream. However, national motor vehicle insurance is in a state of flux, with both the RTA88 and the AEVA containing features which are inconsistent with EU law. This may or may not be resolved following the United Kingdom's intended exit from the EU, but other issues remain. The AEVA in Part 1 refers to product liability and to compulsory insurance for vehicles on a road or other public place. Both aspects of the law are inconsistent with requirements at an EU level and with the Court of Appeal's recent decision in MIB v. Lewis. The AEVA refers to no-fault liability for automated cars at a level of automation which is not technologically available at present. Indeed, it is unlikely that vehicles will progress from SAE Levels 1 and 2 to Level 5 without any intermediate steps. Even for cars which claim to be 'ready' for fully autonomous mode (e.g. the Tesla Model S), there will be a disparity between those cars, which are a proportionately very small part of the market, and the other autonomous cars which may be more akin to driver-assisted vehicles than fully autonomous ones. The issues of privacy and the use of data have largely been absent from explicit reference in the legislation. These are likely, along with the ethics of the use of autonomous vehicles, to be of great importance going forwards. The General Data Protection Regulation addresses issues surrounding data: who generates it, who controls it, and how it is stored and disseminated. This is necessary, and research has already been published addressing aspects of this.122 Yet the data themselves and their use at a national/state level have not yet been subject to such scrutiny.
This is surprising, as the transfer of data from servers to the vehicle will amount to communications and, wherever a network exists, there is a telecommunications operator who controls it in one way or another. As the telecommunications operator can be issued with a retention notice, the law is applicable. The AEVA has merely scratched the surface of an area which will see much legislative activity for the foreseeable future.
Footnotes
1 The authors here will resist the urge to comment on the efficacy of leaving the EU Single Market and its free-movement principles to achieve this lofty goal.
2 .
3 Such as through the Centre for Connected & Autonomous Vehicles Code of Practice: Automated Vehicle Trialing (Department for Transport 2019).
4 Tesla, a company at the forefront of autonomous vehicles, states on its UK website 'All Tesla vehicles have the hardware needed in the future for full self-driving in almost all circumstances, at a safety level we believe will be at least twice as good as the average human driver'. . (All weblinks accessed 22 August 2019 unless otherwise stated).
5 .
6 Department for Transport 'Vehicle Licensing Statistics: 2019 Quarter 1 (Jan–Mar)' Statistical Release, 2019 .
7 Leading to the closure of roads, police advice discouraging travel, and so on.
8 Figures provided by the World Health Organization in a briefing to Parliament. .
9 L Jackson and R Cracknell 'Road Accident Casualties in Britain and the World', Briefing Paper Number 7615, 23 April 2018.
10 .
11 .
12 .
13 For instance, new regulations agreed with the EU will result in speed limiters ('Intelligent Speed Assistance') becoming mandatory in all new vehicles sold across the EU from 2022. The United Kingdom has agreed to follow the initiative even if it leaves the EU. .
14 .
15 42,189 accidents reported.
16 21,211 accidents reported.
17 17,845 accidents reported.
18 15,560 accidents reported.
19 12,151 accidents reported.
20 8687 accidents reported.
21 7327 accidents reported.
22 6468 accidents reported.
23 6040 accidents reported.
24 5102 accidents reported. See .
25 Department for Transport 'Reported Road Casualties in Great Britain: Estimates for Accidents Involving Illegal Alcohol Levels: 2016 (Final)' Statistical Release 9 August 2018. .
26 .
27 AEVA, section 22(1).
28 AEVA, section 22(2).
29 AEVA, section 1(1).
30 AEVA, section 1(1)(b).
31 AEVA, section 1(2)(a).
32 AEVA, section 1(2)(b).
33 AEVA, section 1(2)(c).
34 AEVA, section 1(4).
35 AEVA, section 8(3)(b).
36 AEVA, section 2(1)(a).
37 AEVA, section 8(1)(a).
38 AEVA, section 2.
39 AEVA, section 4(1)(a).
40 AEVA, section 4(1)(b).
41 AEVA, section 4(4).
42 AEVA, section 2(2).
43 AEVA, section 2(3).
44 AEVA, section 2(3)(a).
45 AEVA, section 2(3)(b).
46 AEVA, section 2(3)(c)(i).
47 AEVA, section 2(3)(c)(ii).
48 Road Traffic Act 1988, section 145(4)(b).
49 AEVA, section 5(1).
50 AEVA, section 5(4).
51 AEVA, section 5(3).
52 AEVA, section 3(1).
53 AEVA, section 7(2).
54 AEVA, section 7(1)(a).
55 AEVA, section 7(1)(b).
56 AEVA, sections 10(1) and 1(a), (b).
57 AEVA, section 9(2).
58 AEVA, section 10(1)(a).
59 AEVA, section 10(2)(b).
60 AEVA, section 10(1)(b).
61 AEVA, section 10(1)(c).
62 AEVA, section 11(3).
63 AEVA, sections 11(1)(a), (b).
64 AEVA, section 11(2)(b).
65 AEVA, section 11(2)(c).
66 AEVA, section 12(1)(c).
67 AEVA, section 12(3)(a).
68 AEVA, section 12(3)(b)(i).
69 AEVA, section 12(3)(b)(ii).
70 AEVA, section 12(3)(b)(iii).
71 AEVA, section 12(4).
72 AEVA, section 12(6).
73 AEVA, sections 13(1), (2).
74 AEVA, section 13(2)(a).
75 AEVA, section 13(2)(c).
76 AEVA, section 13(2)(e).
77 AEVA, sections 14(1), (2).
78 AEVA, sections 15(1), (4)(a).
79 AEVA, section 15(1)(a).
80 AEVA, section 15(2)(a).
81 AEVA, section 15(2)(b).
82 AEVA, section 15(1)(c).
83 AEVA, section 15(1)(d).
84 AEVA, section 15(1)(f).
85 AEVA, section 16(2)(a).
86 AEVA, sections 16(2)(b), (c), (d).
87 AEVA, section 16(2)(f).
88 AEVA, section 19(1)(b).
89 AEVA, section 19(1)(a).
90 AEVA, section 19(3).
91 AEVA, section 19(4).
92 Tesla has its Model S vehicle, which it claims is ready, via software updates, to be fully autonomous when the law permits.
93 .
94 And for a broader examination of the Bill's passage through the Lords.
95 .
96 Case C-162/13 Vnuk v. Zavarovalnica Triglav dd EU:C:2014:2146; [2014] 9 WLUK 139.
97 J Marson and K Ferris 'The Compatibility of English Law with the Motor Vehicle Insurance Directives: The Courts Giveth…At Least until Brexit Day' (2020) Law Quarterly Review (in press).
98 AEVA, section 3(1).
99 Consumer Protection Act 1987, section 2(2).
100 Consumer Protection Act 1987, section 2(3).
101 Consumer Protection Act 1987, section 2(1).
102 RTA88, section 152(2).
103 Case C-287/16 Fidelidade-Companhia de Seguros SA v. Caisse Suisse de Compensation and Others [2017] ECLI:EU:C:2017:575.
104 R (Roadpeace) v. Secretary of State for Transport & MIB [2017] EWHC 2725 (Admin).
105 Colley v. Shuker [2019] EWHC 781.
106 Council Directive 2009/103/EC [2009] OJ L263/11.
107 Byrne v. Motor Insurers' Bureau [2008] EWCA Civ 574 and Delaney v. Pickett [2011] EWCA Civ 1532.
108 .
109 A position to which insurers may be very reluctant to accede due to the aggregation of possible risks based on a flaw in one software system/upgrade or patch.
110 As reported in 2015, which led to Chrysler recalling 1.4 million of its vehicles due to the possibility of hackers remotely taking control of in-car functions including the brakes and accelerator. .
111 Case C-100/18 Linea Directa Aseguradora SA v. Segurcaixa Sociedad Anonima de Seguros y Reaseguros [2019] ECLI:EU:C:2019:517.
112 Pilling v. UK Insurance Ltd [2019] UKSC 16.
113 Ibid at [45].
114 MIB v. Lewis [2019] EWCA Civ 909.
115 A Sulleyman 'Snooper's Charter: Majority of Public Unaware of Government Online Surveillance' (22 May 2017) .
116 Metadata are generated whenever a person uses an electronic device (such as a computer, tablet, mobile phone, landline telephone, or even a modern automobile) or an electronic service (such as an email service, social media website, word processing program, or search engine). Often, this results in the creation of considerable amounts of information (metadata). BC Newell 'The Massive Metadata Machine: Liberty, Power, and Secret Mass Surveillance in the U.S. and Europe' (2014) I/S: A Journal of Law and Policy for the Information Society 10:2, 481, 488.
117 Liberty 'State Surveillance' .
118 See Newell, above n 116.
119 B Schneier Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (W.W. Norton 2016) 15.
120 A Almehmadi The Spy in Your Pocket (CreateSpace Independent Publishing Platform 2017).
121 F Fleutiaux 'Vehicle Data is More Profitable than the Car Itself' (2018) .
122 M Channon, L McCormick and K Noussia The Law and Autonomous Vehicles (Informa Law from Routledge 2019).
© The Author(s) 2019. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.