Liability Issues with Autonomous Vehicles: Current Uncertainty and Future Solutions
Dorian Minond
Table of Contents
Introduction ..................................................................................................................................... 4
Taxonomy of Automated Vehicle Technology............................................................................... 6
Background and Definitions........................................................................................................ 6
Level 0: No Driving Automation ................................................................................................ 8
Level 1: Driver Assistance .......................................................................................................... 9
Level 2: Partial Driving Automation......................................................................................... 11
Level 3: Conditional Driving Automation ................................................................................ 13
Level 4: High Driving Automation ........................................................................................... 14
Level 5: Full Driving Automation............................................................................................. 15
Potential Liability Implications of Automated Driving Technology ............................................ 16
Existing Auto Liability and Insurance Framework ................................................................... 17
Potential for Product Liability for Manufacturers of Automated Vehicles............................... 20
Review of Automated Vehicle Regulations in Selected Jurisdictions .......................................... 24
Nevada....................................................................................................................................... 24
California................................................................................................................................... 26
Florida ....................................................................................................................................... 28
The importance of who is deemed the “driver” or “operator” .................................................. 29
Alternative Models for Automated Vehicle Liability Regulations ............................................... 33
United Kingdom ........................................................................................................................ 33
Canada ....................................................................................................................................... 34
Federal Preemption Potential and Examples............................................................................. 35
Conclusion .................................................................................................................................... 38
Introduction
Every year, new vehicles are released with more driving automation features to lighten the
driver’s workload or provide added safety. Examples of these features include adaptive cruise
control, which automatically adjusts the vehicle’s speed to maintain a safe following distance; lane
centering, which automatically applies steering force to keep the vehicle centered in the lane; and
automatic emergency braking systems, which detect an impending collision and apply the brakes
or even steer to avoid or lessen the collision.[1] More advanced driving automation features exist as well, including some which allow the driver to take their hands off of the steering wheel and their feet off of the pedals entirely in certain driving situations.[2]
Automated driving technologies have the potential to benefit society in several ways. The
National Highway Traffic Safety Administration (NHTSA) found in a 2015 study that 94% of
crashes were attributable to human causes, including a driver’s deficient hazard recognition,
decision, or performance.[3]
As automated driving systems progressively take over more of the
driving task, human drivers and their inherent shortcomings will have fewer opportunities to cause
crashes. Automated vehicles may also provide wider access to affordable transportation to those
who are unable to operate traditional vehicles, increasing productivity and social involvement.[4]
Widespread adoption of automated vehicles may also bring efficiencies in vehicle construction
and usage that result in emissions-related environmental benefits.[5]

1. SAE INTERNATIONAL, STANDARD J3016: TAXONOMY AND DEFINITIONS FOR TERMS RELATED TO DRIVING AUTOMATION SYSTEMS FOR ON-ROAD MOTOR VEHICLES 8 (2021) [hereinafter J3016].
2. See, e.g., AUTOPILOT AND FULL SELF-DRIVING CAPABILITY, https://www.tesla.com/support/autopilot (last visited Mar. 22, 2022).
3. SANTOKH SINGH, NHTSA, DOT HS 812 115, CRITICAL REASONS FOR CRASHES INVESTIGATED IN THE NATIONAL MOTOR VEHICLE CRASH CAUSATION SURVEY 1 (2015), https://crashstats.nhtsa.dot.gov/Api/Public/Publication/812115.
4. JEREMY A. CARP, AUTONOMOUS VEHICLES: PROBLEMS AND PRINCIPLES FOR FUTURE REGULATION, 4 U. Pa. J.L. & Pub. Aff. 81, 89 (2018).
5. Id. at 91.
While all these systems currently on the US market as of the spring of 2022 certainly make
the driver’s job easier, safer, or more pleasant, none of them truly automate the driving task and
relieve the driver of his or her ultimate responsibility for the safe operation of the vehicle.[6] Thus,
in the event of a collision, the driver remains responsible for the operation of his or her vehicle,
regardless of whether any of these driver assistive technologies were active.[7] As automated driving
technologies advance, however, the question of who or what was in control of the vehicle will
become both more difficult and more important to answer. Manufacturers of automated vehicles
will benefit from clarity and consistency in the rules for liability and compensation in the event of
a collision involving an automated vehicle.
This paper will describe the various levels of driving automation technologies, as defined
by the industry group SAE International, formerly called the Society of Automotive Engineers.[8]
These levels, which range from Level 0 (no driving automation) to Level 5 (full driving
automation), dictate the allocation of responsibility for the driving task between the driver, the
automated driving system, and other vehicle systems, and have a direct impact on the allocation
of responsibility for adverse events.[9]
This paper will then examine the potential liability
implications for owners, operators, and manufacturers of automated vehicles, with an exploration
of the applicable statutes of three US states that have begun to address this issue. Finally,
automated vehicle liability schemes from foreign jurisdictions will be considered, along with
several options for federal regulation.

6. See discussion and explanation of levels of automated driving systems, infra.
7. J3016, supra note 1, at 25.
8. Id. at 2.
9. J3016, supra note 1, at 4.
Taxonomy of Automated Vehicle Technology
Background and Definitions
SAE J3016 is a comprehensive and broadly accepted industry standard which defines the
taxonomy of driving automation systems and automated vehicles.[10]
It is published by SAE
International, a major automotive industry group formerly known as the Society of Automotive
Engineers, and is intended to serve as a set of voluntary standards to provide a common framework
and common terminology for automated driving systems worldwide.[11]
The framework of the
J3016 standard has been adopted explicitly and implicitly by several state governments and by the
federal government.[12]
Though vehicle manufacturers may use a variety of terms in their marketing materials to
describe driving automation technologies, the SAE J3016 standard prescribes standardized
terminology for clarity and consistency. The preferred terms are automated vehicle, driving
automation, and automated driving system.[13]
These terms are preferred over the deprecated terms
“autonomous vehicle” or “self-driving vehicle,” as the latter terms are imprecise, potentially
misleading, and frequently misused.[14]
J3016 specifically disavows the use of the term “autonomous” to describe driving automation because “autonomous” implies a capacity for self-governance that automated vehicles lack; even the most advanced automated vehicle is governed by algorithms and user commands and is not truly self-governing.[15]

10. Id. at 1.
11. Id. at 2.
12. See, e.g., NEV. REV. STAT. ANN. § 482A.036 (West) (defining a “fully autonomous vehicle” as one “designed to function at a level of driving automation of level 4 or 5 pursuant to SAE J3016”); NHTSA, DOT HS 812 442, AUTOMATED DRIVING SYSTEMS: A VISION FOR SAFETY (AV 2.0) 1 (2016) (adopting SAE International’s terminology “to ensure consistency in taxonomy usage”).
13. J3016, supra note 1, at 6-7.
14. Id. at 34.
15. Id.
Several definitions from the SAE J3016 standard will help the reader understand the
taxonomy of automated driving systems. The entire set of “real-time operational and tactical
functions required to operate a vehicle in on-road traffic” is referred to as the dynamic driving task.[16]
The dynamic driving task is further broken down into an extensive list of subtasks, the most
relevant of which include lateral vehicle motion, referring to steering control of the vehicle;
longitudinal vehicle motion, referring to control of acceleration and deceleration; and object and
event detection and recognition.[17]
Object and event detection and recognition (OEDR) refers to
constant monitoring of the driving environment, recognizing objects and events which impact the
driving task, and executing appropriate responses to these objects and events.[18]
The tasks and subtasks defined in the previous paragraph are all, of course, tasks which a
human driver must execute when driving a car manually. As the level of driving automation
incrementally increases, responsibility for the different subtasks is transferred from the human
driver to the automated driving system.
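To make this progressive transfer of subtasks concrete, the following Python sketch summarizes one simplified reading of the allocation by level. It is illustrative only; the field names and wording are not taken from the J3016 standard itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Responsibility:
    """Who performs each driving subtask while the feature is engaged."""
    lateral: str        # sustained steering control
    longitudinal: str   # sustained speed control
    oedr: str           # object and event detection and response
    fallback: str       # response when the feature reaches its limits

# Simplified allocation by SAE J3016 level (illustrative summary, not the standard's text).
SAE_LEVELS = {
    0: Responsibility("driver", "driver", "driver", "driver"),
    1: Responsibility("driver or system", "driver or system", "driver", "driver"),
    2: Responsibility("system", "system", "driver", "driver"),
    3: Responsibility("system", "system", "system", "fallback-ready user"),
    4: Responsibility("system", "system", "system", "system (minimal risk condition)"),
    5: Responsibility("system", "system", "system", "system (minimal risk condition)"),
}

if __name__ == "__main__":
    for level, r in SAE_LEVELS.items():
        print(f"Level {level}: OEDR by {r.oedr}, fallback by {r.fallback}")
```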
A final helpful definition is the operational design domain (ODD). An automated driving
system’s ODD refers to the “operating conditions under which a given driving automation system
or feature thereof is specifically designed to function.”[19]
This may be defined as a geographic
restriction (within a certain neighborhood or campus), an environmental restriction (only
operational in dry sunny weather), a time-of-day restriction, or a restriction based on roadway
characteristics (only on limited-access highways, for example).[20]
If any of its ODD parameters are
not met, the automated driving system will fall back to either a human driver (termed a fallback-ready user) or to another automated driving system, depending on the level of driving automation involved.[21]

16. Id. at 9.
17. Id.
18. Id. at 17.
19. Id.
20. Id.
Figure 1: SAE J3016, Summary of Levels of Driving Automation
Level 0: No Driving Automation
A designation of Level 0 indicates that there is no driving automation whatsoever and the
human driver is responsible for all parts of the dynamic driving task.[22]
The presence of active
safety systems in a vehicle does not by itself elevate the vehicle out of Level 0. Active safety
systems are those systems which monitor for and intervene during a high-risk event and include automatic collision avoidance systems (which automatically apply the brakes), lane keeping systems (which automatically apply steering input to avoid roadway departure), backup collision avoidance systems (which automatically brake for cross-traffic or obstacles when reversing), and anti-lock brake systems and traction control systems (which automatically modulate brake inputs to help the driver maintain positive control).[23]

21. Id.
22. Id. at 30.
These active safety systems do not elevate the vehicle’s automation level because the
sustained performance of the driving task remains the driver’s responsibility, though these systems
may provide momentary assistance to avoid hazards.[24]
Level 0 is logically the default driving automation level, and any vehicle which does not
qualify for a higher level is categorized in Level 0. Most vehicles on US roads are Level 0. The
driver retains all responsibility for object and event detection and recognition and all responsibility
for control of vehicle motion at Level 0.[25]
Level 1: Driver Assistance
At Level 1, driving automation systems begin to have sustained control of a portion of the
dynamic driving task.[26]
The driving automation system may have sustained control of either lateral
(steering) or longitudinal (acceleration and deceleration) vehicle motion, but not both
simultaneously.[27]
Further, the driving automation system’s control over lateral or longitudinal
motion is constrained within a specific operational design domain (ODD).[28] This constraint can be based on speed, geography, environmental conditions, or other factors.

23. Id. at 6.
24. Id. at 4.
25. Id. at 25.
26. Id.
27. Id.
A concrete example of Level 1 driving automation is a vehicle driven on the highway with
adaptive cruise control engaged. Adaptive cruise control systems automatically control
longitudinal vehicle motion by applying the accelerator or brakes to maintain a minimum
following distance from a lead vehicle.[29]
In this scenario, the driving automation system has
control of longitudinal vehicle motion, but the driver remains responsible for lateral vehicle motion
by steering. Further, the adaptive cruise control system, at Level 1, is constrained in its ODD. For
example, the adaptive cruise control may be designed to operate only in a certain speed range or
with sufficiently clear environmental conditions to allow safe operation.
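A minimal sketch of how such an ODD constraint might be checked in software follows. The speed and weather conditions used here are invented for illustration and do not reflect any actual manufacturer’s design.

```python
from dataclasses import dataclass

@dataclass
class DrivingConditions:
    speed_mph: float
    clear_weather: bool
    limited_access_highway: bool

def within_acc_odd(c: DrivingConditions,
                   min_speed: float = 25.0,
                   max_speed: float = 90.0) -> bool:
    """Return True if hypothetical ACC ODD limits are satisfied.

    The thresholds are placeholders; real systems define their own ODD
    in the manufacturer's documentation.
    """
    return (min_speed <= c.speed_mph <= max_speed
            and c.clear_weather
            and c.limited_access_highway)

# Example: heavy rain at highway speed falls outside this hypothetical ODD,
# so the feature would disengage and hand control back to the driver.
print(within_acc_odd(DrivingConditions(65.0, clear_weather=False,
                                       limited_access_highway=True)))  # False
```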
While an adaptive cruise control enhances safety and reduces the driver’s workload, the
driver remains fully responsible for object and event detection and recognition at this level of
driving automation.[30]
The driver of a vehicle with Level 1 driving automation must constantly
supervise the driving automation system and be immediately ready to assume full control of the
vehicle. For example, an adaptive cruise control system is not designed to adjust the vehicles
speed automatically and sufficiently in the event of a ‘cut-in’ or ‘cut-out scenario. In a both
scenarios, a sudden and drastic speed difference is developed between the vehicle and a vehicle
in front of it which the driving automation system cannot accommodate, leading to a potential
collision.[31] Examples of ‘cut-in’ and ‘cut-out’ scenarios are shown below in Figures 2 and 3, as
warnings from vehicle owners’ manuals.

28. Id.
29. Id. at 8.
30. Id. at 25.
31. NAT’L HIGHWAY TRAFFIC SAFETY ADMIN., ODI RESUME: INVESTIGATION PE 16-007 8 (2017).
Figure 2: ACC Cut-In Scenario Warning, 2016 BMW 7-Series Owner's Manual
Figure 3: ACC Cut-Out Scenario Warning, 2016 Volvo XC-90 Owner's Manual[32]
Level 2: Partial Driving Automation
Level 2, ‘Partial Driving Automation,’ is similar to Level 1 with the additional capability
for the vehicle to control both longitudinal and lateral vehicle motion simultaneously.[33]
Thus, a vehicle equipped with Level 2 driving automation systems can control steering and
braking/acceleration at the same time. A Level 2 vehicle, like Level 1, has a limited operational
design domain. Outside of the designated ODD parameters, the driving automation systems will
disengage, and full vehicle control will revert to the human driver. Further, like Level 1, the
driver of a Level 2 vehicle remains fully responsible for supervising the driving automation
systems and for object and event detection and recognition.[34]

32. Id.
33. J3016, supra note 1, at 25.
34. Id.
A vehicle driving on the highway with both lane centering and adaptive cruise control
activated at the same time is an example of Level 2 driving automation. As noted above, the
driver is responsible for constant supervision of the vehicle’s systems and for immediately taking over the driving task if the system disengages or if required for safety.
Level 2 driving automation is currently available from several automakers in the US
market. Hyundai, Kia and Genesis offer ‘Highway Driving Assist,’ which combines adaptive
cruise control (longitudinal control) and lane centering (lateral control) and is available only on
limited-access highways.[35]
With this system engaged, drivers may momentarily remove their
hands from the steering wheel; however, the system provides escalating audio and visual
warnings to prompt the driver to place their hands on the wheel, before eventually disengaging if
the driver fails to do so.[36]
Similar systems include Ford’s ‘Blue Cruise’ and General Motors’ ‘Super Cruise’ systems.[37]
Tesla’s current (as of April 2022) Autopilot and ‘Full Self-Driving’ technology, despite
its name, is also a Level 2 system as it requires constant driver supervision and readiness to take
over the driving task.
38

35. WHAT IS HIGHWAY DRIVING ASSIST AND HOW DOES IT WORK?, https://www.jdpower.com/cars/shopping-guides/what-is-highway-driving-assist-and-how-does-it-work (last visited Mar. 20, 2022).
36. Id.
37. See FORD BLUECRUISE HANDS FREE DRIVING, https://www.ford.com/support/how-tos/ford-technology/driver-assist-features/what-is-ford-bluecruise-hands-free-driving/ (last visited Mar. 20, 2022); SUPER CRUISE: HANDS-FREE DRIVING, CUTTING EDGE TECHNOLOGY, https://www.cadillac.com/ownership/vehicle-technology/super-cruise (last visited Mar. 20, 2022).
38. See supra note 2.
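The escalating hands-on-wheel warnings described above for these Level 2 systems can be thought of as a simple state machine. The sketch below is purely illustrative; the timing thresholds are hypothetical and each manufacturer sets its own escalation schedule.

```python
import enum

class Alert(enum.Enum):
    NONE = 0
    VISUAL = 1
    VISUAL_AND_AUDIO = 2
    DISENGAGE = 3

def hands_off_response(seconds_hands_off: float) -> Alert:
    """Escalating response to a driver keeping hands off the wheel.

    Thresholds are invented for illustration only.
    """
    if seconds_hands_off < 10:
        return Alert.NONE
    if seconds_hands_off < 20:
        return Alert.VISUAL
    if seconds_hands_off < 30:
        return Alert.VISUAL_AND_AUDIO
    return Alert.DISENGAGE

print(hands_off_response(25))  # Alert.VISUAL_AND_AUDIO
```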
Level 3: Conditional Driving Automation
A major shift occurs when an automated driving system achieves Level 3 driving
automation. At Level 3, the automated driving system performs the entire dynamic driving task
while it is engaged.[39]
It may only be engaged by the driver within a defined operational design
domain, as with lower-level driving automation systems.[40]
While the Level 3 automation system
is engaged, the human sitting in the driver’s seat is relieved of all driving duties. The human in the
driver’s seat, now termed a user rather than a driver, is not required to supervise the automated
driving system, and is not required to keep alert and perform the OEDR (object and event detection
and recognition) task.[41]
The user must, however, be ready to assume control if the automated driving system issues
a timely request for user intervention. For example, if the Level 3 system senses that it is about to
exceed its operational design domain limits (by approaching the programmed destination or a
highway off-ramp, or perhaps because of worsening visibility), or if it experiences a system failure
(of a sensor or camera, for example), the user will be prompted to intervene and retake control.[42]
Thus, the user of a Level 3 automated vehicle cannot be intoxicated, asleep, underage, or otherwise
unable to assume control of the vehicle.
An example of a Level 3 automated driving system is Mercedes-Benz’s ‘Drive Pilot’
technology, which was recently approved by German authorities for a limited roll-out on
designated portions of the Autobahn.[43]
The ‘Drive Pilot’ system is designed to fully operate the vehicle under certain conditions on limited-access highways, freeing the user to take their hands off the wheel and eyes off the road until they are prompted by the system to retake control.[44]

39. J3016, supra note 1, at 28.
40. Id.
41. Id.
42. Id.
43. ANGUS MACKENZIE, Mercedes-Benz Drive Pilot First Drive: It Actually Drives Itself*, MOTORTREND (Jan. 21, 2022), https://www.motortrend.com/reviews/mercedes-benz-drive-pilot-autonomous-first-drive-review/.
As
currently configured, the ‘Drive Pilot’ Level 3 system has relatively restrictive parameters defining
its operational design domain. It is currently intended to automatically handle slowdowns or traffic
jams rather than freely cruise the Autobahn, being limited by speed (below about 37 MPH), sufficient daylight, road moisture conditions, and road design (limited-access highways).[45]
Level 3 automated vehicles are a significant step toward full automation from the driver’s
perspective, as Level 3 is the first level which truly allows the driver to occupy themselves with
other tasks while the automated driving system is engaged. Drivers are free to use electronic
devices and extend work hours to include their commute, for example, or they can watch movies
or television.
Level 4: High Driving Automation
Level 4 driving automation is defined by the automated driving system performing the
entire driving task and being capable of performing ‘fallback’ action without any expectation that
the user will intervene at any point.[46]
In Level 3 driving automation, the user is responsible for
fallback action, that is, assuming control in the event the system departs from its designed operational
design domain or suffers from a failure. In contrast, Level 4 systems are capable of handling
fallback without user intervention by automatically transitioning into a minimal risk condition.[47]

44. MERCEDES-BENZ DRIVE PILOT 2, https://group.mercedes-benz.com/innovation/case/autonomous/drive-pilot-2.html (last visited Mar. 22, 2022).
45. J3016, supra note 1, at 28.
46. Id.
47. Id.
A minimal risk condition is defined as a “stable, stopped condition” achieved “in order to
reduce the risk of a crash when a given trip cannot be continued.”[48]
Depending on the event or
failure triggering the fallback and the driving environment the automated vehicle finds itself in, a
minimal risk condition may simply be coming to a stop along the travel path, or may entail pulling
onto the shoulder or returning to a dispatch facility.[49]
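The difference between Level 3 and Level 4 fallback behavior described above can be summarized in a short decision sketch. This is a simplification for illustration, with invented return strings; it is not drawn from the J3016 text.

```python
def fallback_action(level: int, user_responds: bool) -> str:
    """Illustrative fallback behavior when an ODD exit or system failure occurs."""
    if level <= 2:
        return "driver is already supervising and retains control"
    if level == 3:
        # The system issues a timely takeover request to the fallback-ready user.
        if user_responds:
            return "user resumes the dynamic driving task"
        # Behavior if the user ignores the request varies by design.
        return "system attempts to reach a stopped condition"
    # Levels 4 and 5 handle fallback without expecting any user intervention.
    return "system transitions to a minimal risk condition (e.g., stops or pulls over)"

print(fallback_action(3, user_responds=True))
print(fallback_action(4, user_responds=False))
```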
Level 4 automated vehicles are limited in their operational design domain.[50]
The defined
operational design domain for a Level 4 automated vehicle may be limited in terms of geography,
speed, road conditions, weather, or any combination of these and other factors. For example, an
automated people mover shuttle which automatically runs a defined route on a college or corporate
campus would likely qualify as a Level 4 automated vehicle. Another example is a small,
automated delivery vehicle which navigates public roads without human supervision. Domino’s
Pizza has partnered with Nuro to test automated delivery vehicles without human drivers or
occupants in the Woodland Heights neighborhood of Houston, Texas.[51]
These vehicles are limited
in their operational design domain by geographical boundaries, time of day, and likely also by
weather and road conditions.[52]
Level 5: Full Driving Automation
Level 5, or Full Driving Automation, builds on Level 4 systems by removing any
limitations on the operational design domain.[53]
Level 5 automated driving systems perform the entire driving task and are responsible for handling fallback to a minimal risk condition, without any expectation of human intervention and without any limitation on the operational design domain.[54] There are currently no publicly available Level 4 or Level 5 automated vehicles.

48. Id. at 15.
49. Id.
50. Id. at 26.
51. DOMINO’S SELF-DRIVING DELIVERY, https://selfdrivingdelivery.dominos.com/en (last visited Apr. 11, 2022).
52. Id.
53. J3016, supra note 1, at 26.
A Level 5 automated driving system, when engaged, does not require any human
supervision in any circumstances.[55]
The automated driving system is designed to handle any
driving tasks or road conditions which “can be reasonably operated by a typically skilled human
driver.”[56]
When the system is engaged, a human in the vehicle, even if they are seated in the
traditional ‘driver’s seat,’ becomes nothing more than a passenger with no responsibility to
supervise the vehicle’s operation or to take control of the vehicle.[57]
Users of these vehicles, or
dispatchers of remotely controlled fleets of these vehicles, are responsible only for engaging the
automated driving system and indicating the destination.[58]
Potential Liability Implications of Automated Driving Technology
Despite the promising technological advancements involved in the development of fully
automated vehicles, the automated vehicle industry and our society must still plan for when things
go wrong. Though automated driving systems may be more alert than their human counterparts
and may have quicker reaction times, it would be naïve to fail to plan for collisions, injuries, and
even fatalities.

54. Id.
55. Id.
56. Id. at 32.
57. Id. at 30.
58. Id. at 29.
The question becomes, then, whether the current automobile liability ecosystem can
properly allocate liability when one or more of the vehicles involved in a collision is not driven by
a human driver. If a human is in her Level 5 automated vehicle, with the automated driving system
activated, she is merely a passenger and not required to supervise or intervene in the vehicle’s
operation. Should she still be held liable if her vehicle hits someone? If a Level 3, 4, or 5 automated
vehicle, where the automated driving system is rated to perform the entire driving task, is deemed
to be at fault in a collision, should the manufacturer be held liable? If so, under what theory of
liability?
The answers to these questions, and whether they are answered consistently from one case
to the next and from one jurisdiction to the next, will inform decisions that vehicle owners,
manufacturers, and insurers will have to make as the technology matures.
Existing Auto Liability and Insurance Framework
Following a motor vehicle accident, economic recovery for an injured party generally
follows one of two paths, depending on the jurisdiction: tort-based schemes or no-fault schemes.[59]
Because automobile liability insurance systems are governed by state law, they differ significantly
between jurisdictions.
In a conventional tort system, governed by common law and statutory law of the
jurisdiction, an injured party may recover damages if they show that the other party was negligent
and caused the injury; that is, if the other party was ‘at fault’ for the accident.[60]
In an action for negligence, the plaintiff has the burden of proving that the defendant was held to a particular duty, that he breached that duty, that the breach caused the injury, and that some specific damages resulted.[61]

59. RAND CORPORATION, The U.S. Experience with No-Fault Automobile Insurance 7 (2010).
60. Id.
If the defendant driver is insured, their insurer typically takes on the task of defending
against the claim because damages are paid out of the liability insurance policy, up to the policy
cap. Researchers have noted that some efficiency is gained in the auto tort claim system by the
often-repeating circumstances leading insurance adjusters, insurers, and attorneys to develop and
accept a ‘shorthand’ set of rules.[62]
A driver who rear-ends another is typically considered to be at
fault, for example, without requiring a full examination of all of the elements of the tort claim.[63]
Motor vehicle codes and regulations can also be used to define and evaluate the duty and
breach elements of a negligence claim more efficiently.[64]
Depending on the law of the particular
jurisdiction and the text of the statute, violation of a traffic law may prove negligence in and of
itself (“negligence per se”) or may be relevant evidence that a duty existed, and that the duty was
breached.[65]
The alternative to the conventional tort-based auto liability system is the no-fault system.
In a typical no-fault automobile insurance system, there is no need to prove negligence as a
prerequisite to recovering damages. Instead, the injured party’s own insurer pays for the insured
party’s loss, though often limited solely to economic losses.[66]
This system seeks efficiency and
lower costs by providing quicker and broader compensation to an injured party without the need
to resort to often costly and lengthy litigation.[67] Injured drivers simply file claims with their own insurance company, without regard to who is at fault in the accident.[68]

61. Restatement (Second) of Torts § 328A (Am. Law Inst. 1975).
62. RAND, supra note 59, at 8.
63. Id.
64. Id.; see Restatement (Second) of Torts § 288B (Am. Law Inst. 1975).
65. See Restatement (Second) of Torts § 288B (Am. Law Inst. 1975).
66. RAND, supra note 59, at 11.
In a no-fault insurance system, injured drivers are typically prevented from suing the at-
fault driver unless certain additional conditions are present.[69]
These conditions, or ‘thresholds,’ are
either monetary or verbal. A monetary threshold refers to the amount of monetary loss: if an
injured party suffers damages over the monetary threshold, they may sue the at-fault party despite
being in a no-fault state. Verbal thresholds refer to the severity of the injury, allowing the injured
party to sue in tort if the injury meets or exceeds some described level of seriousness, defined in
statute or case law.[70]
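A minimal sketch of how such thresholds gate access to a tort suit might look as follows; the $5,000 figure in the example is hypothetical and does not correspond to any particular state’s statute.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NoFaultThresholds:
    """Illustrative thresholds for a hypothetical no-fault jurisdiction."""
    monetary_threshold: Optional[float] = None  # e.g., 5_000.0; None if not used
    verbal_threshold_met: bool = False          # injury meets the statutory severity test

def may_sue_in_tort(economic_loss: float, t: NoFaultThresholds) -> bool:
    """Return True if the injured party may step outside the no-fault system."""
    exceeds_money = (t.monetary_threshold is not None
                     and economic_loss > t.monetary_threshold)
    return exceeds_money or t.verbal_threshold_met

# A $12,000 loss against a hypothetical $5,000 monetary threshold:
print(may_sue_in_tort(12_000, NoFaultThresholds(monetary_threshold=5_000)))  # True
```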
In some no-fault jurisdictions, the insurer may have the right of subrogation against the
tortfeasor. In these scenarios, the no-fault insurer pays the economic damages of the insured party,
and then steps into the insured party’s shoes to pursue a claim against the at-fault driver or his
insurance company.[71]
The current automobile insurance system, whether conventional tort-based or no-fault, is
built around the presumption that there is a human driver at the wheel, because this has been the
only reality since the system’s inception. Shifting responsibility for a vehicle’s operation away
from the human driver and toward an automated driving system designed and programmed by the
vehicle’s manufacturer may strain the operation of the existing auto insurance system, requiring a new approach to determining liability.[72]

67. No-Fault Auto Insurance, INSURANCE INFORMATION INSTITUTE (Nov. 6, 2018), https://www.iii.org/article/background-on-no-fault-auto-insurance.
68. Id.
69. RAND, supra note 59, at 12.
70. Id.
71. See THOMAS J. GOGER, Annotation, No-Fault: Right of Insurer to Reimbursement Out of Recovery Against Tortfeasor, 69 A.L.R.3d 830 (originally published in 1976).
Potential for Product Liability for Manufacturers of Automated Vehicles
Potential liability for manufacturers of motor vehicles in the event of a crash is not a new
concept. The field of products liability law is well developed and has been applied to, and
developed by, controversies involving motor vehicles.[73]
The law of products liability provides that manufacturers, sellers, and distributors of
defective products are “subject to liability for harm to persons or property caused by the defect.”[74]
Products may be found defective under any of several theories. A product may be defective because
it contains a manufacturing defect, because its design is defective, or because of inadequate
warnings or instructions.[75]
Manufacturing defects occur when a product “departs from its intended design.”[76]
The
approach to manufacturing defects embraced by the Restatement (Third) of Torts imposes strict
liability on manufacturers for harm caused by manufacturing defects, regardless of any showing
of the level of care or quality control the manufacturer employed.[77]
For example, if an automated
vehicle is assembled improperly or is loaded with the wrong software and an injury results, this
would likely be considered a manufacturing defect and an injured party may attempt to bring a claim against the manufacturer under a theory of strict liability.

72. But see BRYANT WALKER SMITH, Automated Driving and Product Liability, 2017 Mich. St. L. Rev. 1 (2017) (arguing that the existing product liability regime is sufficient to deal with the new realities of automated vehicles).
73. See, e.g., Jackson v. Gen. Motors Corp., 60 S.W.3d 800 (Tenn. 2001).
74. Restatement (Third) of Torts: Prod. Liab. § 1 (Am. Law Inst. 1998).
75. Restatement (Third) of Torts: Prod. Liab. § 2 (Am. Law Inst. 1998).
76. Id.
77. Id. at cmt. a.
Strict liability based on manufacturing defects is unlikely to play a significant role in
resolving controversies surrounding automated vehicle crashes. Recovery for a manufacturing
defect requires a showing that the vehicle was in fact assembled contrary to the plans or
specifications. Thus, this theory is not applicable in cases where the automated vehicle was
assembled and programmed ‘properly’ (i.e., according to the design or plan) and nonetheless
caused or contributed to an injury.[78]
The next step, then, is to ask whether the design or plan of the automated vehicle was
correct, or at least reasonable, and whether the vehicle’s manufacturer may face liability because of a faulty design or programming. Design defect claims are evaluated under either a consumer expectations test or under a risk-utility test.[79]
The consumer expectations test asks whether the design of the product is such that it is
dangerous “to an extent beyond that which would be contemplated by the ordinary consumer.”[80]
This test is limited to those products about which an ordinary consumer would have settled and
reasonable expectations. If a technology is complex, or typically hidden from the view of a typical
consumer to the extent that they do not have an articulable expectation of its proper function, then
the consumer expectations test is inapplicable.[81]
In Pruitt v. General Motors Corp., the California
Court of Appeals held that the consumer expectations test was inappropriate for evaluating the
operation of vehicle airbags, as their activation is “not part of the ‘everyday experience’ of the consuming public” and their proper operation constitutes a “complex technical issue.”[82]

78. See KEVIN FUNKHOUSER, Paving the Road Ahead: Autonomous Vehicles, Products Liability, and the Need for a New Approach, 2013 Utah L. Rev. 437, 455 (2013).
79. Id. at 456.
80. Restatement (Second) of Torts § 402A cmt. i (Am. Law Inst. 1975).
81. See Pruitt v. Gen. Motors Corp., 72 Cal. App. 4th 1480, 1484 (Ct. App. 1999) (noting that the consumer expectations test is inappropriate with complex technical items, here air bags).
If the
function of airbags is considered too complex for this test, then the operation of an automated
driving system is also likely too complex. It is unlikely that consumers will have well-formed
expectations of the proper behavior of automated vehicles under varying road and traffic
conditions, at least not until they are on the market for many years.[83]
The risk-utility test is an alternative test for evaluating design defect claims which has been
embraced by the Third Restatement.[84]
This test considers whether there exists a reasonable
alternative design which would have reduced the foreseeable risks of harm, at reasonable cost, and
whether failure to use that alternative design rendered the product not reasonably safe.[85]
Putting
this test into practice in the context of automated vehicles will likely face a similar challenge as
the consumer expectations test: the complexity of the technologies at issue. Application of this test
would require comparison of the automated vehicle to a reasonable alternative. Though the
industry is rapidly developing, there are currently few alternative designs to bring forward and
compare.
Plaintiffs seeking to use the risk-utility test to prove a design defect may also encounter an
obstacle to their claim if the manufacturer claims that the technology in question is “state of the
art”: “the safest and most advanced technology developed and in commercial use.”[86]
Commentary to the Third Restatement recognizes the difficulty of proving the feasibility of a reasonable alternative design when the technology in question is at the cutting edge.[87]

82. Id. at 1483-44.
83. FUNKHOUSER, supra note 78, at 57.
84. Restatement (Third) of Torts § 2 cmt. d (Am. Law Inst. 1998).
85. Id.
86. Id.
Both existing primary tests for product design defects, then, face significant challenges in
their application against the advanced and opaque technology of automated vehicles.
That said, the entry of more automakers into the automated vehicle market and the
technological nature of the product may alleviate some of these issues. As more manufacturers are
able to place operational automated vehicles into production, automated vehicle manufacturers
facing a design defect suit are likely to find their design or programming compared to that of other
manufacturers. If the defendant’s design is proven to be less safe than a reasonable alternative
design from another manufacturer, the plaintiff may prevail. To protect from such losses, it will
be incumbent upon automated vehicle manufacturers to keep abreast of market developments and
ensure that their designs are demonstrably as safe as or safer than competitors’ designs.
The high-tech and data-driven nature of automated vehicle technology may also alleviate
some of the opacity and evidentiary challenges raised above. Logged data from automated driving
systems’ multitude of sensors and logs of the algorithms’ decisions may prove to be valuable evidence and is likely to be parsed by both sides’ expert witnesses in future suits to prove the
design’s reliability or fallibility, depending on which side is doing the analysis. At least one state,
California, has already enacted legislation mandating the preservation of a certain amount of
logged data from automated vehicles following an incident.[88]

87. See id.
88. CAL. CODE REGS. tit. 13, § 228.02(a) (2022).
Review of Automated Vehicle Regulations in Selected Jurisdictions
This section will discuss the existing automated vehicle regulations in three US states:
Nevada, California, and Florida.
Nevada
In Nevada, automated vehicles are governed by Chapter 482A of the Nevada Revised
Statutes, which authorizes supplementary administrative rulemaking contained in Chapter 482A
of the Nevada Administrative Code.[89]
An “autonomous vehicle” is defined by Nevada statute as
one which functions at SAE J3016 Level 3, 4, or 5, with the subcategory of “fully autonomous
vehicle” encompassing only Levels 4 and 5.[90]
The automated driving systems of automated vehicles of Level 3 or higher perform the
entire dynamic driving task, including object and event detection and recognition, without
requiring a human driver to keep watch on the road. A Level 3 automated vehicle requires a human
user to be present in the vehicle and ready to take control if requested by the system; in Level 4
and 5 vehicles, the automated driving system will not ask a human user to take control.[91]
Nevada Administrative Code provides that “a person shall be deemed the operator of an
autonomous vehicle which is operated in autonomous mode when the person causes the
autonomous vehicle to engage, regardless of whether the person is physically present in the vehicle
while it is engaged.”[92]

89. NEV. ADMIN. CODE § 482A (2019); NEV. REV. STAT. § 482A (2017).
90. NEV. REV. STAT. § 482A.030 (2017); NEV. REV. STAT. § 482A.036 (2017).
91. J3016, supra note 1, at 26.
92. NEV. ADMIN. CODE § 482A.020 (2019).
However, NRS 484A.080 further defines “driver” in the autonomous vehicle context. For an “autonomous vehicle” (Level 3, 4, or 5) with the automated driving system engaged, the “driver” is the person who caused the automated driving system to engage.[93]
For a “fully autonomous vehicle” (only Levels 4 or 5), “driver” does not include the natural person who engaged the automated driving system, unless that person is also the owner of the vehicle.[94]
Thus, Nevada statutes make a distinction between Level 3 automated vehicles (where a
human operator is still required to be sitting in the driver’s seat, though he can amuse himself with
other tasks while the vehicle is driving) and Level 4 or 5 automated vehicles, where the technology
will at no point ask a human to take over the driving task. In the former case, the human who
activates the automated driving system is still considered the “driver”; in the latter, the human who
activates the automated driving system is only considered the driver if she is also the owner of the
vehicle.
Presumably, this distinction exists to prevent a passenger in a Level 4 or 5 automated ride-
sharing service (such as Uber without a human driver) from being deemed the ‘driver’ of the
vehicle they have hired. Unfortunately, no statements of legislative intent are apparent, and there
has yet to be published litigation addressing the issue or importance of who is considered the ‘driver’ or ‘operator’ of an automated vehicle in Nevada.

93. NEV. REV. STAT. § 484A.080 (2017).
94. Id.
Nevada is a mandatory insurance state. Autonomous vehicles are required to carry the same
minimum level of insurance as other motor vehicles in the state.[95]
Nevada is not a no-fault state,
and conventional tort remedies are available as recourse for motor vehicle accidents.[96]
California
The State of California regulates manufacturers’ on-road testing of automated vehicles in
addition to the deployment (consumer use) of automated vehicles.[97]
Like Nevada, California
defines an “autonomous vehicle” as a vehicle with capabilities listed in SAE J3016 Levels 3, 4, or 5.[98]
To test autonomous vehicles on public roads, manufacturers are required to get a permit,
have a test driver in the vehicle who is certified by the manufacturer, and prove their financial
ability to respond to a judgment for damages for personal injury, death, or property damage.
California’s regulations require minimum liability coverage of $5 million, which can be satisfied
by insurance, surety bond, or a certificate of self-insurance.[99]
Deployment refers to the actual operation of an autonomous vehicle on public roads by a member of the public who is not associated with the manufacturer, that is, making the vehicle commercially available outside of a testing program.[100]
The same financial responsibility requirements are
imposed on manufacturers for deployment, namely $5 million in liability coverage by insurance,
surety bond, or proof of self-insurance to the $5 million figure above.[101]

95. NEV. ADMIN. CODE § 482A.050 (2017); NEV. REV. STAT. § 485.185 (2017).
96. NEVADA CAR INSURANCE REQUIREMENTS, https://www.nolo.com/legal-encyclopedia/nevada-car-insurance-requirements.html (last visited Apr. 20, 2022).
97. See CAL. CODE REGS. tit. 13, § 227.00 et seq. (2022); CAL. CODE REGS. tit. 13, § 228.00 et seq. (2022).
98. CAL. CODE REGS. tit. 13, § 228.02 (2022).
99. CAL. CODE REGS. tit. 13, § 227.04 (2022).
100. CAL. CODE REGS. tit. 13, § 228.02 (2022).
Existing California
regulations do not explicitly indicate when the manufacturer’s insurance will be the source of a
crash victim’s recovery, though the fact that the legislature implemented these requirements for
manufacturers testing or deploying automated vehicles is evidence of the expectation that
manufacturers may provide at least some of the recovery for injured parties in the event of an
automated vehicle accident.
This financial responsibility requirement imposed on the manufacturer is in addition to the
insurance requirement each individual vehicle owner must meet.[102]
Vehicle owners may satisfy
their insurance requirement in California with a liability insurance policy, a cash deposit of
$35,000 with the DMV, a self-insurance certificate, or a surety bond for $35,000.[103]
The minimum
liability coverage required for private passenger vehicles is $15,000 for injury or death to one
person; $30,000 for injury or death to more than one person; and $5,000 for damage to property.[104]
California law defines the “operator” of an autonomous vehicle as “the person who is
seated in the driver’s seat, or, if there is no person in the driver’s seat, causes the autonomous
technology to engage.”[105]
Unlike the other states surveyed for this report, California mandates that autonomous
vehicles be capable of recording “technical information about the status and operation of the
vehicle’s autonomous technology sensors for 30 seconds prior to a collision.”[106]
California’s
regulations do not indicate what will be done with this information in the event of a collision, nor
do the regulations mandate the sharing of this information with other drivers, insurers, the state, or the manufacturer. However, the mandate to collect this information will likely make it available via discovery in the event of litigation, if it is properly preserved.

101. CAL. CODE REGS. tit. 13, § 228.04 (2022).
102. CAL. VEH. CODE § 34687 (West 2022).
103. Id.; CAL. VEH. CODE § 16056 (West 2022).
104. CAL. INS. CODE § 11580.1b (West 2022).
105. CAL. VEH. CODE § 38750(a)(4) (West 2022).
106. CAL. CODE REGS. tit. 13, § 228.02(a) (2022).
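One way to picture the recording capability the California regulation requires (technical status data for the 30 seconds preceding a collision, as quoted above) is as a rolling buffer that is frozen when a crash occurs. The sketch below is conceptual only, with an assumed 10 Hz sampling rate, and is not based on any manufacturer’s implementation.

```python
from collections import deque

class PreCrashRecorder:
    """Rolling buffer keeping roughly the last 30 seconds of status samples.

    A conceptual sketch of the kind of recording capability the regulation
    contemplates; not any vendor's actual design.
    """
    def __init__(self, seconds: int = 30, samples_per_second: int = 10):
        self._buffer = deque(maxlen=seconds * samples_per_second)

    def record(self, sample: dict) -> None:
        self._buffer.append(sample)  # oldest samples are discarded automatically

    def snapshot_on_collision(self) -> list:
        """Freeze and return the pre-collision window for preservation."""
        return list(self._buffer)

recorder = PreCrashRecorder()
for t in range(600):  # one minute of 10 Hz samples; only the last 30 s survive
    recorder.record({"t": t / 10, "autonomy_engaged": True, "speed_mph": 35.0})
print(len(recorder.snapshot_on_collision()))  # 300 samples = 30 seconds at 10 Hz
```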
California is not a no-fault state. Injured motorists retain their right to sue other parties
for negligence.[107]
Florida
Florida defines an “autonomous vehicle” as any vehicle equipped with an automated
driving system, which is further defined as a system capable of performing the entire dynamic
driving task, regardless of whether it is limited to a specific operational design domain.[108]
In the
standardized parlance of the SAE J3016 standard, then, Florida considers Level 3, 4, and 5 vehicles
to be “autonomous vehicles.”[109]
Florida provides a minimum insurance requirement for all vehicles used upon a highway.[110]
The general minimum liability coverage is $10,000 for bodily injury or death to one
person in any one crash; $20,000 for bodily injury or death to two or more persons in any one
crash; and $10,000 for property damage in any one crash.[111]
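The per-crash minimums cited above for California and Florida can be collected into a small configuration table; the sketch below uses the figures quoted in this section, and the policy in the example is hypothetical.

```python
# Liability minimums as cited in this section (per person / per crash / property).
STATE_MINIMUMS = {
    "California": {"bodily_injury_per_person": 15_000,
                   "bodily_injury_per_crash": 30_000,
                   "property_damage": 5_000},
    "Florida":    {"bodily_injury_per_person": 10_000,
                   "bodily_injury_per_crash": 20_000,
                   "property_damage": 10_000},
}

def meets_minimum(state: str, policy: dict) -> bool:
    """Check a policy's limits against the cited statutory minimums."""
    return all(policy.get(k, 0) >= v for k, v in STATE_MINIMUMS[state].items())

print(meets_minimum("Florida", {"bodily_injury_per_person": 25_000,
                                "bodily_injury_per_crash": 50_000,
                                "property_damage": 10_000}))  # True
```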
Florida imposes an additional minimum insurance requirement for fully autonomous
vehicles which are “logged on to an on-demand autonomous vehicle network or engaged in a
prearranged ride” of $1 million in primary liability coverage for death, bodily injury, or property
damage.[112]

107. RAND, supra note 59, at 55.
108. FLA. STAT. § 316.003(3) (2021); FLA. STAT. § 316.003(3)(a) (2021).
109. See supra text accompanying notes 39-58.
110. FLA. STAT. § 324.021 (2021).
111. FLA. STAT. § 324.021(7) (2021).
This regulation is likely targeted at rideshare companies or common carriers who
would come to use automated vehicles in their networks.
When it is engaged, the automated driving system is deemed to be the operator of the
vehicle for the purposes of traffic laws, regardless of whether a natural person is physically present
in the vehicle at the time.[113]
It is unclear whether Florida’s legislative declaration that the automated driving system is
the ‘operator’ will have any effect on tort liability determinations. There are no published court
opinions addressing this issue, as the deployment of truly autonomous vehicles is limited in Florida
as of March 2022. As stated above, the commercially available “self-driving” systems from Tesla
(“Full Self-Driving”), General Motors (“Super Cruise”), and Ford (“Blue Cruise”) require the
driver to always remain alert for roadway hazards and ready to resume control, making them Level
2 systems at most (and so not “autonomous vehicles” under Florida law).[114]
The importance of who is deemed the “driver” or “operator”
All three states surveyed above have enacted laws or regulations defining who is the “driver” or “operator” of an autonomous vehicle. However, for the purposes of this paper, such
definitions or distinctions are inconsequential if they do not reflect a legislative intent to allocate
liability along those lines following a crash of an automated vehicle.

112. FLA. STAT. § 627.749(2)(a) (2021).
113. FLA. STAT. § 316.85 (2021).
114. See supra notes 2, 37.
The following table summarizes each state’s definition of “driver” or “operator,” and
indicates the language used to qualify that definition.
Definition

- Nevada: “Operator” is the person who causes the autonomous vehicle to engage.[115] “Driver” of a Level 3, 4, or 5 AV is the person who caused the ADS to engage. In a Level 4 or 5 AV, that only applies if the person is also the owner of the vehicle.[116]
- California: “Operator” is the person in the driver’s seat, or if none, the person who caused the automated driving system to engage.[117]
- Florida: “Operator” is the automated driving system when it is engaged, regardless of whether a natural person is present in the vehicle.[118]

Qualifying Language

- Nevada: “Operator” is defined “for purposes of this chapter.” NAC 482A is titled “Autonomous Vehicles.”[119] “Driver” is defined in NRS 484A “as used in Chapters 484A to 484E,” which encompasses Nevada’s traffic laws.[120]
- California: “Operator” is defined “for the purposes of this division.” Division 16.6 is titled “Autonomous Vehicles.”[121]
- Florida: “Operator” is defined “for the purposes of this chapter, unless context otherwise requires.” Chapter 316 of Florida Statutes is titled “State Uniform Traffic Control.”[122]

115. NEV. ADMIN. CODE § 482A.020 (2019).
116. NEV. REV. STAT. § 484A.080 (2017).
117. CAL. VEH. CODE § 38750(a)(4) (West 2022).
118. FLA. STAT. § 316.85 (2021).
119. NEV. ADMIN. CODE § 482A.001 (2019).
120. NEV. REV. STAT. § 484A.010 (2017).
121. CAL. VEH. CODE § 38750(a) (West 2022).
122. FLA. STAT. § 316.85(3)(a) (2021).
As noted, all the definitions are qualified as being for the purposes of the chapter or division
they are found in. Unfortunately, none of these chapters directly address the issue of allocation of
liability following a collision.
However, a definition of the autonomous vehicle’s driver or operator in the context of
traffic control laws, such as Nevada’s definition, may form the basis for a legal argument and an
extension of that definition to issues of liability. Consider a scenario in which a person enters a
Level 4 or 5 automated vehicle in Nevada and activates the automated driving system. That
individual is no longer required to monitor the vehicle or the roadway and will not be asked to
retake control.[123]
Under Nevada law, if that person is not also the vehicle’s owner, they are not
deemed the “driver.” If the automated vehicle runs a red light, the person sitting in the vehicle
would not be liable for the traffic ticket. It would seem contradictory, then, if the person were
nonetheless held liable for a collision if their vehicle struck another vehicle running that same red
light. If the legislature saw fit to absolve the occupant of a Level 4 or 5 vehicle of liability for
traffic infractions, why shouldn’t that extend to liability for traffic collisions?
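The Nevada rule discussed above (and summarized in the table) can be expressed as a small decision function. The sketch below is a simplified, non-authoritative reading of NRS 484A.080 offered only for illustration; it ignores edge cases and is not legal advice.

```python
from typing import Optional

def nevada_driver(level: int, engager_is_owner: bool) -> Optional[str]:
    """Who is the statutory 'driver' under the Nevada definitions summarized above?"""
    if level == 3:
        return "person who caused the automated driving system to engage"
    if level in (4, 5):
        return "owner (only if the owner engaged the system)" if engager_is_owner else None
    return "human driver"  # Levels 0-2: conventional rules apply

# The red-light hypothetical above: a non-owner rider in a Level 5 vehicle.
print(nevada_driver(level=5, engager_is_owner=False))  # None -> no statutory 'driver'
```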
Unfortunately, due to the nascent state of the automated vehicle industry, no binding or
precedential on-point cases have been located in any of the surveyed states or any other
jurisdiction. Nor have any of the surveyed states directly addressed the issue of liability in their
laws, regulations, or legislative statements.
Though not precedential, one settled 2018 California case can shed light on one
automaker’s response to an automated vehicle negligence action. In Nilsson v. Gen. Motors LLC, a motorcyclist brought suit against the manufacturer of a Chevy Bolt which was alleged to have
been operating in “self-driving” mode where the driver had his hands off of the wheel.[124]

123. This assumption is drawn from the definition of Level 4 and 5 vehicles. See supra notes 46, 55.
Though
the driver allegedly commanded the vehicle to change lanes, the vehicle veered back into its
original lane, striking the plaintiff.[125]
The plaintiff’s cause of action was negligence based on the defendant’s vehicle not following the traffic rules and regulations, not product liability.[126]
Surprisingly, in their answer, General Motors admitted that the vehicle was required to use
reasonable care in driving.[127]
This case was settled by the parties in the pleading phase, and so the
case carries no precedential value, nor are we able to see the parties’ proposed legal arguments in
the record or in their briefs. It would certainly be an overreach to use this single undeveloped case
to state that all automakers accept that their vehicles bear the burden of liability. However, it is
nonetheless interesting to see that an automaker so easily accepted that their vehicle had a duty to
follow the rules of the road, even where the vehicle in question could not have been higher than a
Level 2 automated vehicle.
Ultimately, until either legislatures or courts directly address the issue of allocation of
liability between an automated vehicle’s owner, occupant, operator, or manufacturer, there will
remain a large amount of uncertainty and risk. The following section will explore approaches used
or considered by other jurisdictions, as potential models for US jurisdictions to follow.

124. Nilsson v. Gen. Motors LLC, No. 18-471 (N.D. Cal. Jan. 22, 2018), ECF No. 1.
125. Id.
126. Id.
127. Nilsson, ECF No. 18, ¶ 15.
Alternative Models for Automated Vehicle Liability Regulations
United Kingdom
The United Kingdom’s Automated and Electric Vehicles Act of 2018 defines the liability
scheme for insurers of automated vehicles.[128]
This legislation explicitly addresses a situation where
an automated vehicle is involved in an accident while “driving itself,” defined in the statute as
“operating in a mode in which it is not being controlled, and does not need to be monitored, by an
individual.” This definition corresponds to Level 3, 4, or 5 automated vehicles by SAE J3016
definitions.[129]
Under this legislation, a person injured by an automated vehicle “driving itself” recovers directly from the insurer covering that vehicle.[130]
If that vehicle is not insured (which would be
contrary to law), then the owner of the vehicle is liable for the damages.[131]
The intent of this policy
is to compensate injured parties quickly and fairly and avoid the need for injured parties to sue
automated vehicle manufacturers through the courts for compensation.[132]
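The compensation path the Act creates, as described above, can be sketched as a two-step flow: the injured party is paid first, and questions of ultimate fault are resolved afterward. The following is an illustrative simplification, not the statutory text.

```python
def uk_aev_recovery_path(vehicle_insured: bool) -> list:
    """Sketch of the recovery flow under the UK Automated and Electric Vehicles
    Act 2018 as summarized above (simplified and non-authoritative)."""
    first_payer = "vehicle's insurer" if vehicle_insured else "vehicle owner"
    return [
        f"injured party recovers directly from the {first_payer}",
        f"{first_payer} may then claim against any other person liable "
        "(e.g., the manufacturer, if a defect caused the accident)",
    ]

for step in uk_aev_recovery_path(vehicle_insured=True):
    print("-", step)
```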
The insurer or vehicle owner (whoever paid the injured party) is then able to bring a claim
against “any other person liable to the injured party in respect of the accident.”[133]
If the accident
was caused by a defect in the vehicle or its programming, for example, the insurer can pursue a
claim against the vehicle manufacturer. In this way, this statutory scheme functions like the no-
fault insurance schemes described in previous sections, prioritizing compensation of injured
parties without a finding of fault and contemplating a future claim for reimbursement of the insurer from the liable party.[134]

128. Automated and Electric Vehicles Act 2018, c. 18 (Eng.), https://www.legislation.gov.uk/ukpga/2018/18/contents.
129. See supra text accompanying notes 39-58.
130. Automated and Electric Vehicles Act 2018, supra note 128, at § 2(1).
131. Id. at § 2(2).
132. Automated and Electric Vehicles Act 2018, c. 18, Explanatory Notes ¶ 11 (Eng.).
133. Automated and Electric Vehicles Act 2018, supra note 128, at § 5.
Canada
Canada has not enacted nationwide legislation addressing liability and automated vehicles,
but the non-governmental industry group Insurance Bureau of Canada (IBC) released a report in
2018 with recommendations for legislation.[135]
This report recognizes that existing auto liability
policies presuppose human error as the cause of motor vehicle crashes, and that this assumption
will not hold true as automated vehicles take to the streets unsupervised by human drivers.[136]
The
report recognizes that a liability claim stemming from an automated vehicle accident would likely
involve products liability claims which are more complex than typical motor vehicle collision
claims, delaying compensation to injured parties.[137]
The report also posits that it would be
especially difficult to determine fault in collisions occurring during the transitional period where
conventional vehicles and automated vehicles share the roads.[138]
The IBC Report recommends establishing a single insurance policy which would
simultaneously cover both driver negligence and the automated driving technology.[139]
This policy
would compensate injured parties quickly, with the insurer able to recover its costs from the vehicle
manufacturer if the manufacturer is later shown to be at fault.[140]
The IBC notes that this policy
solution, compared with a strict no-fault approach, can co-exist with the “mixed no-fault and tort policies that are common in Canada.”[141]

134. See supra text accompanying notes 66-72.
135. INSURANCE BUREAU OF CANADA, Auto Insurance for Automated Vehicles: Preparing for the Future of Mobility (2018) [hereinafter IBC Report].
136. Id. at 3.
137. Id. at 8.
138. Id.
139. Id. at 3.
140. Id. at 9.
The IBC further recommends a data-sharing agreement between vehicle manufacturers,
owners, and insurers to help determine the cause of a collision and help efficiently resolve disputes
and claims.
142
This data-sharing scheme echoes the data collection mandated by California statute,
though the California statute does not mandate the sharing of that information.
143
Federal Preemption Potential and Examples
Current automobile liability insurance frameworks in the United States are a patchwork system, with each state setting its own regulations. Federal preemption, however, is a distinct possibility and represents an opportunity to harmonize and simplify the autonomous vehicle liability system. A uniform nationwide system would likely be preferable to vehicle manufacturers, who would otherwise be exposed to state-by-state differences in liability schemes and products liability case law.

Federal preemption of vehicle safety issues is not without precedent. The National Highway Traffic Safety Administration (NHTSA) acts under the authority of the National Traffic and Motor Vehicle Safety Act to create and promulgate the mandatory Federal Motor Vehicle Safety Standards (FMVSS).144 These regulations focus on vehicle safety features, and NHTSA has already directed significant effort toward developing FMVSSs specific to Level 4 and 5 autonomous vehicles.145 FMVSSs typically address vehicle safety issues and design, not the liability insurance surrounding motor vehicles.

141 Id. at 10.
142 Id. at 12.
143 See supra note 104.
144 National Traffic and Motor Vehicle Safety Act, 49 U.S.C. § 30101 et seq. (West 1994).
However, there is precedent for vehicle manufacturers using federal vehicle regulations as a defense against products liability claims, with the federal standard preempting a conflicting state-law requirement.146 In Geier v. American Honda Motor Co., the Supreme Court held that a state common-law design-defect claim, which would have required the manufacturer to install an airbag, conflicted with the applicable FMVSS and was therefore preempted, defeating the products liability suit.147 The Court held that even though the legislation enabling the FMVSSs did not expressly preempt such state-law claims, ordinary conflict preemption principles still applied.148 It seems plausible for Congress to empower the Department of Transportation to enact similar regulations that could provide structure and guidance for the allocation of liability in autonomous vehicle crashes. This federal solution, preempting disparate state approaches, would benefit manufacturers and insurers by providing uniformity, certainty, and stability.
Other federal legislation can provide examples of preemption in the liability arena. One extreme approach would be to flatly limit the liability of automated vehicle manufacturers in order to protect the nascent industry. This liability-limiting approach was employed to protect the civilian nuclear industry in the 1950s through the Price-Anderson Nuclear Industries Indemnity Act.149 Under this scheme, still in existence in a modified form, licensed nuclear operators are required to obtain primary commercial insurance coverage, which is pooled across the industry to reimburse injured parties in the event of a nuclear incident.150 The total liability of the licensees is capped by statute, and any damages above that statutory cap would be covered by the federal government.151

145 See NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION, FMVSS Considerations for Vehicles With Automated Driving Systems: Volume 1, Report No. DOT HS 812 796 (Apr. 2020).
146 Geier v. American Honda Motor Co., 529 U.S. 861 (2000).
147 Id. at 874-77.
148 Id. at 882.
149 Price-Anderson Nuclear Industries Indemnity Act, 42 U.S.C. § 2210 et seq. (1957).
Hypothetical federal legislation could establish a similar system for automated vehicle manufacturers, whereby each manufacturer would be responsible for maintaining insurance coverage proportional to its share of automated vehicles on the road, and claims for damages caused by automated vehicles would be paid from the industry insurance pool. This system would create incentives for the industry to self-police and maximize safety, but it could equally incentivize manufacturers to free-ride, knowing that liability for their faulty designs is spread over the industry as a whole.
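The arithmetic of such a pooled, capped scheme can be sketched briefly in Python. All figures, the cap, and the allocation rule below are hypothetical and chosen only to illustrate the structure just described (contributions proportional to fleet share, claims paid from the industry pool, and damages above the statutory cap falling to the federal government); they do not reflect actual Price-Anderson amounts or any proposed legislation.

# Hypothetical pooled-liability sketch for automated vehicle manufacturers.
fleet = {"Maker A": 600_000, "Maker B": 300_000, "Maker C": 100_000}  # AVs on the road (invented numbers)
pool_size = 10_000_000_000.0  # total coverage the industry must maintain (invented)
statutory_cap = pool_size     # simplification: industry liability capped at the pool amount

total_vehicles = sum(fleet.values())
contributions = {maker: pool_size * count / total_vehicles for maker, count in fleet.items()}
print(contributions)  # Maker A: 6.0e9, Maker B: 3.0e9, Maker C: 1.0e9

def pay_claim(damages: float) -> dict:
    # Split a claim between the industry pool and the federal backstop above the cap.
    from_pool = min(damages, statutory_cap)
    from_government = max(0.0, damages - statutory_cap)
    return {"industry pool": from_pool, "federal government": from_government}

print(pay_claim(12_000_000_000.0))
# {'industry pool': 10000000000.0, 'federal government': 2000000000.0}

The free-rider concern noted above is visible in this structure: a claim traceable to one manufacturer's defect is paid from a pool funded in proportion to fleet share rather than in proportion to fault, unless the scheme adds a reimbursement step.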
The National Childhood Vaccine Injury Act is another potential model for federal preemption of the liability ecosystem.152 This law limits the financial liability of vaccine manufacturers for claims of injury, with the goal of encouraging manufacturers to remain in the market and provide a steady supply of vaccines.153 Like vaccines, automated vehicles have the potential to have an overall positive effect on public health.154 Federal legislation protecting automated vehicle manufacturers from liability and providing for efficient arbitration procedures would remove uncertainty regarding manufacturer liability and encourage manufacturer participation in the marketplace.

150 Id.
151 Id.
152 National Childhood Vaccine Injury Act, 42 U.S.C. §§ 300aa-1 to 300aa-34 (1986).
153 See id.
154 See supra note 3.
Conclusion
Automated vehicles are rapidly being developed by several companies competing for
future market share. The regulation of vehicle safety and auto liability in the United States is a
patchwork system of state laws and federal vehicle safety regulations. Several states have enacted
legislation specifically addressing automated vehicles, though no state laws have been identified
which directly address the apportionment of liability in the event of a collision involving an
automated vehicle operating without human input. The federal government has begun to develop
and institute safety regulations for the design and operation of automated vehicles, but likewise
has not regulated the topic of liability.
The absence of clear guidance regarding liability may cause manufacturers to be reluctant
to be the first to enter the Level 3, 4, or 5 automated vehicle market. In the absence of legislation
indicating otherwise, it is likely that products liability law will be applied to tort claims resulting
from automated vehicle accidents. Existing products liability law is likely to be problematic and
inconsistent when applied to the complex technology of automated vehicles.
Alternatives exist, including the UK’s Automated and Electric Vehicles Act 2018, which
makes clear that the automated vehicle’s insurer will pay any injured party and may later pursue a
claim against the manufacturer or any other responsible party. The Insurance Bureau of Canada recommends mandating a single insurance policy for automated vehicles which equally covers driver negligence and the automated vehicle technology. The federal government may also preempt state law regarding automated vehicle liability, as it has done in the past for liability issues in other industries.
The widespread adoption and use of automated vehicles is likely to have a significant
positive impact on society. Automated vehicles hold the promise of lower emissions, more efficient land use, and fewer traffic injuries and fatalities. Clear and consistent regulations on the
issue of liability will likely encourage the development of the industry and hasten these positive
societal impacts.