Blaming the autonomous vehicle computer as a regulatory strategy

A technique the AV industry is using when pursuing state regulation is to blame the computer for any crashes by declaring that the Automated Driving System (the computer) is the driver of any AV operating on public roads. That way there is no person at fault for any harm to road users. Yes, really, that is what is going on.

The general AV industry tactic when lobbying for such rules is to argue that when fully automated driving is engaged, the “driver” is the driving computer (the ADS). Any remote safety supervisor is just there to lend a hand. In some states a remote human support team member need not even hold an appropriate driver license, because the ADS is said to be the driver. Superficially this seems to make sense. After all, if you are a passenger who has paid for a retail robotaxi ride and the AV breaks a traffic law due to some flaw in its design, you as the passenger should not be the one to receive a ticket or go to jail.

But the tricky bit is that ADS computers are not
afforded the legal status of being a “person” – nor should they be.
Corporations are held to be fictitious people in some legal circumstances, but a
piece of equipment itself is not even a fictitious person.

If a software defect or improper machine learning training procedures result in AV behavior that would count as criminally reckless driving by a human driver, what happens with an AV? Perhaps nothing. If the ADS is the “driver,” then there is nobody to put on trial or throw into jail. If you take away the ADS’s driver’s license, does it get its license back with the next software update? Where are the repercussions for an ADS being a bad actor? Where are the consequences?

Blaming the ADS computer for a bad outcome removes much of the deterrent effect of negative consequences, because the ADS does not fear being harmed, destroyed, locked up in jail, fined, or having its driver’s license revoked. It does not feel anything at all.

A related tactic is to blame the “operator” or “owner” for any crash. In the early days of AV technology these roles tended to be filled by either the technology developer or a support contractor, but that will change over time. Contractors perform testing operations for AV developers. Individual vehicle owners serve as operators for some AV technology road tests. Other AV operators might work through a transportation network service. Someone might buy an AV in the manner of a rental condo and let it run as a robotaxi while they sleep.

Imagine an arrangement in which an investor buys a share in a group of robotaxis, much as might be done for a timeshare condo. A coordinator lines up independent contractors to manage investment money, negotiate vehicle purchases, arrange maintenance contracts, and participate in a ride-hailing network. Each AV is the sole asset of a series LLC that acts as a liability firewall between vehicles. The initial investor later sells their partial ownership shares to an investment bank, which puts those shares into a basket of AV ownership shares. Various municipal retirement funds buy shares of the basket. At this point, who owns the AV has gotten pretty complicated, and there is no substantive accountability link between the AV “owner” and its operation beyond the value of the shares.

Then a change to the underlying vehicle (which was not originally sold as an AV platform, but rather was adapted by an upfitter contractor) impairs the functionality of the aftermarket add-on ADS, which was manufactured by a company that is no longer in business. If there is a crash, who is the “operator”? Who is the “owner”? Who should pay compensation for any harm done by the AV? If the resultant ADS behavior qualifies as criminally negligent reckless driving, who should go to jail? If the answer is that nobody goes to jail and only the state minimum insurance of, say, $25K pays out, what incentive is there to ensure that such an arrangement is acceptably safe, so long as the insurance is affordable compared to the profits being made?

While the usual reply to concerns about accountability is that insurance will take care of things, recall that we have taken some passes at discussing how insurance and risk management can be an insufficient incentive to ensure acceptable safety, especially when coverage only meets a low state minimum insurance requirement originally set for human drivers who have skin in the game for any crash.