In their first year, law students are introduced to a revered mythical beast: the Common Law. Calling the American legal system’s devotion to the Common Law “religious” would be an understatement: people violate religious tenets all the time and then ask for forgiveness later, but no jurist deviates from the Common Law without explaining why, and typically they don’t deviate at all unless the state legislature or the federal Congress has told them to.
Why the heck were the West Memphis Three allowed to plead guilty and not guilty? Because the Common Law permitted nolo contendere, and not even the Supreme Court in Alford felt comfortable overruling that.
Justice Oliver Wendell Holmes’ most famous quote? “The life of the law has not been logic: it has been experience.” It’s from the introduction to his crowning achievement, his book The Common Law.
It’s that important.
So when I saw Alan Crede and Frances Zacher discussing liability arising from driverless cars (e.g., the Google Car, which was recently in a crash; much of this discussion arises from Tyler Cowen’s op-ed in The New York Times, and American Scientist discusses many of the benefits), I wondered: what would the Common Law hold?
No need to reinvent the wheel here, so to speak. The accident occurred in California, so we’ll look to its laws. If we view the driverless car as analogous to a wild animal, then:
[D]efendant, and its employees, knew ‘that the leopard was a wild, untamed animal, of fierce, dangerous, vicious, ferocious character, nature, and disposition.’ By this admission, defendant charged itself with the duty toward the plaintiff, as well as all other persons, to guard the leopard in such manner as to absolutely prevent the occurrence of an injury to others through such vicious acts of the animal as it would naturally be inclined to commit. The liability of the owner is absolute, in such cases, and he is bound to keep the animal secure, or he must suffer the penalty for his failure to do so, in making compensation for the mischief done, unless it can be shown that the person injured voluntarily, or consciously, did something to bring about the injury. The gist of the action is not the manner of keeping the vicious animal, but the keeping him at all with knowledge of the vicious propensities. In such instances the owner is an insurer against the acts of the animal, to one who is injured without fault, and the question of the owner’s negligence is not in the case.
Opelt v. Al G. Barnes Co., 41 Cal. App. 776, 779, 183 P. 241, 242 (1919)(citations omitted).
If, instead, the car is more like a domestic animal, then:
[T]he owner of an animal, not naturally vicious, is not liable for an injury done by it, unless two propositions are established: 1. That the animal in fact was vicious, and 2. That the owner knew it. Thus, if an animal theretofore of peaceable disposition, while in charge of the master or of a servant, suddenly and unexpectedly, either through fear or rage, inflicts injury, neither is responsible, if at the time he was in the exercise of due care. But, conversely, the owner of such an animal knowing its vicious propensities is liable for injury inflicted by it upon property or upon the person of one who is free from fault.’ And as is said in Barrett v. Metropolitan Contracting Co., ‘It is unquestionably true, as declared in Haneman v. Western Meat Co., that it is of the essence of the plaintiff’s case that he is ignorant of the viciousness of the animal until the injury has occurred.’ In order to recover, the plaintiff must himself be ignorant of any vicious trait on the part of the animal.
Mann v. Stanley, 141 Cal. App. 2d 438, 441, 296 P.2d 921, 923 (1956)(citations omitted).
What we’re really talking about, then, is a simple choice between strict liability and negligence. If the driverless car is like a wild animal known to be dangerous, then it’s strict liability; if the driverless car is like a domesticated animal presumed to be safe, then the plaintiff must prove negligence by the owner, operator, or manufacturer of the car.
I’m a strong believer in ensuring that corporations are held legally responsible for the damage they cause, but “strict liability” isn’t necessary to do that here. We can go right back to the case that first adopted the “crashworthiness doctrine,” Larsen v. General Motors Corporation, 391 F.2d 495 (8th Cir. 1968) — although the case itself never uses the term “crashworthiness” — and find an answer to our questions about driverless car liability:
The intended use and purpose of an automobile is to travel on the streets and highways, which travel more often than not is in close proximity to other vehicles and at speeds that carry the possibility, probability, and potential of injury-producing impacts. The realities of the intended and actual use are well known to the manufacturer and to the public and these realities should be squarely faced by the manufacturer and the courts. We perceive of no sound reason, either in logic or experience, nor any command in precedent, why the manufacturer should not be held to a reasonable duty of care in the design of its vehicle consonant with the state of the art to minimize the effect of accidents. The manufacturers are not insurers but should be held to a standard of reasonable care in design to provide a reasonably safe vehicle in which to travel. Our streets and highways are increasingly hazardous for the intended normal use of travel and transportation. While advances in highway engineering and non-access, dual highways have considerably increased the safety factor on a miles traveled ratio to accidents, the constant increasing number of vehicles gives impetus to the need of designing and constructing a vehicle that is reasonably safe for the purpose of such travel. At least, the unreasonable risk should be eliminated and reasonable steps in design taken to minimize the injury-producing effect of impacts.
I similarly “perceive of no sound reason, either in logic or experience, nor any command in precedent,” why the liability for driverless cars should be any different. Was the software programmed and debugged negligently? Did the company use unreliable hard drives? These are the same types of factual questions that arise in our typical seat belt failure, tire defect and van rollover crashworthiness cases.
In short, we have ample precedent and Common Law to absorb driverless car cases into the legal system; if it ain’t broke, don’t fix it.