O Brave New World, That Has Such Vehicles In’t!

(With Apologies to Shakespeare, The Tempest: Act V, Sc. i.)

By ROBERT W. PETERSON, PROFESSOR, SANTA CLARA LAW

Are autonomous vehicles as new as they seem? Not really. Keep in mind that horses were autonomous—they could find their way home with little or no help from their “drivers.” Indeed, the Gospels report that on Palm Sunday Jesus rode into Jerusalem on an autonomous vehicle—a donkey.

Not only could horses find their way home, but they could also use an overhanging branch to rid themselves of their driver. Unlike cars, they also bite and kick. The Centers for Disease Control and Prevention estimates that, hour for hour, riding a horse is more dangerous than riding a motorcycle. So, autonomous travel is not so new. Autonomous travel in cars is just better.

As the number of vehicles increased, so did the toll of death and injury from manually driven cars. In the U.S. alone, vehicles kill between 33,000 and 35,000 people annually, and in 2015 the number of deaths topped 36,000. That is well over 600 deaths a week, roughly the equivalent of five 737 jets crashing every week. It is more than twice the total number of people who have died worldwide in the recent Ebola epidemic.

The National Highway Traffic Safety Administration (NHTSA) estimates that between 93 percent and 95 percent of these accidents are caused by human error. In addition to deaths, vehicle accidents send about 2.5 million people per year to emergency rooms. NHTSA estimates the U.S. economic and social costs of vehicle accidents at $871 billion a year (not including the cost of car ownership).

Take a simple test. In your preferred media source, carefully read each account of a person killed or injured by vehicles. Then ask, would this tragedy have been avoided, or the injury mitigated, if one or more of the vehicles had been self-driving?

We tolerate this carnage because cars bring great utility and freedom. Self-driving vehicles will deliver even greater utility by freeing driving time for other things—be it texting, working, or just relaxing. Self-driving cars also deliver huge benefits to the disabled and to those lucky enough to live until they lose their licenses. At the same time, self-driving cars will remove much of the human error that contributes to the vast majority of injuries and deaths.

Self-driving cars also deliver a number of broader social utilities. These range from far more efficient use of our present land and infrastructure to more overall productive lives.


Google Car

Still dubious about the idea? Take your first step toward a test drive by visiting google.com/selfdrivingcar.

Americans have dreamed of driverless horseless carriages since the 1930s, but their advent had to await the development of cheap and convenient computing power. Let’s look at a few interesting facts.

Young people today seem far less enamored with driving than in the recent past. If they get driver’s licenses at all, many do so much later and drive fewer miles. Rather than driving to see friends, they may opt to text or call. Smartphones may replace cars as the future’s status symbol.

In addition, car ownership is a major expense. Using fleets of on-call vehicles saves not only the cost of a depreciating asset that spends 95 percent of its time idle, but also the other major cost of a car: insurance. In polls about self-driving cars, higher safety and lower insurance costs are the two factors that most strongly motivate those who say they would purchase one.

Indeed, in many respects self-driving cars are already here. You may be followed by one. Some of the most recent safety improvements will also drive the car under some circumstances. Adaptive cruise control, lane keeping, automatic braking, traffic jam assist, and parking assist are just the most recent developments in a clear trajectory toward self-driving cars. At present, however, all of them still require the driver to monitor continuously and take control in an emergency.

Of course, self-driving cars will not create utopia. There will still be some accidents, although far fewer. When accidents do occur, there will be regulatory and public relations challenges for this new technology. For example, May 7, 2016, was a day like most days. One would expect about 100 people to die in the U.S. from traffic accidents. It is likely that a person will die while you are reading this article. (No, it won’t help if you stop reading.)

At least two fatal accidents occurred in Florida on May 7. In one, a large semi-truck failed to yield the right-of-way when it made an unprotected left turn across a divided highway in front of an oncoming car, killing the car’s driver. In the other, a car flipped over, killing four people and injuring three. In Chicago, one person was killed and six people were injured when a driver ran a red light. In Pennsylvania, three people were killed by a wrong-way driver.

Now for the question: Have you read or heard about any of these accidents? I will wager that you have heard or read about only one—the first one. Why? Because the truck turned in front of a Tesla driven in semi-autonomous “Autopilot” mode. This Florida accident was covered by countless outlets, from the New York Times to the front page of the Santa Cruz Sentinel. Such is our fascination with or fear of “robot” cars.

Self-driving cars will also present some cyber challenges. Your current vehicle can be hacked through any number of attack surfaces, including the on-board diagnostic system, the radio, the antilock brakes, keyless entry, the tire pressure monitoring system, the engine control unit, the airbag control unit, the HVAC system, and the transmission control unit. As manufacturers collect data from autonomous vehicles, they will also continually improve performance with over-the-air downloads. In addition, future vehicles will begin to communicate with each other (vehicle-to-vehicle, or V2V, communication). Both over-the-air downloads and V2V messages will have to be validated to guard against hacking and malicious code. NHTSA, manufacturers, and many others are diligently working on hardening self-driving cars against cyber intrusion. Likewise, gathering information about driving will raise important privacy issues.
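As a rough illustration of the kind of validation involved (not any manufacturer’s actual mechanism), the Python sketch below checks a digital signature on an over-the-air update before accepting it. The function name install_update and the choice of an Ed25519 key are illustrative assumptions, not details drawn from any real vehicle.

    # A minimal sketch, assuming the manufacturer signs each update and the
    # vehicle stores only the corresponding public key. A tampered or spoofed
    # payload fails verification and is rejected before installation.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def install_update(payload: bytes, signature: bytes,
                       maker_public_key: Ed25519PublicKey) -> bool:
        """Accept the update only if the signature matches the payload."""
        try:
            maker_public_key.verify(signature, payload)  # raises InvalidSignature on failure
        except InvalidSignature:
            return False  # reject: possible tampering or malicious code
        # ...hand the verified payload to the vehicle's (hypothetical) installer...
        return True

The same idea applies to V2V messages: each message carries a signature that the receiving vehicle checks before acting on it.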

Doubtless, there will be some people who will never give up their cars. There will be some who live in areas difficult to serve with self-driving cars (only 67 percent of U.S. roads are even paved). Some regulators may stall because they fear criticism after an accident like the May 7 fatality. And there will be some who will argue that self-driving cars are “unsafe” because the cars threaten their business. After all, one can hardly expect repair shops, emergency rooms, and funeral parlors to argue that they deserve more business.

Are Americans ready for autonomous vehicles? Self-driving cars offer such a wealth of advantages that it makes little difference. Americans need to get ready.


The Legal Environment for Autonomous Vehicles

By DOROTHY J. GLANCY, PROFESSOR, SANTA CLARA LAW

Vehicles that can drive themselves have been the stuff of science fiction and fantasy for a long time. The future will bring driverless cars, trucks, and buses that provide mobility for non-drivers, such as persons with disabilities and the elderly, as well as safety and more efficient use of our nation’s roadways. Here in Silicon Valley, we are already sharing the road with experimental self-driving vehicles.

As my colleague Professor Robert Peterson points out in his essay on page 18, many thousands of lives, otherwise lost in vehicle crashes, are projected to be saved by taking human drivers (the cause of roughly 90 percent of vehicle accidents) out of the loop. According to Morgan Stanley, autonomous vehicles are expected to generate at least $507 billion in yearly productivity gains. Goldman Sachs estimates that the market for advanced driver assistance systems and autonomous vehicles will grow from about $3 billion in 2015 to $96 billion in 2025 and $290 billion in 2035.

Although just visible over the horizon, autonomous self-driving vehicles are not yet available for purchase by U.S. consumers. Before that can happen, the legal system will have to figure out how to respond to this new form of ground transportation. That is where three Santa Clara Law professors made a significant contribution.

The National Academies of Sciences Transportation Research Board commissioned a careful look at the legal environment for driverless vehicles. I persuaded professors Kyle Graham and Robert Peterson to collaborate with me on an extensive research project that produced “A Look at the Legal Environment for Driverless Vehicles,” Legal Research Digest 69, published in February 2016. You can read the eighty-page monograph on the web.

This ground-breaking legal research project combined the perspectives of three Santa Clara Law professors, all of whom look at driverless cars from different angles. Professor Graham brought his inimitable depth of thought about how tort law responds to risk, about the challenges of criminal law and procedure, and about how United States law gradually responded to transportation technologies that were just as new and challenging in the nineteenth century as autonomous vehicles are today. Professor Peterson is a longstanding expert in products liability and insurance law, and directs Santa Clara Law School’s Insurance Law Institute. I brought a background in transportation technologies, as well as regulatory law, privacy, security, and sustainability. The research required a remarkable synergy among three quite different legal minds to figure out the legal ramifications of an innovative mode of transportation that is not yet available in the U.S. marketplace.

Challenges to analyzing how the legal system will embrace driverless vehicles are not limited to the fact that these vehicles are not yet in commercial production. In addition, the field is so new that it does not even have standard terminology. For example, passenger cars that provide personal mobility without the intervention of a human driver may be called “autonomous,” “driverless,” or “self-driving” cars. X, the research arm of Alphabet (the company formerly known as Google), refers to its test cars as “self-driving.”


Adding to the semantic muddle, commercially available forms of vehicle automation are sometimes described as “semi-autonomous” or as offering “autonomous driving.” For example, Tesla’s Model S features a semi-autonomous driving mode (still in beta testing) called “Autopilot.” Recently, Consumer Reports criticized Tesla’s use of “Autopilot” as fostering confusion about the car’s capacity to operate itself safely when, at the same time, the human driver is supposed to remain alert and ready to take control of the vehicle in an emergency.

Initially, “autonomous vehicle” meant that there was no human driver exercising operational control of a vehicle that entirely operated itself. Gradually, “autonomous” began to be associated with various automated vehicle systems that perform particular functions for the driver, including electronic stability control and automatic braking. Even more technologically sophisticated, commercially available part-time autonomous driving systems are options that assist drivers in coping with particular driving situations, such as slow traffic (Mercedes-Benz Traffic Jam Assist) or controlled-access highways (General Motors Super Cruise). As a result, federal regulatory agencies, such as the National Highway Traffic Safety Administration (NHTSA), now refer to a range of vehicle automation rather than to autonomous vehicles. These regulatory levels of vehicle automation range from limited driver-assistance options to fully automated driverless vehicles that operate themselves at all times, without intervention by any human operator. Right now, most modern vehicles fall somewhere along that spectrum, between automated assistance to human drivers at one end and utterly driverless operation, with no human in the control loop, at the other.
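For readers who want a compact reference, the sketch below loosely paraphrases the SAE J3016 levels of driving automation commonly used to describe this range; it is a rough summary for illustration, not the standard’s formal definitions.

    # A loose paraphrase of the SAE J3016 levels of driving automation,
    # from no automation at all to fully driverless operation.
    from enum import IntEnum

    class AutomationLevel(IntEnum):
        NO_AUTOMATION = 0           # the human driver does all of the driving
        DRIVER_ASSISTANCE = 1       # a single assist function, e.g. adaptive cruise control
        PARTIAL_AUTOMATION = 2      # combined functions; the driver must monitor at all times
        CONDITIONAL_AUTOMATION = 3  # the system drives in limited conditions; the driver must take over on request
        HIGH_AUTOMATION = 4         # no human driver needed within a defined operating domain
        FULL_AUTOMATION = 5         # no human driver needed under any conditions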

The convergence of a variety of technological advances in such fields as artificial intelligence, optics and lidar, as well as advanced electronic switches, makes possible the development of motor vehicles that eventually will not need human operators. Already, test vehicles (with human safety drivers only for backup) have safely operated autonomously over millions of miles on public roads. Recent reports of crashes involving vehicles with autonomous features, such as the Tesla crash discussed by Professor Peterson in his essay, are anomalies in a general pattern of mostly safe autonomous operation of experimental vehicles. These incidents raise caution flags, but they are not likely to end the race toward fully driverless vehicles.

Nevertheless, autonomous vehicles will certainly face a number of bumps in the road ahead. Fully driverless vehicles may not be available for widespread consumer purchase for a number of years. The reasons are many. Recent consumer polls showed a large proportion of driver-aged adults expressing reluctance to ride in fully autonomous vehicles with no human operator. Automated vehicles, with a driver still in control, appeared to be slightly less scary to those polled. In addition, from a legal standpoint, autonomous vehicles confront an ambivalent regulatory framework.

At the federal level, NHTSA and the U.S. Department of Transportation remain reluctant to permit autonomous vehicles in U.S. markets until they have proved safe, not only for the vehicles’ occupants, but also for other road users. Moreover, various states set different regulatory requirements that can complicate the legal operation of autonomous vehicles nationwide. For example, a New York statute still requires that vehicles have a driver with one hand on the steering wheel. California’s vintage anti-truck-caravan law makes platoons of wirelessly connected driverless trucks problematic. These matters are expected to be resolved over time by lawyers and regulators, with the assistance of good old-fashioned American know-how.

Our study of the legal environment for driverless vehicles indicates that the U.S. legal system is mostly prepared to embrace autonomous vehicles, with a few tweaks and modifications still needed before consumer versions of self-driving vehicles will be fully street-legal. The current regulatory tendency is to go slow so as to avoid safety hazards. In the summer of 2016, Secretary of Transportation Anthony Foxx said of autonomous vehicles: “We want people who start a trip to finish it.” Still, he cautioned, “Autonomous doesn’t mean perfect.”


Several companies are developing self-driving cars, including X (a research and development arm of Google), Tesla, Mercedes-Benz, and General Motors.