I Never Volunteered

David Potenziani
Aug 11, 2023
Silicon or Carbon Brains?

Many of us have been invited to participate in a clinical trial. Our online patient portals usually have a tab or button where we can learn more and volunteer to participate.

Let’s say you suffer from toenail fungus. You hear of a new trial that will offer a cure. But there is a very small, but non-zero, chance you might die during the trial. Your death would not be predictable, nor would there be any forewarning. You just become dead.

Would you participate? My guess is that almost everyone, even those suffering from toenail fungus, would give a pass to that offer.

So, why are we all participating in a gigantic trial to develop and test self-driving cars? Even though some of us have already been killed.

That’s the question that autonomous vehicle (AV) companies are asking us by not posing it. If you are asked to participate in a medical clinical trial, you can always say no. If you do join one, you can leave at any time for any reason — or for no reason at all. Your participation is up to you.

Today, all people — not just drivers — are participants in the experiment to develop and perfect AVs. Every driver, every pedestrian, and even those off the streets are involved in the experiment. While AVs are supposed to be regulated by the states and the National Highway Traffic Safety Administration (NHTSA), that oversight applies only to the companies and their vehicles. Fifteen states have no regulations about autonomous vehicles.

AVs have struck pedestrians. They have rammed into jack-knifed semi-trailers. They have wandered into medians for no discernible reason. They have snarled traffic at fires by just stopping (and ignoring the directions of emergency personnel) because they cannot resolve uncertainty. They have mysteriously slammed on the brakes, causing rear-end collisions. They have run stop signs. As one researcher put it, “probabilistic estimates do not approximate judgment under uncertainty,” the sort of judgment that experience allows a human driver to develop.

The Moral Dilemma

In a clinical trial, any adverse response or reaction must be documented and reported to the FDA. That data is analyzed and often ends up in a public disclosure statement. (That’s the fine, usually microscopic, text included with the packaged product.) Companies developing autonomous vehicles have no such requirement. Granted, a traffic crash, particularly one that involves injury or a fatality, gets reported. But near misses and traffic snarls usually go officially unrecorded. There are no requirements or guidelines to report these kinds of incidents to official agencies.

Autonomous vehicles fly in the face of over a century of driving experience. When they fail, they fail in novel ways that human beings typically would not. We have a road system in the US that has been engineered with human responses and reactions in mind. It has literal guard rails to keep a car from spinning off into someone’s yard or into a river. We have rules we obey when confronted by the unexpected, such as following the directions of emergency personnel.

AVs have a lot of data but are not sentient. They can predict the next word in a sentence to sound like a human being. But they do not understand that word. You might think that the previous sentence is a digression, but it’s exactly the challenge. AVs are driven by artificial intelligence software that relies on data (granted, enormous amounts of it), but data that reflects a particular place and time. When they encounter situations outside their training data, the best case is that they shut down and stop the car. That sounds okay, but what if they are on a busy street or highway? The problem is that “what these systems lack is judgment in the face of uncertainty, a key precursor to real knowledge.”

Autonomous vehicle safety is more than just the danger of being struck by one of these vehicles. It also involves the traffic snarls that can result when a car does something unusual or unexpected and everybody has to stop. This can impede the progress of emergency vehicles getting to their destinations to help people, put out fires, etc.

We lack the equivalent of a clinical laboratory to conduct research into the safety and efficacy of these devices. There are no equivalents to clinical trials for AVs where their safety and effectiveness can be tested under controlled conditions. Not all of the data collected has to be reported to any official safety agency. Finally, there is no way for the rest of us to opt out of these experiments.

The autonomous vehicle industry places the onus of safety and protection on everybody who is not them. When an AV stops unexpectedly and snarls traffic, that is a problem for the crossing guard, the traffic cop, or the emergency worker. It is not the problem of the company that designed, deployed, and supports the vehicle.

I’m not against autonomous vehicles, but I would like my objections noted and taken into account. We were not invited to participate in this experiment. We were opted in by the very fact of our residence in this country. We never volunteered.


David Potenziani

Historian, informatician, novelist, and grandfather. Part-time curmudgeon.