Every car ever made is a “self-driving” car, literally speaking. As is implied by the word “automobile,” cars are machines that move under their own power. Which is why some prefer the term “autonomous” over “self-driving,” although that, too, is wrong. To be autonomous (as opposed to automated — the more precise term), a car would require no input from any outside source, whereas Google’s self-driving car, for example, relies on communication with other vehicles, with GPS satellites, and with the cloud.
Part of the reason both terms are incorrect is that each implies there is a “will,” or indeed a “self,” at work somewhere deep inside the machinery of a car. On the one hand this is nothing but a bit of linguistic shorthand: anthropomorphizing helps us to understand complicated systems. On the other hand, there are those who believe that all technological developments are inevitable.
There’s an esoteric bit of eggheadery out there called the technological imperative (sometimes referred to as the inevitability theory), which holds that once a technology becomes possible, it will inevitably exist. In other words, mankind will always be compelled to move technology forward, if only because it seems like the right thing to do, which makes it seem as though technology has a will of its own.
This explains why no matter who you discuss the subject with — industry analyst, sociologist, drunk guy at the bar — they invariably speak in terms of “when” self-driving cars arrive, rather than “if.” Even those who acknowledge the multifold roadblocks guaranteed to delay the self-driving car’s arrival admit that its coming is inevitable. Because it is.
According to a study titled “Autonomous Cars: Not if, but when,” published by IHS Auto, “It is expected that the autonomous car technology will have a long-term impact on the auto industry and is likely to have a positive impact on auto sales and autos in-use after 2035.”
Not everyone believes in technological determinism, however. As an article in the MIT Technology Review states, “Among historians and sociologists who study the interactions of technology and society, ideas about necessity and inevitability are now considered laughable.”
But advances in automotive technology do seem to follow the pattern one would expect of deterministically decreed technology. Auto companies don’t necessarily want to create complicated new technology to keep us safe, and car buyers aren’t always choosing the safest cars, yet the technology advances anyway, because it’s considered good for society.
As a recent report on automated cars published by the Eno Center for Transportation, a nonprofit, bipartisan think tank dedicated to transportation policy, put it, “Automated vehicles have the potential to fundamentally alter transportation systems by averting deadly crashes, providing critical mobility to the elderly and disabled, increasing road capacity, saving fuel, and lowering emissions.”
Who is going to say no to that?
Examined under the microscope of the inevitability theory, each major advance in the history of automotive technology can easily be seen as another step in the unstoppable evolution of the self-driving car.
The story of the automobile began all the way back around 3200 BC, when the newly conceived axle was first mated with the already-three-hundred-year-old wheel. With that one glorious act of technological copulation, one of the six simple machines was born.
Early wheeled transport was, understandably, terribly inefficient. One needed to push (the word “drive” literally means “push,” or “urge forward”) or pull one’s cart, chariot, or carriage, or else one needed to employ an animal or a slave.
The steam engine (which led to the very first automobile, the Cugnot Steam Trolley, in 1769), along with its successors the electric engine (the first automobile engine to gain popularity) and the internal-combustion engine, finally liberated vehicles from the performance limitations of horses and other beasts of burden.
The evolution of the automobile sped into overdrive with the arrival of the Industrial Revolution, and ever since then, the amount of human effort required to operate a vehicle has steadily diminished as technology has advanced.
The introduction of the electric starter in 1911 meant that one no longer needed to turn a crank to initiate the engine’s rotation. Not only did crank-starting require a good bit of strength, but it posed potential dangers; a single kickback of the engine could result in broken fingers, wrists, or arms.
In 1922, single brake pedals replaced the pedal-and-stick configurations found in most cars, and just a year later hydraulic brakes made bringing your vehicle to a dead stop no more difficult than applying gentle pressure with one foot.
Driving was getting so effortless by the mid-1920s that it became standard for car cabins to be outfitted with radios to keep their “drivers” entertained.
One of the last bits of a driver’s physical connection to the mechanical operation of the auto was removed in the late 1930s with the advent of automatic transmissions, which replaced friction clutches with fluid couplings and combined them with a hydraulically controlled planetary gearbox, eliminating the need for the driver to engage with a car’s transmission while the vehicle was in motion.
With automobile operation now mostly automated, increasing attention was paid to the comfort of riders. In 1939, Packards became the first cars to be offered with optional air conditioning, which meant that wealthy car buyers could now be spared not only the difficulties of physical labor, but the inconveniences of climate as well.
The most significant piece of automated car technology — cruise control — was first introduced on Chrysler vehicles in 1958. This took all control except for steering and, potentially, braking, out of the hands of humans. When cruise control was coupled with forward-sensing radar to create the first adaptive cruise control in 1995 (on the Mitsubishi Diamante), one of the primary operations of a fully automated car became an available option.
It’s tempting to think of these more recently developed “intrusive” features, like adaptive cruise control, traction control, and emergency brake assist, as early signs that cars are inching toward full robothood, but that’s only because those systems literally take control away from us while we are attempting to exert it. We’ve been doing next to nothing for a very long time.
Which gets back to the semantics of “driving.” The way we (meaning average, nonprofessional drivers) interact with our cars is completely analogous to how a carriage driver interacts with a horse. What we do is instruct them to move forward, and we control their velocity and direction. What they do is convert a liquid (or electricity, or both) into enough kinetic energy to get us wherever we want to go and keep us comfortable and entertained. Our primary job is to keep them from crashing.
Up until recently, human interaction with cars has been necessary because cars lacked the ability to monitor their surroundings. But a modern car outfitted with cameras, radar, sonar, and GPS can not only know what’s going on around it but also react in time to prevent or avoid an accident, something the vast majority of drivers lack the skill to do.
Although the car was conceived as a way to efficiently transport human beings, we human beings have now become the weak link in its operational chain. Cars could be more efficient, more eco-friendly, and safer if drivers were taken out of the equation.
And since they can be, the inevitability theorists claim, so they shall be, regardless of what you and I, or any living human, might want.