Many of my friends in auto-writer world insist that Americans will never cotton to self-driving or “autonomous” cars. My friends contend people won’t ever willingly cede control of their speeding automobiles to batteries of onboard computers and intrusive sensors while directing their attention elsewhere. Because, the argument goes, folks love driving too much. And because, you know, freedom.
I’m quite fond of driving myself, as you know. But I’m not certain autonomy won’t happen. Indeed, my own informal survey of American drivers has led me to conclude with a high degree of confidence that a large percentage of the driving population isn’t paying attention when driving already. And I believe an even larger subset of American motorists can’t wait to take their eyes off the road. Permanently. Because as much as folks might like hugging the curves and shifting the toggles and cogs for themselves, self-driving cars will enable humankind to concentrate on the important things in life.
Stuff like gaming, watching TikTok videos and pornography, gambling, crypto-trading, and curating fictitious online personas. Plus other essential tasks, such as snacking, napping, drinking, drugging, and having sex. All while safely ensconced in the soft, washable, couch-like recliner of a fast-moving, futuristic party palace. Sounds aces, right?
Then, lest we forget, there’s also the bonanza of newfound occasions for shopping and being marketed to that autonomy will unleash. These exchanges are, after all, the primary drivers behind so many of today’s advances in automotive technology, designed to enable and protect motorists who simply must take their eyes off the road to buy that ugly Christmas sweater just gone on sale, or respond to some particularly mendacious tweet in real time. So many tedious hours spent watching where you’re going, wasted no longer, freed up and put to work creating vast new flood zones of actionable consumer data with which to prime the economy.
Let’s face it, friends—self-driving cars are where we’re headed. People are ready to sit back and leave the driving to somebody or something else, no matter the cost.
But don’t pass the Pouilly-Fuissé and moules frites just yet. Because, problem is, truly safe self-driving cars don’t actually exist. That this has even become a topic of debate is a tribute to a very in-the-moment combination of wishful thinking, fuzzy verbiage, incompetent media, and lax and confused regulators. Plus, not least to blame: the enormous cult following of the auto industry’s richest practitioner, most outsize personality, and farthest-out futurist, Elon Musk. More on him in a moment. But first, allow us to attempt to clear the fog.
What Is a Self-Driving Car?
In 2014, the Society of Automotive Engineers (S.A.E.)—the 117-year-old American professional society that, among other things, has tasked itself with promulgating industry standards—outlined six levels of development to be used when describing the various tiers of driver assistance it anticipated, a framework it has since updated. S.A.E.’s “Levels of Driving Automation” breaks it down, from Level 0—no automation whatsoever (and the car many still drive)—to full driving automation, Level 5, wherein you won’t need a steering wheel or brake pedals and the car will do everything for you.
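For readers who like to see the scaffolding, the whole ladder fits in a few lines of code. What follows is a minimal sketch of the S.A.E. scale as a programmer might jot it down; the names and one-line glosses are my paraphrase of the standard, not the society’s official wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six S.A.E. driving-automation levels, paraphrased."""
    NO_AUTOMATION = 0           # the driver does everything
    DRIVER_ASSISTANCE = 1       # one assist at a time: cruise control or lane-keep
    PARTIAL_AUTOMATION = 2      # steers and brakes itself; driver must stay engaged
    CONDITIONAL_AUTOMATION = 3  # drives itself under limited, defined conditions
    HIGH_AUTOMATION = 4         # no driver needed, but only inside a geofenced area
    FULL_AUTOMATION = 5         # goes anywhere, does anything; no wheel or pedals

# Anything below Level 3 still expects a human to be watching the road.
def human_must_supervise(level: SAELevel) -> bool:
    return level < SAELevel.CONDITIONAL_AUTOMATION
```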
Most motorists have by now experienced some level of automated function. Level 1 describes mild driver assistance, such as cruise control. Near ubiquitous today, it was first devised for steam engines in the 18th century and patented for cars in the early 1950s, but it was not offered in passenger cars until 1958, when it became available in the Chrysler Imperial. Cruise control helps a car maintain a given speed but requires frequent driver intervention, less so when coupled with adaptive capability such as that introduced by Mercedes-Benz with its groundbreaking Distronic system in 1998, which uses radar to maintain (or increase or reduce, as necessary) a set speed so as to keep a specified distance between the car and other vehicles on the road.
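For the technically curious, the adaptive idea boils down to a simple rule: cruise at the set speed unless the radar says the gap to the car ahead has gotten too short. The sketch below is a toy illustration of that rule; the two-second time gap and the function itself are my inventions, not Mercedes’s actual control logic.

```python
def adaptive_cruise_target(set_speed_mps: float,
                           gap_m: float,
                           lead_speed_mps: float,
                           desired_gap_s: float = 2.0) -> float:
    """Return the speed the car should aim for right now (a toy model)."""
    # The gap we'd like to hold, expressed in meters at the lead car's speed.
    safe_gap_m = desired_gap_s * max(lead_speed_mps, 1.0)
    if gap_m < safe_gap_m:
        # Too close: fall back to the lead car's speed until the gap reopens.
        return min(set_speed_mps, lead_speed_mps)
    # Road ahead is clear enough: resume the driver's set speed.
    return set_speed_mps

# Example: set to 30 m/s (about 67 m.p.h.), lead car 40 m ahead doing 25 m/s.
print(adaptive_cruise_target(30.0, 40.0, 25.0))  # prints 25.0
```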
Another example of Level 1 autonomy is “lane keep” assist—a sort of steering override that endeavors to keep drivers from inadvertently drifting out of their lanes, typically using cameras and, in some systems, lidar (pulsed laser light) to track lane markings. Operators of such systems will note that, though useful, they not infrequently fail to work in inclement weather, when sensors might be rendered inoperative, a concern that only grows at higher levels of autonomy.
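In toy form, the steering nudge is just a proportional correction back toward the lane’s center, capped so the assist never wrenches the wheel. The gains and the clamp below are illustrative guesses, not any manufacturer’s calibration.

```python
def lane_keep_correction(lateral_offset_m: float,
                         heading_error_rad: float,
                         k_offset: float = 0.1,
                         k_heading: float = 0.5,
                         max_steer_rad: float = 0.05) -> float:
    """Return a small steering correction (radians) toward the lane center.

    lateral_offset_m: distance from lane center (positive = drifted right).
    heading_error_rad: angle between the car's heading and the lane direction.
    """
    steer = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    # A Level 1 assist only nudges: clamp the correction to a gentle maximum.
    return max(-max_steer_rad, min(max_steer_rad, steer))
```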
Level 2 systems, often grouped under the heading of advanced driver-assistance systems, are increasingly popular and can be thought of as partially automated. Cars so equipped will steer and brake for themselves, relying on various types of sensors (radar, lidar, and cameras) while still requiring driver engagement, such as maintaining hand contact with the steering wheel.
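That hands-on-the-wheel requirement is, at bottom, a timer attached to a torque sensor: feel nothing for too long, warn the driver, then hand the job back. The sketch below is a crude illustration with invented thresholds and a hypothetical sensor-reading callback, not any carmaker’s actual monitor.

```python
import time
from typing import Callable

def monitor_driver_engagement(read_steering_torque_nm: Callable[[], float],
                              torque_threshold_nm: float = 0.3,
                              warn_after_s: float = 10.0,
                              disengage_after_s: float = 30.0) -> str:
    """Loop until the driver's inattention forces the system to hand back control."""
    last_touch = time.monotonic()
    while True:
        # Any light tug on the wheel counts as the driver being present.
        if abs(read_steering_torque_nm()) > torque_threshold_nm:
            last_touch = time.monotonic()
        idle = time.monotonic() - last_touch
        if idle > disengage_after_s:
            return "disengage"  # Level 2 gives the wheel back to the human
        if idle > warn_after_s:
            print("Keep your hands on the wheel!")
        time.sleep(0.1)
```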
In a move that caused consternation among many, Tesla recently chose to eliminate radar from its Model Y and Model 3 cars sold in North America (the former is the world’s best-selling electric vehicle), equipping them instead with its so-called Tesla Vision, which relies on cheaper cameras combined with neural-net processing, a type of machine learning. Tesla’s is an outlier position, and it seems the system does not cope well with snow, ice, and the sun’s glare, which can take cameras off their game if not out of it entirely; what’s more, it may not detect fixed objects as well as Tesla’s previous system.
Level 3 systems are known as conditionally automated. Such cars can make decisions for themselves—overtake slow-moving vehicles, for instance. They don’t require as much intervention from the driver as Level 2 systems but still require human input and attention.
While an increasing number of Level 3 cars are being sold, most have not yet been enabled to fully exploit this technology. For example, Mercedes-Benz’s new-for-2022 Drive Pilot is sold only in Germany and is programmed so that it can be used only on certain sections of the autobahn that have been extensively mapped out. What’s more, it is speed-limited on the legendary high-speed road to a sleepy 37.2 m.p.h., presumably intended (like Audi’s similar Traffic Jam Pilot) only for use in relatively slow-moving situations. Relying on lidar sensors, cameras, external microphones, and advanced positioning (more accurate than G.P.S.), vehicles with Drive Pilot are also fitted with redundant electrical systems to operate the steering and brakes in the event the main systems fail. Mercedes’s system, like that of most every heritage manufacturer, is distinctly more robust than Tesla’s and reflects the company’s historically more cautious approach to safety.
Level 4 will operate without any input from lowly mortals, although an operator might be able to intervene. Such systems are typically expected to be used in geographically limited—or “geofenced”—areas, where the number of hazards and unknown variables can be kept low, with speeds held to 30 m.p.h. and below. A Level 4 vehicle will not function outside of, or in fact ever leave, these areas. A prime example would be a shuttle bus operating along a specified route.
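Geofencing itself is old, unglamorous math: is the vehicle’s position inside the polygon it’s allowed to roam? Below is a bare-bones point-in-polygon test, adequate for small areas where the earth’s curvature can be ignored; the fence coordinates are placeholders, not any operator’s real service boundary.

```python
def inside_geofence(lat: float, lon: float, fence: list[tuple[float, float]]) -> bool:
    """Ray-casting test: does the point (lat, lon) fall inside the fence polygon?"""
    inside = False
    n = len(fence)
    for i in range(n):
        lat1, lon1 = fence[i]
        lat2, lon2 = fence[(i + 1) % n]
        # Count how many polygon edges a ray cast due east would cross.
        if (lat1 > lat) != (lat2 > lat):
            crossing_lon = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
            if lon < crossing_lon:
                inside = not inside
    return inside

# A vehicle outside its approved zone should refuse to operate autonomously.
fence = [(33.40, -112.10), (33.40, -111.90), (33.55, -111.90), (33.55, -112.10)]
print(inside_geofence(33.45, -112.00, fence))  # True: inside the rectangle
print(inside_geofence(33.60, -112.00, fence))  # False: outside, so no Level 4 driving
```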
There has been a lot of activity in this space, and the presence of these truly self-driven vehicles is expected to expand in coming years to include driverless cars, taxis, and small mobility pods navigating crowded cities that have been extensively mapped before the vehicles enter service.
Alphabet’s Waymo division has operated an autonomous Level 4 taxi service in Phoenix since 2020—with some of its fleet offering totally driverless rides—and began offering San Franciscans self-driven rides last August.
General Motors’ autonomous-vehicle enterprise, Cruise, recently filed an application with the National Highway Traffic Safety Administration (N.H.T.S.A.) for permission to build and, next year, commercially deploy up to 2,500 self-driving Cruise Origins, people-moving shuttles without steering wheels or brake pedals. In the meantime, Cruise (whose other investors include SoftBank, Honda, Microsoft, and Walmart) is offering the public free rides on San Francisco streets in its current fleet of self-driving electric Chevrolet Bolts, ahead of launching a commercial taxi service this year.
The ultimate dream is that these geo-restricted efforts will one day be displaced by Level 5 autonomous vehicles that can go anywhere and do anything a car can do, without any active driver input or even any controls. Substantial issues will remain, however. In addition to needing advance, up-to-the-minute mapping of millions of often-changing roads, a herculean task just getting underway, most autonomous vehicles rely on road markings and painted lines to guide themselves. But these are part of an infrastructure that is woefully undermaintained and eternally subject to degradation from time and extreme weather.
Another, entirely different sphere of concerns arises when we remember that automotive computer systems, like many others, are too easily hacked, which raises all manner of dystopian possibilities, such as literally having control of your vehicle wrested from you to enable highway robberies, rapes, kidnappings, terrorist attacks, and even unwanted government intervention: “We’re sorry, Mr. Kitman, but we didn’t care for that crack you just made about the President for Life, so you’ll be coming down to the station to have a few words.”
The Stench of Musk
Bamboozling the press and the public amid this onslaught of previously unseen technology has been Elon Musk, who for better and worse has become the official, albeit unelected, face of our transportation future. The fabulously wealthy proprietor of electric-car-maker Tesla, the world’s most valuable car company, and SpaceX, the wildly successful rocketry start-up currently tasked with putting tens of thousands of satellites into near space (in signature Musk style, with plentiful taxpayer dollars)—the self-proclaimed Imperator of Mars insists that his cars enjoy or will soon enjoy what he calls Full Self-Driving (F.S.D.) capability. However, the evidence—including more than a few dead drivers who took him at his word (itself contradicted by the legal language in Tesla owner’s manuals and the statements of company lawyers in filings with various regulatory agencies)—indicates, thus far, otherwise.
All Tesla cars come equipped with a basic version of what the company calls Autopilot, which misleadingly conjures an image of an airplane whose pilot can leave the cockpit briefly to use the bathroom without crashing his plane into the sea. While the Tesla system requires drivers to keep their hands on the wheel, its name suggests otherwise, and this fail-safe measure is often easily defeated by overzealous Tesla fans, many of whom think they are following the Way of Musk by overstepping the recommended limits of the system’s use. As it stands, Autopilot permits a car to automatically steer, accelerate, and brake within its lane. Those who pony up for F.S.D. functionality in their Tesla get something called “Navigate on Autopilot,” which will additionally steer the car along highway interchanges and make automatic lane changes around slower traffic in highway settings.
Disconcertingly, for nine years running Musk has stated that his cars will achieve true Full Self-Driving capability—the equivalent of Level 4 or 5 autonomy—within the coming year. Naturally, he has yet to deliver.
That hasn’t stopped him from charging customers a stiff premium for the function, with buyers of the feature variously charged through the years an extra $3,000 (2018), $2,000 (2019), $6,000 (2019), $7,000 (2020), $8,000 (2020), $10,000 (2020), or $12,000 (2022), though many today instead pay a monthly subscription fee of up to $199, cancelable at any time, to spread out the cost. That certainly seems a more prudent way to access a technology that Tesla concedes is “designed to become more capable over time; however the currently enabled features do not make the vehicle autonomous.” Musk has said that the investment is good value because at some future date owners will be able to rent out their cars as robotaxis. This has yet to happen, and for good reason.
Indeed, Tesla characterizes customers utilizing its F.S.D. system as “beta testers,” meaning it’s expecting owners to find bugs. And, as a Consumer Reports video makes clear, bugs are what they found. The venerable testing organization deemed Tesla’s system not worthy of being called self-driving, of limited benefit overall, costly, and potentially risky to operators of the cars and other drivers.
In May of last year, the magazine noted, Tesla was placed “under review” by the California Department of Motor Vehicles for “public statements that may violate state regulations prohibiting automakers from advertising vehicles for sale or lease as autonomous unless the vehicle meets the statutory and regulatory definition of an autonomous vehicle and the company holds a deployment permit.”
Tesla and its leader, meanwhile, are also engaged in tense negotiations with the N.H.T.S.A. over Tesla’s self-driving claims and, separately, with the S.E.C. over a host of issues, including Musk’s apparent failure to abide by the terms of a settlement of securities-fraud charges brought against him for market-moving tweets about an alleged plan for him to take the company private.
The N.H.T.S.A., which was sorely and likely not unintentionally hamstrung by the lack of a congressionally approved administrator for the entirety of Donald Trump’s term in office, today has more than two dozen active investigations into accidents involving Autopilot, with more than a dozen deaths linked to the system reported.
Tesla is not alone in having difficulties perfecting its vehicles’ autonomous capabilities. Last year, a prototype Toyota e-Palette autonomous people mover that was working the Tokyo Paralympic Games struck a visually impaired Japanese judoka, Aramitsu Kitazono, in a pedestrian crossing, causing him to withdraw from the games. The vehicle was removed from service.
“I don’t think it’s at all realistic yet that self-driving cars can travel normally on ordinary roads,” Toyota president Akio Toyoda said following the accident. “There is this pressure on car-makers to be the first to release Level 5 vehicles. But I have been saying we should not jump on such a bandwagon.”
He makes a good, un-Musk-like point. The bandwagon is waiting. But it’s not ready to go.
Jamie Kitman is a car columnist for AIR MAIL