“This was meant to be the year we expected a lot of unsupervised autonomous driving to take place, and it isn’t,” says James Hodgson, principal analyst for smart mobility and automotive at ABI Research in London, UK. Billions of dollars of investment in sensors, processors, networking, and software technology for self-driving cars are simply “not being monetized effectively at the moment,” he declares, noting that cars now offer only “longitudinal and lateral assistance” at what is known as Level 2+, a reference to the levels of driving automation defined in SAE International’s J3016 standard.
“The car will do a lot on your behalf, far beyond what we think of as Level 2 today,” he says, including changing lanes without driver steering input, navigating itself through a highway exit, and negotiating urban scenarios by, for example, moving unaided around stationary vehicles or automatically braking for pedestrians who approach the vehicle. “But the driver will still be responsible,” he adds. And while this saves automakers from liability and lets them wait for regulators and legislators to act before bringing fully self-driving cars to market, “critically it means they can’t start deploying some of those technologies that they’ve invested an awful lot in,” he says.
“When you look at the definitions, what really separates Level 2 from Level 5 is supervision: how involved is the driver in the process, and how responsible for what’s actually taking place,” Hodgson explains. To be sure, some Level 4 and Level 5 vehicles exist today and are used in test fleets of robotaxis operating in the U.S. and elsewhere, with backup drivers on board. Next will come small-scale commercial rollouts of robotaxis without backup drivers, Hodgson foresees. Fully self-driving, Level 5 cars for private use won’t emerge before 2026, and then only as expensive options on high-end vehicles like a Mercedes S-Class, he predicts.
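For readers keeping the levels straight, the sketch below condenses the J3016 distinctions Hodgson is drawing into a simplified paraphrase in Python; it illustrates the supervision cut-off he describes and is not the standard’s normative text.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Simplified paraphrase of the SAE J3016 driving-automation levels."""
    L0_NO_AUTOMATION = 0   # human does all of the driving
    L1_ASSISTANCE = 1      # steering OR speed support; human supervises
    L2_PARTIAL = 2         # steering AND speed support; human supervises
    L3_CONDITIONAL = 3     # system drives in limited conditions; human must take over on request
    L4_HIGH = 4            # system drives in limited conditions; no takeover expected in that domain
    L5_FULL = 5            # system drives anywhere a human driver could

def driver_must_supervise(level: SAELevel) -> bool:
    """The split Hodgson points to: at Level 2 and below, the human remains responsible."""
    return level <= SAELevel.L2_PARTIAL
```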
“We’re looking at 2030 as that inflection point now” for mass-market deployment of self-driving cars, says Mark Fitzgerald, director of autonomous vehicle research at Strategy Analytics in Newton, MA. By contrast, in 2025 there will be only a few thousand Level 4 vehicles and no Level 5 vehicles deployed, Fitzgerald predicts. These will be robotaxis and high-end personal vehicles, he says, adding that luxury brands like Mercedes-Benz will then have to incorporate Level 4 automation to distinguish themselves from mass-market brands that have already deployed Level 3 automation in cars for common folk.
“CTA views self-driving technology as transformative,” says Mitchell Kominsky, director of CTA government affairs. It will improve people’s lives by enabling driverless deliveries, providing new transit options for non-drivers, and making roads safer. Kominsky adds, “CTA is looking at how we can advance the safe deployment of self-driving vehicles,” through a self-driving vehicle working group. Its goal is to help federal and state governments shape policies that propel companies active in the space, and its roughly 160 member-companies vary widely “across the ecosystem,” he says.
The policies CTA advocates would encourage technology neutrality, open roads for testing, and update and modernize the Federal Motor Vehicle Safety Standards (FMVSS) process so it does not conflict with the concept of a self-driving vehicle. At the state level, for example, CTA is working with the California Transportation Agency to develop its autonomous vehicle guidance framework.
To help educate the public about self-driving cars, Kominsky points out, CTA is working with Partners for Automated Vehicle Education (PAVE), a non-profit organization backed by the automotive and technology industries that was launched at CES 2019.
“It’s hard to predict when we will see mass deployment of self-driving vehicles on our roads,” Kominsky says. A substantial number of self-driving vehicles are being tested on public roads right now, “and with the pandemic we saw a lot of new applications” of the technology, including deliveries of groceries and medicines to consumers, as well as transport of COVID-19 test specimens by self-driving vehicles, he notes.
“For better or worse, a lot of the AV (autonomous vehicle) space is in the [realm] of perception right now because the technology is still developing,” says Ed Niedermeyer, director of communications at PAVE, which is headquartered in Washington, D.C. “But it’s important to remember that perceptions of AVs have always been at odds with the reality.” According to the results of a PAVE consumer survey published in February 2020, the people who are most pessimistic about self-driving cars are also the ones who know the least about them. “The more you know about AVs, the more comfortable you are with them”; however, “for most people, AVs are just this abstract idea,” Niedermeyer says.
Conversely, there’s also a correlation between experience with the advanced driver assistance systems (ADAS) integrated into many cars now and comfort with the concept of self-driving cars. But this could be due to the misconception that a vehicle with ADAS is a self-driving car, and PAVE is working to reverse that, too, so people don’t over-trust ADAS and “actually make driving less safe,” Niedermeyer says.
The reality is that the development of self-driving cars is more about the lessons and challenges of the “human-less customer service experience” — such as determining the best place for a robotaxi to drop off or pick up a passenger in an urban environment — and less about the driving task, he says.
Moreover, public transit systems are likely to adopt self-driving buses before robotaxi fleets are widely deployed, Niedermeyer suggests. But rather than merely replacing the driver, he adds, self-driving technology will allow the transit system employee to remain on board and focus on customers.
“Five years ago at CES, the car companies were saying that within five years we’re going to have fully self-driving vehicles available. I think there’s been a clarification of that now, to recognize that there’s going to be a continuum of progress made over one or more decades to get to that point, and it’s going to come through ADAS,” says John Verboncoeur, associate dean for research and graduate studies in the college of engineering at Michigan State University (MSU) in East Lansing, MI. Full automation of functions like parking or freeway driving will be the first to proliferate beyond driver assistance, because those functions can be most precisely tuned for accuracy and safe operation. Verboncoeur predicts, “I think you can expect that self-driving vehicles will have a higher bar than manually driven vehicles in terms of safety statistics.”
To be sure, one reason for that higher bar will be passengers’ comfort needs, and MSU has a “socio-mobility group” of 30 to 40 faculty outside the engineering department that is studying the “sociological and psychological aspects of vehicles,” Verboncoeur notes.
In fact, MSU has turned its entire campus into a testbed for AVs — ranging from experimental self-driving cars to autonomous lawnmowers and robo-shuttles — used by a wide swath of the university community. It’s the largest contiguous college campus in the U.S., spanning 5,200 acres and containing 57 lane-miles of road, 30 traffic signals, and 11,000 parking spots. Its daily population is about 70,000, growing to more than 115,000 when a sporting event is hosted. “So it is a small city,” says Verboncoeur. Thus, what’s done there could easily serve as a model for a municipality looking to accommodate self-driving cars and other AVs, and MSU’s goal “is to build a smart city,” Verboncoeur affirms.
“Most of the test beds that you see around the country are meant for testing vehicles at high speeds. This is one of the few that can be used for first-mile, last-mile testing in pedestrian-rich environments,” says Satish Udpa, mobility director and university distinguished professor of electrical and computer engineering at MSU. “It is that vehicle-human being interaction that is the most complicated to deal with.”
Currently, there are two self-driving development cars on campus used to test, for example, systems that could predict a pedestrian’s or bicyclist’s next move so the car is prepared to react. There are autonomous lawnmowers and snowblowers programmed to work when classes aren’t in session, so their noise isn’t a distraction. And the university’s first self-driving campus shuttle is expected to start serving the campus community this fall. MSU also boasts the largest carport solar array in North America, which powers the campus’s buildings and can also be used to charge electric vehicles (EVs).
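The simplest baseline for the kind of pedestrian prediction those test cars exercise is constant-velocity extrapolation; the sketch below is purely illustrative and does not represent MSU’s actual system, which layers more sophisticated intent modeling on top of baselines like this one.

```python
import numpy as np

def predict_pedestrian_path(position, velocity, horizon_s=2.0, dt=0.1):
    """Constant-velocity baseline: extrapolate a pedestrian's future positions.

    position: (x, y) in meters; velocity: (vx, vy) in m/s.
    Returns an array of predicted (x, y) positions over the horizon.
    """
    steps = int(horizon_s / dt)
    times = np.arange(1, steps + 1) * dt
    return np.asarray(position) + np.outer(times, np.asarray(velocity))

# Example: a pedestrian 10 m ahead and 3 m to the side, walking toward the lane at 1.4 m/s
future = predict_pedestrian_path((10.0, 3.0), (0.0, -1.4))
print(future[-1])  # predicted position two seconds out: [10.   0.2]
```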
“The reality is that [self-driving cars] is a problem that is much more challenging” than was initially thought, says Danny Shapiro, senior director of automotive at NVIDIA Corp. in Santa Clara, CA. “It requires a massive amount of computing. You can’t afford to make a mistake. In a car, there’s no second chance. You can’t not protect pedestrians.”
Therefore, Shapiro adds, much more training and data collection is required, with more sensors and redundancies adding complexity to self-driving systems and making TOPS “the new horsepower.” TOPS is a chip-industry acronym for the trillions of operations per second a computer processor can perform. DRIVE Orin — the latest generation of NVIDIA’s “system-on-a-chip” (SoC) for self-driving cars, used by automakers including Mercedes-Benz and Volvo — is rated at 254 TOPS. DRIVE Atlan, the successor SoC announced at NVIDIA’s GTC conference in April, jumps to more than 1,000 TOPS. It will be available in 2025 car models, “and we’ll probably have customers that will use multiple of those” chips to run the “deep neural networks” within their vehicles, Shapiro says.
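To give those figures some intuition, here is a back-of-the-envelope calculation of how a TOPS budget divides across sensor frames; the frame rate and camera count are assumptions for illustration, not NVIDIA specifications.

```python
# Back-of-the-envelope only: how a TOPS budget translates into per-frame compute headroom.
orin_tops = 254        # DRIVE Orin rating cited in the article
atlan_tops = 1_000     # DRIVE Atlan, "more than 1,000 TOPS"
camera_fps = 30        # assumed camera frame rate
n_cameras = 8          # assumed camera count for a hypothetical sensor suite

frames_per_second = camera_fps * n_cameras
for name, tops in [("Orin", orin_tops), ("Atlan", atlan_tops)]:
    ops_per_frame = tops * 1e12 / frames_per_second
    print(f"{name}: roughly {ops_per_frame:.1e} operations available per camera frame")
```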
Nevertheless, training the AI that runs in the car happens first outside the car, in data centers, using both real and simulated data. DRIVE Sim, a simulation software suite that resembles technology used by Hollywood studios, can generate a nearly infinite range of real-world scenarios for AV development and validation, as well as for managing fleets of self-driving cars. It was accompanied by DRIVE Hyperion, NVIDIA’s eighth-generation AV development platform, which streamlines data collection and testing.
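The core idea behind that “nearly infinite range” of scenarios is parameter randomization. The sketch below shows the general pattern in plain Python; it does not use or depict DRIVE Sim’s actual API, and the scenario fields are invented for illustration.

```python
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    time_of_day_h: float       # 0-24 hours
    rain_intensity: float      # 0 = dry, 1 = downpour
    n_pedestrians: int         # pedestrians near the ego vehicle
    lead_vehicle_speed: float  # m/s

def sample_scenario(rng: random.Random) -> Scenario:
    """Draw one randomized scenario; validation suites sweep thousands of these."""
    return Scenario(
        time_of_day_h=rng.uniform(0.0, 24.0),
        rain_intensity=rng.betavariate(2, 5),
        n_pedestrians=rng.randint(0, 20),
        lead_vehicle_speed=rng.uniform(0.0, 35.0),
    )

rng = random.Random(2021)  # fixed seed so a validation run is reproducible
batch = [sample_scenario(rng) for _ in range(1_000)]
```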
Of course, making cars smarter through simulation alone is ultimately insufficient, says Artur Seidel, vice president of the Americas at Elektrobit, an independently operated subsidiary of Continental AG that is a global supplier of embedded and connected software products and services for the auto industry. What’s best is for the vehicle to learn as it goes and then report back to the automaker, which can give its fleet of vehicles on the road higher intelligence through over-the-air updates, Seidel says. This requires a total overhaul of the vehicle’s electronics architecture, including on-board high-performance computers that can do real-time analysis of sensor data, as well as connectivity — which in turn necessitates new software underpinnings. Hence, Elektrobit last November introduced EB xelor, an industry-first software platform that enables automakers and suppliers to build that overhaul around a handful of high-performance computers in a vehicle, instead of a traditional architecture of many small electronic control units (ECUs).
But connectivity must not be fundamental to the safe operation of a self-driving car, contends Jack Weast, vice president of automated vehicle standards at Mobileye, and an Intel Fellow, based in Phoenix, AZ. “We believe that an automated vehicle needs to be fully situationally aware and be able to navigate any reasonably foreseeable scenario, fully within its own capabilities. And the reason for that frankly is safety,” he says. “From a safety standpoint, if your ability to keep the people inside the vehicle safe, or people outside the vehicle safe from the vehicle, is dependent upon the ability to send or receive a message over a shared untrusted medium, by definition your safety case is flawed. So while we see lots of value in connectivity for value-added services, as a safety pillar, we think that’s a flawed approach.”
For that reason, Weast says, when you examine their architectures, “you don’t see the leading automated vehicle companies talking about V2X or infrastructure.”
What a self-driving car ultimately needs is a software-based “responsibility-based safety model” that lets it use reasonable assumptions about other road users as the basis for its own driving decisions, so that the vehicle operates safely and avoids causing a collision, he argues. But this relies upon agreement by the auto industry, governments, and societies about what it means to drive safely, and different places in the world will set the goalposts differently, he adds. Not everything can be solved with AI, Weast insists.
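Mobileye has published one such formalization, Responsibility-Sensitive Safety (RSS). The sketch below computes RSS’s safe longitudinal following distance; the response-time and braking parameters are chosen purely for illustration and are not taken from any deployed system.

```python
def rss_safe_longitudinal_distance(
    v_rear: float,              # speed of the following (rear) vehicle, m/s
    v_front: float,             # speed of the lead (front) vehicle, m/s
    rho: float = 0.5,           # response time of the rear vehicle, s (illustrative)
    a_max_accel: float = 3.0,   # max acceleration of rear vehicle during response time, m/s^2
    b_min_brake: float = 4.0,   # minimum braking the rear vehicle will apply, m/s^2
    b_max_brake: float = 8.0,   # maximum braking the front vehicle might apply, m/s^2
) -> float:
    """Minimum following distance (m) that lets the rear vehicle always stop in time,
    assuming the front vehicle brakes as hard as b_max_brake."""
    v_after_response = v_rear + rho * a_max_accel
    d = (
        v_rear * rho
        + 0.5 * a_max_accel * rho ** 2
        + v_after_response ** 2 / (2 * b_min_brake)
        - v_front ** 2 / (2 * b_max_brake)
    )
    return max(d, 0.0)

# Example: both vehicles at highway speed (30 m/s) needs about 83 m of headway
print(round(rss_safe_longitudinal_distance(30.0, 30.0), 1))  # 83.2
```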
“This whole process of training, testing, validating, deploying, it’s a continuous cycle,” says NVIDIA’s Shapiro. “A car will get better and better and better over time,” he asserts — but don’t expect to buy one without a steering wheel anytime soon.