As the presence of autonomous vehicles on the roads moves closer to becoming a reality, concerns related to safety and reliability remain pressing. In this episode of Mastering Innovation on SiriusXM Channel 132, Business Radio Powered by The Wharton School, David Zhou, Head of Product for Autonomous Driving at Baidu USA, discusses the technological challenges yet to be resolved before driverless cars become mainstream. (Zhou also recently spoke at the Mack Institute’s Fall Conference 2018.)
Best known as China’s largest search engine, Baidu has branched out into autonomous driving in part because its strengths in artificial intelligence, big data, and cloud services transfer readily to the growing field. In an initiative to revolutionize the industry as a whole, Baidu has made its platform open: anyone can download the software to make their vehicle autonomous. Zhou explains the importance of investing both in infrastructure and in the sensors on the cars themselves to ensure that the vehicles are safe and reliable enough to gain public acceptance.
An excerpt of the interview is transcribed below. Listen to more episodes here.
Nicolaj Siggelkow: We’re all looking forward to autonomous cars — or at least I’m looking forward to them because I don’t like driving. What are the key technical obstacles that are still in the way? There are clearly other obstacles, such as regulatory, and we can talk about them in a moment. But for right now, let’s focus on the technical obstacles. What are the big unsolved problems in autonomous driving?
David Zhou: I think you’re correct. Autonomous driving currently faces a lot of challenges. As you mentioned, from a business perspective: how do you define the product, and what is the business model? Will it affect the current insurance model? From society’s perspective: how do you promote acceptance of autonomous driving among people? Will it take jobs from drivers?
I would say that, currently, technology is one of the biggest challenges, from the sensors, to computing hardware, to software, and to cloud services. Let’s take, for example, sensors. Sensors are very important for autonomous driving because they sense the surroundings and help plan the vehicle’s trajectory to avoid obstacles.
Currently, mass-produced cars have radar for ACC, which is adaptive cruise control. We have blind spot detection, autonomous emergency braking, and parking assistance. Radar is very cost-effective, and it works well in different weather conditions. But radar cannot see things clearly and cannot detect color, so we need to add cameras to the car to detect lane lines and traffic lights. Cameras cannot measure distance very well; some accidents happen, for example, because the camera wrongly calculates distance in complex situations. So we need to introduce LiDAR, a laser-based sensor with more accurate obstacle detection.
When you have three sensors, they each have their own advantages and disadvantages. LiDAR is accurate but very expensive, and it’s very sensitive to weather conditions. Cameras can see color but cannot work in low light, and they need strong computing power: every camera you add requires more computing and image-processing power. Radar works in bad weather, but its resolution is low, and its computing demands are modest. So a safer way is to use sensor fusion technology to merge the outputs of the sensors into a clear perception. This is one of the areas a lot of engineers are working on: making better fusion of the sensors.
I’ll give an example. If you want to fuse the LiDAR with the camera, because the camera doesn’t have distance information, you need to project the LiDAR points into the camera’s view and decide which points match which image regions, both in space and in time. If they match incorrectly, you’ll have bad detection. And sensor fusion alone is not enough. What if a sensor fails, or its perception fails? If a sensor doesn’t work, the system no longer has a clear view of its surroundings.
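The projection step Zhou describes can be sketched in a few lines. This is a minimal illustration, not Baidu’s implementation: the intrinsic matrix, extrinsic rotation, and translation below are made-up example values standing in for a real sensor calibration.

```python
import numpy as np

# Sketch of the LiDAR-to-camera projection step in sensor fusion:
# map a 3D LiDAR point into pixel coordinates so its distance can be
# matched with camera detections. All calibration values are invented
# for illustration, not real parameters.

# Camera intrinsic matrix K (focal lengths fx, fy; principal point cx, cy).
K = np.array([
    [800.0,   0.0, 640.0],
    [  0.0, 800.0, 360.0],
    [  0.0,   0.0,   1.0],
])

# Extrinsics: rigid transform from the LiDAR frame to the camera frame.
# Assumed here: LiDAR mounted 10 cm above the camera, no rotation.
R = np.eye(3)                    # rotation, LiDAR -> camera
t = np.array([0.0, 0.10, 0.0])   # translation, LiDAR -> camera (meters)

def project_lidar_point(p_lidar):
    """Return (u, v) pixel coordinates of a LiDAR point, or None if it
    lies behind the image plane and cannot be seen by the camera."""
    p_cam = R @ p_lidar + t      # express the point in the camera frame
    if p_cam[2] <= 0:            # behind the camera: no valid projection
        return None
    uvw = K @ p_cam              # perspective projection (homogeneous)
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# A point 20 m straight ahead lands near the image center.
print(project_lidar_point(np.array([0.0, 0.0, 20.0])))  # -> (640.0, 364.0)
```

If the calibration (R, t, K) is even slightly wrong, points project onto the wrong pixels, which is exactly the “matches incorrectly” failure mode Zhou warns about.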
We also use V2X (vehicle-to-everything) technology to enhance the sensing capability of road infrastructure. For example, we can put detection devices in traffic lights. The intersection itself can detect all the vehicles around it and broadcast the details of the traffic situation to each vehicle that passes through the intersection. Everyone can get these perception capabilities for free; we call it IRCU, intelligent roadside service.
“Today, if you saw a horse cart on the highway, would you think it was safer? Of course you wouldn’t.” – David Zhou
Siggelkow: That raises an interesting question, because if I remember a little bit of the history of autonomous driving, first, there was this idea that we would need all these sensors in our environment to help the cars drive. But that creates a huge adoption problem, because who’s going to finance that? Where do we start? It seemed that huge progress was made by saying we could put everything in the car, since the costs of processing power and sensors have come down so much. We could put so much technology in the car that the car becomes truly autonomous and can drive practically anywhere. We wouldn’t need this big infrastructure investment that might be hard to coordinate and hard to get off the ground.
Now I hear you say that maybe we need a little bit of both. We can put so much into the car, but then there’s the question of reliability if one sensor breaks down, or if there are all these trade-offs among the sensors. Do you think, in the end, both types of investment will be necessary to make autonomous driving possible at scale?
Zhou: Yes, I believe so. 100 years ago, when cars appeared on the road, no one thought they would be safer. Why? They were faster. Henry Ford supposedly said, “If I had asked people what they wanted, they would have said, ‘a faster horse.'” The car was faster than horse carts, right? Today, if you saw a horse cart on the highway, would you think it was safer? Of course you wouldn’t. Why? The highway, traffic lights, and lane lines were created to make the car work better, more safely, and more effectively. As technology develops, all these things, from traffic rules and infrastructure to the whole transportation system, will evolve to accommodate the newer, more effective, and safer technology and make it even better.
Siggelkow: Just a few days ago, there was the first approval in Sweden for an autonomous truck to drive, at least a little bit, on public roads between a warehouse and a terminal. We’re starting to see more and more autonomous cars and trucks out on the roads. Everyone would love to make a prediction. The question is, when and where will we see what kinds of vehicles driving themselves on the road? What’s your best assessment? Let’s stick with where you’re currently living, the Bay Area. What’s your prediction of how many autonomous cars will be out there in 10 years?
“How many miles do we need for autonomous driving before it can be proven or accepted by the public?” – David Zhou
Zhou: It’s very hard to say. I’ve read a lot of predictions, and they vary quite a lot. Someone says, “Autonomous vehicles will be very popular in five years,” and someone else says, “It will take 30 years.” If we look at some data, there are currently 48 companies that have tested autonomous driving in California, and they have accumulated around 400,000 miles of autonomous driving. How many miles do we need for autonomous driving before it can be proven or accepted by the public?
There’s data from the RAND Corporation. It shows that to demonstrate with 95% confidence that failure rates are within 20% of the true rate of 1.09 fatalities per 100 million miles, 8.8 billion miles of testing are required. This is truly a big gap: more than 20,000 times the mileage accumulated so far. How we are going to bridge that gap comes back to the first question you asked: “Why should Baidu do autonomous driving?” Baidu has expertise in data, AI, and data processing. We think data will be one of the keys. I cannot predict how many autonomous vehicles there will be in 10 years. But for our part, we open up our services and our data capabilities to accelerate the whole industry; we want to make it happen faster.
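The 8.8 billion figure Zhou cites can be reproduced with a back-of-the-envelope calculation along the lines of RAND’s “Driving to Safety” analysis: estimating a rare-event rate to within ±20% relative error at 95% confidence requires roughly (z/ε)² observed events under a normal approximation to the Poisson count. The formula below is that standard approximation, not RAND’s exact methodology.

```python
# Back-of-the-envelope version of the RAND required-miles calculation.
# To pin down a rare-event rate within +/- epsilon (relative) at a given
# confidence, a normal approximation to the Poisson count needs about
# (z / epsilon)^2 observed events; dividing by the rate gives the miles.

z = 1.96                  # z-score for 95% confidence
epsilon = 0.20            # +/-20% relative precision
rate = 1.09 / 100e6       # 1.09 fatalities per 100 million miles

events_needed = (z / epsilon) ** 2   # about 96 observed fatalities
miles_needed = events_needed / rate  # miles of driving to observe them

print(f"{miles_needed / 1e9:.1f} billion miles")  # -> 8.8 billion miles
```

Dividing 8.8 billion by the roughly 400,000 test miles mentioned above is what yields a gap on the order of 20,000x.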
About Our Guest
A pioneer with more than a decade of experience in disruptive technology and product management, David is the Head of Product and the founding PM of the Baidu Apollo Autonomous Driving Platform. He joined Baidu USA in April 2014 to lead the digital health AI team. Months later, he was brought on to spearhead the Autonomous Driving group at Baidu. Before joining Baidu USA, he founded several successful startups in the consumer mobile and digital health industries. Prior to that, as a PM on the Microsoft Windows Live mobile team, he managed the launch of the first mobile app for Microsoft.
Mastering Innovation is live on Thursdays at 4:00 p.m. ET. Listen to more episodes here.