The company announced its latest plans during an investor conference on Monday. But sceptics doubt Tesla can pull it off.
Tesla CEO Elon Musk expects to start converting the company’s electric cars into fully self-driving vehicles next year as part of a plan to create a network of robotic taxis to compete against Uber and other ride-hailing services.
The vision sketched out Monday during an event at Tesla’s Silicon Valley headquarters requires several leaps of faith – something that the zealous investors and consumers who view Musk as a technological genius often are willing to take.
But experts on self-driving cars fear Musk is sidestepping public safety in an effort to boost Tesla’s stock and sell more of the company’s electric cars, amid lingering questions about whether the 15-year-old automaker can consistently make money.
Here are five reasons experts think autonomous cars are many years away:
1. Snow and weather
When it’s heavy enough to cover the pavement, snow blocks the view of the lane lines that vehicle cameras use to find their way. Researchers so far haven’t figured out a way around this, which is why much of the testing is done in warm-weather states such as Arizona and California.
Heavy snow, rain, fog and sandstorms can obstruct the view of cameras. Light beams sent out by laser sensors can bounce off snowflakes, causing the system to mistake them for obstacles. Radar can see through the weather, but it doesn’t show the shape of an object, which computers need in order to figure out what it is.
“It’s like losing part of your vision,” says Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University.
Researchers are working on laser sensors that use a different light beam wavelength to see through snowflakes, said Greg McGuire, director of the MCity autonomous vehicle testing lab at the University of Michigan.
Software also is being developed so vehicles can differentiate between real obstacles and snowflakes, rain, fog, and other conditions.
But many companies are still trying to master the difficult task of driving on a clear day with steady traction.
“Once we are able to have a system reliably perform in those, then we’ll start working toward expanding to those more challenging conditions,” said Noah Zych, Uber’s head of system safety for self-driving cars.
In some limited areas that have been mapped in three dimensions, the cars can function in light snow and rain.
2. Non-standardised lines and curbs
Road markings differ across the globe, and in some places they don’t exist at all. Because lane lines aren’t standardised, vehicles have to learn to drive differently in each city. Sometimes there aren’t even curbs to help vehicles judge lane width.
For instance, in Pittsburgh’s industrial “Strip District,” where many self-driving vehicles are tested, the city draws lines across the narrow lanes to mark where vehicles should stop for stop signs. Sometimes the lines are so far back and buildings are so close to the street that autonomous cars can’t see traffic on the cross street if they stop at the line.
One workaround is to programme vehicles to stop for the line and creep forward.
“Is it better to do a double stop?” asked Pete Rander, president of Argo AI, an autonomous vehicle company in which Ford has invested heavily.
“Since intersections vary, it’s not that easy.”
3. Dealing with human drivers
For many years, autonomous vehicles will have to deal with humans who don’t always play by the rules.
They double-park or walk in front of cars. Recently in Pittsburgh, an Argo backup driver had to take over when his car stopped during a right turn and blocked an intersection, unable to decide immediately whether to go around a double-parked delivery truck.
“Even if the car might eventually figure something out, it’s shared space, and it’s socially unacceptable” to block traffic, Rander said.
Humans also make eye contact with other drivers to confirm they have been seen, a form of communication that autonomous vehicles are still learning to replicate.
Add to that the antagonism that some feel toward robots.
People have reportedly been harassing Waymo’s autonomous test vehicles near Phoenix.
The Arizona Republic reported in December that police in suburban Chandler have documented at least 21 cases in the past two years, including a man waving a gun at a Waymo van and people who slashed tires and threw rocks. One Jeep forced the vans off the road six times.
4. Left turns
Deciding when to turn left in front of oncoming traffic without a green arrow is one of the more difficult tasks for human drivers and one that causes many crashes. Autonomous vehicles have the same trouble.
Waymo CEO John Krafcik said in an interview last year that his company’s vehicles are still encountering occasional problems at intersections.
“I think the things that humans have challenges with, we’re challenged with as well,” he said. “So sometimes unprotected lefts are super challenging for a human, sometimes they’re super challenging for us.”
5. Consumer acceptance
The fatal Uber crash near Phoenix last year did more than push the pause button on testing. It also rattled consumers who someday will be asked to ride in self-driving vehicles.
Surveys taken after the Uber crash showed that drivers are reluctant to give up control to a computer. One by AAA in March found 71 percent of people are afraid to ride in fully self-driving vehicles.
Autonomous vehicle companies are showing test passengers information on screens about where the vehicles are headed and what their sensors are seeing. The more people ride, the more they trust the vehicles, says Waymo’s Krafcik.
“After they become more and more confident they rarely look at the screens, and they’re on their phones or relaxing or sleeping,” he said.