Did you know that Cruise, the autonomous vehicle (AV) operator owned by General Motors, has been asked by the California Department of Motor Vehicles to halve its operations in San Francisco following a crash with an emergency vehicle? The news came after a Cruise driverless taxi collided with a firetruck at a San Francisco intersection on Aug. 17.
Authorities said the firetruck was in “Code 3” emergency mode, which means its red lights and sirens were activated at the time of the crash. The autonomous taxi had one passenger inside who was taken to a local hospital with injuries that are not considered life-threatening.
Wait, I thought self-driving cars were supposed to make driving safer. But there is more.
Driverless vehicles promise a future with less congestion and pollution, fewer accidents resulting from human error and better mobility for people with disabilities, supporters say. But every now and then, one of the cars runs into trouble in a way that casts a bit of doubt on that bold vision.
How about this one: last month in San Francisco, a driverless car somehow drove into a city paving project and got stuck in wet concrete. I’m not joking. Its front wheels were actually mired in the concrete as workers stood by, shocked. One onlooker said, “I was kind of pleased to see this because it illustrates how creepy and weird the whole driverless car thing is to me.”
The incident happened just days after California regulators agreed to expand driverless taxi services in San Francisco, despite the safety concerns of local officials and community activists. In a 3-to-1 vote last week, the California Public Utilities Commission, which regulates self-driving cars in the state, gave Cruise and Waymo permission to offer paid rides anytime during the day, throughout the city.
Though driverless cars have not been blamed for any serious injuries or crashes in San Francisco, they have been involved in several jarring episodes. And don’t forget the 2018 incident in which an Uber self-driving test vehicle struck and killed a pedestrian crossing the street in Tempe, Arizona.
Other situations were a mess too. As many as 10 Cruise driverless cars stopped working near a music festival in San Francisco’s North Beach, causing traffic to back up.
In January, a Cruise vehicle entered an area where firefighters were working and did not stop until a firefighter started “banging on its hood and smashing the vehicle’s window,” according to city records. In May, a driverless Waymo car blocked a fire vehicle while it was backing into a station.
Driverless-car companies have strongly defended their safety records, arguing that it is unrealistic to expect the cars to operate perfectly and that, like any new technology built on machine learning, they need real-world conditions to improve. However, nothing replaces a human paying attention on the street and making immediate decisions.
As I have been saying, there are huge factors that will keep self-driving, or autonomous, cars from going mainstream in our lifetime.
AVs can be hacked. Researchers have reported “hacking” cars simply by placing stickers on ordinary street signs, exploiting vulnerabilities in the car’s visual classification system. With a few easy-to-make stickers, they made a stop sign register as a 45-mile-per-hour speed-limit sign. Hackers could also break into the grid, and that would be a disaster.
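The sticker trick is a physical version of what machine-learning researchers call an adversarial example: a small, deliberately chosen change to the input that flips a classifier’s output. Here is a minimal sketch with a toy linear “sign classifier” over four invented image features; the numbers and the model are my own illustration, not any real AV system:

```python
import numpy as np

# Toy "sign classifier": a linear model over 4 invented image features.
# Positive score => "stop sign", negative score => "speed-limit sign".
weights = np.array([1.0, -2.0, 0.5, 1.5])

def classify(features):
    return "stop" if features @ weights > 0 else "speed-limit"

sign = np.array([0.9, 0.1, 0.4, 0.8])   # features of a clean stop sign

# Adversarial "sticker": nudge each feature a small amount in the
# direction that lowers the score (sign of the gradient, FGSM-style).
epsilon = 0.6
sticker = -epsilon * np.sign(weights)

print(classify(sign))            # "stop"
print(classify(sign + sticker))  # "speed-limit" -- the label flips
```

The unsettling part, and the reason the sticker attack works in the real world, is that the perturbation can be tiny relative to the image yet chosen precisely to push the model across its decision boundary.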
Ethical issues abound. Cars come with ethical dilemmas, just like anything created by human beings. When we drive, we make thousands of mini-decisions. Researchers are trying to help the cars figure out what to do when they face these ethical decisions. For instance, should the car save the driver or the four kids crossing the street? A simulation created by a group at M.I.T., called Moral Machine, puts some of these scenarios before the public. Other ethical issues with autonomous cars include privacy and liability concerns.
Profit matters more than safety. The companies are thinking about profit, not your safety. Sure, the cars may lower accident rates and reduce emissions, but the real reason companies are pursuing this is profit. A raft of technological, ethical, and regulatory issues has yet to be ironed out. Accidents and other problems remain a huge issue, particularly on bridges, in city traffic, at high speeds, and in bad weather; rain and snow are especially troublesome.
Communication breaks down. While much of driving is paying attention and avoiding obstacles, a large part of it is communicating with other drivers. Turn signals and brake lights show your intentions; at a stop, you may need to “wave” someone ahead. Automated cars still struggle to communicate with one another, and that is a serious issue.
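For what it’s worth, the industry’s proposed answer is vehicle-to-vehicle (V2V) messaging, in which each car periodically broadcasts its position, speed, and intent, a machine-readable stand-in for the turn signal and the wave. A hypothetical sketch of such a message follows; the field names are my own invention for illustration and do not follow any real standard such as SAE J2735:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical V2V "intent" message; field names are invented for
# illustration, not taken from an actual standard.
@dataclass
class IntentMessage:
    vehicle_id: str
    lat: float
    lon: float
    speed_mph: float
    turn_signal: str   # "left", "right", or "off"
    braking: bool

def encode(msg: IntentMessage) -> str:
    """Serialize for broadcast; real systems use compact binary encodings."""
    return json.dumps(asdict(msg))

def decode(payload: str) -> IntentMessage:
    return IntentMessage(**json.loads(payload))

msg = IntentMessage("AV-123", 37.7749, -122.4194, 24.0, "left", False)
assert decode(encode(msg)) == msg  # the message round-trips intact
```

Of course, serializing a message is the easy part; the hard problems are getting every manufacturer to speak the same protocol, trusting the sender, and coping with the human-driven cars that broadcast nothing at all.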
Drivers like driver-assistance features that keep them safer on the road. However, recent studies have shown that drivers want to control their vehicles and are reluctant to get into a self-driving car, especially one without pedals or a steering wheel. We are far, far away from self-driving cars becoming mainstream. We will be watching and reporting as this develops.
Are you interested in self-driving cars? What are your thoughts?
Lauren Fix is a nationally recognized automotive expert, media guest, journalist, author, keynote speaker and television host. A trusted automotive expert, Lauren provides an insider’s perspective on a wide range of automotive topics, energy and safety issues for both the auto industry and consumers. Her analysis is honest and straightforward.
Follow Lauren on Twitter, Facebook and Instagram