California DMV Regulations May Kill the State’s Robocar Lead

Be careful what you wish for: yesterday the California DMV released its proposed regulations for the operation of robocars in California. All of this sprang from Google’s request that states start writing such regulations to ensure its cars would be legal, and California’s DMV took much longer than expected to release them, which Google found quite upsetting.

The testing regulations did not bother too many people, though I am upset that they effectively forbid the testing of delivery robots like the ones we are making at Starship, because the test vehicles must have a human safety driver with a physical steering system. Requiring that driver makes sense for passenger cars but is impossible for a robot the size of a breadbox.

Needing a driver

The draft operating rules effectively forbid Google’s current plan, making it illegal to operate a vehicle without a licensed and specially certified driver on board and ready to take control. Google’s research led them to feel that having a transition between human driver and software is dangerous, and that the right choice is a vehicle with no controls for humans. Most car companies, on the other hand, are attempting to build “co-pilot” or “autopilot” systems in which the human still plays a fundamental role.

The state proposes banning Google-style vehicles for now and drafting regulations for them in the future. Unfortunately, once something is banned, it is remarkably difficult to un-ban it. That’s because nobody wants to be the regulator or politician who un-bans something that later causes harm that can be blamed on them. And these vehicles will cause harm, just less harm than the people currently driving are doing.

The law forbids unmanned operation, and requires the driver/operator to be “monitoring the safe operation of the vehicle at all times and be capable of taking over immediate control.” This sounds like it certainly forbids sleeping, and might even forbid engrossing activities like reading, working or watching movies.

Special certificate

Drivers must not just have a licence; they must also have a certificate showing they are trained in the operation of a robocar. On the surface, that sounds reasonable, especially since the hand-off has dangers that training could reduce. But in practice, it could mean a number of unintended things:

  • Rental or even borrowing of such vehicles becomes impossible without a lot of preparation and some paperwork by the person trying it out.
  • Out-of-state renters may face a particular problem as they can’t have California licences. (Interstate law may, bizarrely, let them get by without the certificate while Californians would be subject to this rule.)
  • Car sharing or delivered car services (like my “whistlecar” concept or Mercedes Car2Come) become difficult unless sharers get the certificate.
  • The operator is responsible for all traffic violations, even though several companies have said they will take responsibility. They can take financial responsibility, but can’t help you with points on your licence or criminal liability, rare as that is. People will be reluctant to assume that responsibility for things that are the fault of the software in the car they use, as they have little ability to judge that software.

No robotaxis

With no robotaxis or unmanned operation, a large fraction of the public benefits of robocars are blocked. All that’s left is the safety benefit for car owners. This is not a minor thing, but it’s a small part of the whole game (and active safety systems can attain a fair chunk of it in non-robocars).

The state says it will write regulations for proper robocars, able to run unmanned. But it doesn’t say when those will arrive, and unfortunately, any promises about that will be dubious and non-binding. The state was very late with these regulations, which is perfectly understandable since not even the vendors know the final form of the technology, and it may well be late again. Worse, there are political incentives for delay, perhaps indefinite delay.

This means vendors will be uncertain. They may know that someday they can operate in California, but they can’t plan for it. With other states and countries around the world chomping at the bit to get vendors to move their operations, it will be difficult for companies to choose California, even though today most of them have.

People already in California will continue their R&D in California, because it’s expensive to move such things, and Silicon Valley retains its attractions as the high-tech capital of the world. But they will start making plans for first operation outside California, in places that have an assured timetable.

The uncertainty also makes it less likely that anyone will move operations to California. Why start a project here, which in spite of its advantages is also the most expensive place to operate, when you don’t know when you can deploy here? And people want to deploy close to home if they have the option.

The car companies, whose prime focus today is on co-pilot or autopilot systems, might not be bothered by this uncertainty. In fact, it’s good for their simpler early goals because it slows the competition down. But most of them have also announced plans for real self-driving robocars in which you can act just like a passenger. Their teams all want to build them. They might enjoy a breather, but in the end, they don’t want these regulations either.

And yes, it means that delivery robots won’t be able to go on the roads, and must stick to the sidewalks. That’s the primary plan at Starship today, but not the forever plan.

California should, after receiving comment, alter these regulations. They should allow unmanned vehicles that meet appropriate functional safety goals to operate, and they should set a real calendar date for when this will happen. If they don’t, they won’t be helping to protect Californians. They will take California from being the envy of the world, the place that has attracted robocar development from all around the planet, to just another contender. And that won’t just cost jobs; it will delay the deployment in California of a technology that will save the lives of Californians.

I don’t want to pretend that deploying full robocars is without risk. Quite the reverse, people will be hurt. But people are already being hurt, and the strategy of taking no risk is the wrong one.

Brad Templeton is Singularity University’s Networks and Computing Chair. This article was originally published on Brad’s blog.

Image Credit: Google
