LOS ANGELES - California is trying to do something unusual in this age of rapidly evolving technology: get ahead of a big new development before it goes public.
By the end of the year, the Department of Motor Vehicles must write rules to regulate cars that rely on computers -- not the owner -- to do the driving. That process began last week when the DMV held a public hearing in Sacramento to puzzle over how to regulate vehicles that haven't been fully developed yet.
Among the complex questions officials sought to unravel:
How will the state know the cars are safe?
Does a driver even need to be behind the wheel?
Can manufacturers mine data from onboard computers to make product pitches based on where the car goes or set insurance rates based on how it is driven?
Do owners get points on their license if they send a car to park itself and it slams into another vehicle?
Once the stuff of science fiction, driverless cars could be commercially available by decade's end. Under a California law passed in 2012, the DMV must decide by the end of 2014 how to integrate the "autonomous vehicles" onto public roads.
That means rule writers will post draft regulations around June, then revise them in response to public comment by fall.
Three other states have passed driverless-car laws, mostly focused on testing. California has mandated rules on testing and public operation, and the DMV expects within weeks to finalize regulations dictating what companies must do to test the technology on public roads.
Those rules came after Google Inc. had already driven its fleet of Toyota Priuses and Lexuses, fitted with an array of sensors including radar and lasers, hundreds of thousands of miles on California roads. Major automakers also have tested their models.
With the federal government apparently years away from developing regulations, California's rules could effectively become the national standard.
Much of last week's discussion focused on privacy concerns. California's law requires autonomous vehicles to log records of operation so the data can be used to reconstruct an accident.
But the cars "must not become another way to track us in our daily lives," John M. Simpson of the nonprofit Consumer Watchdog said at the hearing. He called out Google, saying the Internet giant rebuffed attempts to add privacy guarantees when it pushed the 2012 legislation mandating rules on testing and public operation.
Across from Simpson at the hearing's head tables was a representative from Google, who offered no comment on the data privacy issue.
Discussion also touched on how to determine whether a car is safe, and whether an owner knows how to operate it properly.
Ron Medford, Google's director of safety for its "self-driving car" project, suggested manufacturers should be able to self-certify that their cars are safe. He cautioned it would get complicated quickly if the state tried to assume that role.
In initial iterations, human drivers would be expected to take control in an instant if the computer systems fail. But unlike current technology, which can help park a car or keep it in its lane, future systems might let owners read, daydream or even sleep while the car did the work.
Responding to a question received via Twitter, DMV attorney Brian Soublet acknowledged the department is still grappling with the fundamental question: Will a person need to be in the driver's seat? Maybe not, by the time the technology is safe and reliable, he said.