The government’s plans could see cars, coaches and lorries with self-driving features on motorways within the next year, with new legislation in the pipeline to allow the safe wider rollout of driverless vehicles by 2025. It has also started a consultation on a “safety ambition” proposal that would require a self-driving vehicle to be “as safe as a competent and careful human driver”, according to the Department for Transport’s announcement. The government hopes to fast-track new legislation addressing insurance and liability so that manufacturers, and not drivers, are responsible for crashes when the vehicle is in self-driving mode.

However, the Responsible Innovation in Self-Driving Vehicles report from the UK government’s Centre for Data Ethics and Innovation (CDEI) indicates a long and complex road ahead if the public is to trust self-driving vehicles and the rules governing their use.

There’s a risk that the public will reject driverless vehicles if they are not seen as safe enough, CDEI warns. “Average improvements in road safety, even if they can be clearly demonstrated, will not engender public trust if crashes are seen as the fault of faceless technology companies or lax regulation rather than fallible human drivers,” CDEI says in its report. For autonomous vehicles to be seen as equivalent to trains or aircraft, the public “could expect a 100x improvement in average safety over manually-driven vehicles.”

Professor Jack Stilgoe of University College London, who advised the CDEI, told the BBC that establishing how safe driverless cars are should be a democratic decision. “The danger is sort of sleepwalking into a world in which these changes happen in order to suit one mode of transport – and the benefits then don’t get spread very widely,” he said.

CDEI says driverless cars should be clearly identified so people know what “agents” they are sharing the road with.
It’s also concerned that the technology may create pressure to change roads and road rules to suit driverless cars. To gain public trust, CDEI recommends rules that require technology makers to explain the conditions under which self-driving vehicles can operate, such as road types, locations, weather, and other road users’ behaviour. It notes that, to reduce liability, tech companies have an incentive to narrowly define the operational design domain (ODD) of self-driving vehicles, that is, the operating conditions in which the driving automation system is specifically designed to function.

CDEI also recommends consulting disabled people to ensure that regulations are inclusive. “There is a need for ongoing dialogue and social research to deepen understanding of public views on liability, labelling, the explainability of decisions made by AVs, and possible infrastructure changes as AV systems expand and develop.” Answers to these questions will impact “safety (infrastructure), acceptability of AVs (labelling, explainability) and on accountability and the requirement to provide recompense (liability),” CDEI says.

The government reckons that allowing self-driving cars on UK roads could create up to 38,000 jobs and be worth £42 billion to the economy. Of the £100 million total package, the government confirmed £34 million today for research to “support safety developments and inform more detailed legislation”, as well as £20 million to “kick-start commercial self-driving services and enable businesses to grow”. It sees potential in grocery deliveries and shuttle pods at airports, building on £40 million already invested.

The government hopes the “safety ambition” consultation will inform standards for self-driving vehicles, including whether sanctions are imposed on manufacturers whose technology doesn’t meet those standards. CDEI’s report builds on proposals in a January report by the Law Commission on how to regulate driverless vehicles.