
So when self-driving cars collide, who pays?

Brian Peccarelli  President, Tax & Accounting, Thomson Reuters


Innovations used to be safe by the time they finally came to market, or at least safer than they are today.

The really big innovations, grand ideas such as exploring the potential power of the atom, were driven by governments or government agencies. The research and development programs were well-resourced and scrupulous, even if their time to market was accelerated by external factors such as war or the space race. Governments had longer timeframes and deeper pockets than commercial organizations could afford, so they could ensure, as far as possible, the integrity of new solutions as they were developed.

But today the pace of innovation continues to accelerate, and although testing remains the most important stage on the path to market for new products, the consequences to society of really big technological changes can be far-reaching, even revolutionary, and difficult to anticipate.

In the fields of artificial intelligence (AI) and machine learning, for instance, even the inventors of new products can be surprised by the applications that users find for them. The advance of some of this technology is governed by the imagination of its users, who are finding new applications for cognitive computing all the time. And inevitably some of these applications will have a significant impact on society.

The delegates at Davos have already discussed the possible impact on the workforce of this fourth industrial revolution, and the imperative for workers to continue to train and develop their skills to fill the new jobs created by technology.

During the second day of this year’s World Economic Forum meeting, attention turned to the impact of technology on society beyond the workforce.

Take, for instance, the question of who is liable if self-driving cars collide. Can anybody inside the car be identified as a driver? Is the programmer liable in some way? This question has been doing the rounds as a thought experiment for some years, of course. But it is no longer a thought experiment. We will need our courts to rule on these new problems for society, and insurers will then have to restructure their businesses accordingly. In the new era of machine learning, at what point is the machine responsible for its actions?

Thomson Reuters has been working to provide answers to our customers on these momentous issues, and has been watching closely as the world’s courts and legal professionals come to terms with the new reality of machine learning. As well as analyzing the challenges for lawmakers on our Legal Solutions blog, we have been updating our Westlaw and Practical Law subscribers in more detail about developments in the law – such as the U.S. state of Michigan’s new regulations for autonomous vehicles (as the home of the U.S. motor industry, it is the first state to provide such clarity to its businesses).

The delegates at Davos are clearly well aware that today’s thought experiments will be tomorrow’s real-life test cases. Society will very soon need authoritative answers to these questions.


Learn more

Sign up for our Answers On newsletter for Davos news and other insights. We look forward to seeing you online and hope you enjoy the Davos 2017 coverage.
