Letter: Don’t expect the courts to police artificial intelligence rules

The opinion piece by Marietje Schaake, international policy director at Stanford University’s Cyber Policy Center, warns that “US action to rein in tech sector may come too late or not at all” (Opinion, January 9).

We would also highlight that the UK government’s policy paper on artificial intelligence regulation by no means adequately addresses the characteristics of the technology it proposes to regulate.

“AI” is used by many, as in the policy paper, to mean primarily software systems that use machine learning (ML) techniques to derive their output from sensor input. We use the term in that sense here. (One of us was active for many years in another branch of AI, then known as symbolic AI.)

There is at present no agreement on appropriate assurance techniques and assurance requirements for ML-based software used in critical systems, such as driver-assistance systems in cars.

This is despite considerable attention over many years from the safety-critical systems engineering community, including its British contributors. There is thus no good reason to expect courts to deal adequately with torts caused by failures of ML-based systems.

One important question: will systems incorporating ML-based software be subject to the Health and Safety at Work Act 1974, which requires it to be shown that risk has been reduced so far as is reasonably practicable? Since, as noted above, there is no agreement on how this might be done, are we thereby expecting courts to establish precedents that technical experts cannot?

Nicholas Bohm
Retired Solicitor, Bishop’s Stortford, Hertfordshire, UK

Professor Peter Ladkin
Systems Safety Expert, Bielefeld, Germany

Stephen Mason
Barrister, Langford, Bedfordshire, UK

Professor Martyn Thomas
Systems Safety Expert, Tunbridge Wells, Kent, UK
