North East force's crime 'Minority Report' software could be rolled out

New rules should be brought in to control how police use a Minority Report-style system to predict which criminals will reoffend, a think tank has said.
Durham Constabulary already uses software to predict if a criminal will offend again.

Durham Constabulary was the first force to start using a computer programme to help decide how likely criminals were to commit further offences, but the report says several other forces are looking at similar technology.

Now a report published by the Royal United Services Institute has called for clear guidelines on so-called "predictive policing", and for the police watchdog to ensure that forces comply.


Sci-fi blockbuster Minority Report, set in 2054, features detectives who use a psychic to predict who will commit murders, and then arrest them before the crime happens.

Real-life software that predicts where crimes might take place has already been shown in some tests to be twice as accurate as traditional policing methods, the authors suggest.

But the use of computer software to predict how likely criminals are to re-offend is still in its infancy.

The report says: "While machine learning algorithms are currently being used for limited policing purposes, there is potential for the technology to do much more, and the lack of a regulatory and governance framework for its use is concerning.


"A new regulatory framework is needed, one which establishes minimum standards around issues such as transparency and intelligibility, the potential effects of the incorporation of an algorithm into a decision-making process, and relevant ethical issues.

"A formalised system of scrutiny and oversight, including an inspection role for Her Majesty's Inspectorate of Constabulary and Fire and Rescue Services is necessary to ensure adherence to this new framework."

It recommends small, localised trials of software, along with "clear guidance and codes of practice outlining appropriate constraints governing how police forces should trial predictive algorithmic tools."

The authors continue: "This should be addressed as a matter of urgency to enable police forces to trial new technologies in accordance with data protection legislation, respect for human rights and administrative law principles."