

In wake of Horizon scandal, forensics prof says digital evidence is a minefield

(2025/04/16)


Digital forensics in the UK is in need of reform, says one expert, as the deadline to advise the government on computer evidence rules arrives.

According to Peter Sommer, professor of digital forensics at the University of Birmingham, various issues threaten the reliability of digital evidence, from the software used to gather it to the manual methods that are deployed when platforms don't provide it themselves.

Sommer published his thoughts in a public response to the Ministry of Justice's (MoJ) call for views about how the admissibility of computer evidence is working in practice.


Issuing the call, Sarah Sackman KC MP, minister for courts and legal services, specifically cites the [2]Post Office's infamous Horizon IT scandal as one of the influential reasons for the review.


Up until the landmark conclusion of the long-running scandal, which was later dramatized on TV, the legal presumption was that computer evidence was reliable, as per a Law Commission paper published in 1997.

Sackman also highlighted how it has been over 20 years since the current principles of computer evidence-gathering were established, and that alone is enough of a reason to revisit them.


For those among our readership who are interested in the full breadth and context of what digital evidence looks like in the UK at present, Sommer does a great job outlining it in his [6]response [PDF]. In brief, the main forms are:

Digital communications, such as [7]SMS

Communications within online platforms such as [8]social media sites and [9]email

Mobile phone extraction reports

All three are subject to the same two problematic processes: the methods of data extraction from a device, and the fact that technicians must manually process that extracted data to make it presentable as an exhibit.

Because both of these depend on software Sommer described as "questionable," the overall reliability of computer evidence cannot be taken as certain.

Sommer argues that evidence is never binary – reliable or unreliable – but the reliability of software-extracted evidence is a concern primarily because nearly all software incorporates some form of [10]third-party library, which the developer doesn't scrutinize for its reliability in producing digital evidence.

Plus, some software will allow investigators to run their own scripts or automations, which in turn could lead to skewed or otherwise biased results if they're not generated in line with set standards.


In cases where investigators pull all the files from a suspect's smartphone, for example, the massive volume of raw data can't be presented to a lay jury because it would be too technical and abundant. Prosecutors take that data and convert it into a human-readable form.

So even if the data itself can be wholly trusted, the human intervention that takes place between the raw data being extracted and its delivery in court could influence how the evidence is portrayed to a jury, potentially weakening its reliability.
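One partial safeguard against undetected alteration between extraction and presentation is to hash the raw data at the moment of capture, so any later change to the source material is at least detectable. A minimal sketch in Python – the data and workflow here are illustrative, not any forensic tool's actual procedure:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected: str) -> bool:
    """Re-hash the data and compare against the digest recorded at capture."""
    return sha256_of(data) == expected

# At extraction time: record the digest of the raw dump alongside the exhibit.
raw_extract = b"example raw data pulled from a device"
recorded_digest = sha256_of(raw_extract)

# Later, before the processed exhibit goes to court, the raw source is
# re-hashed; any mismatch shows the underlying data changed after capture.
assert verify(raw_extract, recorded_digest)             # untouched data passes
assert not verify(raw_extract + b"x", recorded_digest)  # any change is caught
```

Note that a matching hash only proves the raw data is unchanged; it says nothing about whether the human-readable conversion was faithful, which is precisely the gap described above.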

Research from 2021 concluded that digital forensics practitioners would find more or less evidence on devices depending on the way in which they were briefed about the case. The [12]study highlighted how unintentional bias can slip into an evidence-gathering process that is legally presumed to be substantially reliable.

There are additional complications to consider beyond the software argument, which forms the backbone of Sommer's submission. When evidence derives from an online platform, for example, which may or may not be helpful in handing over data for a case, police investigators often resort to downloading data manually and screengrabbing what they need – processes that are neither formalized nor governed by principles to ensure their reliability.

This, coupled with cases of significant faults in single systems such as Horizon, poses a credible threat to the reliability of evidence, but the lack of standards governing the software used to collect evidence from various devices is the professor's biggest concern.

Sommer told The Register : "My main concern was that the MoJ shouldn't just be looking at the 'big computer produces misleading results' scenario à la Post Office Horizon but at all the other sources and forms of digital evidence such as smartphones, vehicular forensics, IoT forensics etc. – and the extent to which software is used by law enforcement to analyze data and then produce vivid exhibits for trial use."

He said in the submission that the software being used to extract data from less-mainstream devices such as [13]internet-connected cars and other IoT devices is still being developed. Similarly, the software used for smartphones is more refined but undergoes frequent revisions, so the output could be different with each version.

Proposed solution

Whenever concerns are raised about the integrity of a process as important as digital evidence-gathering, many people's first instinct is to look to government for legislative answers.

After all, it was the Law Commission's 1997 recommendation that led to the repeal of section 69 of the Police and Criminal Evidence Act 1984 (PACE), replacing the law's stipulation that prosecutors had to prove the computer source of evidence was working properly before it could be admitted, with a legal presumption of machine reliability.

The decision was, in retrospect, a misguided one – arguably due to a misinterpretation of the advice the Law Commission received.


Sommer said a legislative approach could be taken, but would likely lead to disputes and other difficulties.

He said: "The problem with a statutory approach is that some material will then become inadmissible and others admissible and which will depend on definitions embedded in the law. The inevitable result will be disputes as to whether particular items are included or excluded. There may also be attempts at circumventing any operationally inconvenient definitions as we saw during this section 69 regime and the 'real evidence' exceptions.

"A further problem will be deciding who would have the competence to issue such a certificate. Not the least of the difficulties in locating such a person is the extent to which computer output may be the product of multiple data inputs from multiple external computer systems and software that has been compiled from third-party libraries."

The problem remains, though. If there is a lack of standardization for the processes involved in evidence gathering, standards should ideally be introduced and enforced in some way.

Sommer's recommended approach, which received support from [18]forensic investigators and a [19]King's Counsel , involves the creation of a code of practice or set of procedural guidelines for the relevant existing legislation, such as PACE, which would be ultimately enforced by a judge.

He said this was "a much better approach" compared to one involving a prescriptive set of definitions that allow or disallow specific types of evidence. It places more value on the strength of the evidence instead of whether the evidence itself is admissible.

Part of this could also be a questionnaire handed to prosecutors who would provide assurances as to the quality of the evidence they present in court. Sommer suggested some suitable questions, the general themes of which are summarized below:

Provenance: how the data was gathered and preserved, avoiding contamination or alteration; what tools were used to turn raw data into a presentable format, and the justification for using them

Where processes are standardized, highlight them

Where processes are non-standard, highlight and justify them. Provide assessments of their accuracy

Details of technicians appearing as expert witnesses
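Those themes could, in principle, be captured as a structured record that travels with each exhibit. A hypothetical sketch – the `ExhibitProvenance` class and its field names are invented for illustration, and no such official schema exists:

```python
from dataclasses import dataclass

@dataclass
class ExhibitProvenance:
    """Hypothetical record answering the questionnaire themes above."""
    source_device: str        # where the data came from
    acquisition_tool: str     # what produced the presentable format
    tool_version: str         # output can differ between versions
    process_standardised: bool
    justification: str = ""   # required when the process is non-standard
    technician: str = ""      # who may appear as an expert witness

    def gaps(self) -> list[str]:
        """List unanswered questions a court might reasonably ask."""
        missing = []
        if not self.process_standardised and not self.justification:
            missing.append("non-standard process lacks justification")
        if not self.technician:
            missing.append("no named technician")
        return missing

record = ExhibitProvenance(
    source_device="suspect handset",
    acquisition_tool="ExampleExtractor",  # hypothetical tool name
    tool_version="4.2.1",
    process_standardised=False,
)
print(record.gaps())  # flags the missing justification and technician
```

The point of such a record is not the code but the discipline: an exhibit whose provenance cannot answer these questions is flagged before it reaches a jury, not after.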

Beyond new rules and regulations, the MoJ could explore other avenues for improving the state of digital evidence in the UK. The Association of Chief Police Officers (ACPO) Good Practice Guide to Digital Evidence, for example, hasn't been updated since 2011, so that could be a good place to start.

Sommer also pointed out that putting the onus on judges to evaluate the reliability of digital evidence is an issue. Some may be up to it, he said, but others won't and the Judicial Studies Board does not currently offer training courses on the matter.

Likewise, the police may need more training in this area, and the professor posited that there could be a need for a scheme to certify digital forensic experts before they appear as expert witnesses.

For organizations, the Forensic Readiness Program is an initiative that aims to ensure systems are set up so that digital forensics practitioners can easily extract the data they need should an incident occur, but awareness of it remains relatively low. ®





[2] https://www.theregister.com/2024/12/18/we_told_post_office_about/


[6] http://pmsommer.com/MoJ.pdf

[7] https://www.theregister.com/2025/02/25/google_sms_qr/

[8] https://www.theregister.com/2025/01/14/free_our_feeds_decentralized_social_media/

[9] https://www.theregister.com/2025/04/01/google_e2ee_gmail/

[10] https://www.theregister.com/2020/01/28/third_party_trust/


[12] https://www.researchgate.net/publication/351303866_A_Hierarchy_of_Expert_Performance_HEP_applied_to_Digital_Forensics_Reliability_and_Biasability_in_Digital_Forensics_Decision_Making

[13] https://www.theregister.com/2024/01/12/smart_cars_data_privacy/


[18] https://www.linkedin.com/feed/update/urn:li:activity:7296487512053633024/?commentUrn=urn%3Ali%3Acomment%3A%28activity%3A7296487512053633024%2C7302840420688932864%29&dashCommentUrn=urn%3Ali%3Afsd_comment%3A%287302840420688932864%2Curn%3Ali%3Aactivity%3A7296487512053633024%29

[19] https://www.linkedin.com/feed/update/urn:li:activity:7296487512053633024/?commentUrn=urn%3Ali%3Acomment%3A%28activity%3A7296487512053633024%2C7296619190138884096%29&dashCommentUrn=urn%3Ali%3Afsd_comment%3A%287296619190138884096%2Curn%3Ali%3Aactivity%3A7296487512053633024%29




It's not just a data integrity issue.

markr555

When you can't trust those collating the evidence (think Fujitsu), how can you trust any of it?

Re: It's not just a data integrity issue.

abend0c4

There was a further complication in the Horizon case in that, up until 2015, the Post Office was bringing private prosecutions (at least in England). The company was not only the source of the computer evidence but also the prosecutor – which you might consider to be at least procedurally imprudent. The whole system of private prosecutions needs reviewing as well as the rules regarding digital evidence.

Re: It's not just a data integrity issue.

Guy de Loimbard

100% – the concept of any entity being able to act as judge, jury and executioner in this day and age should be outlawed.

Shocking level of autonomy that isn't allowed anywhere else I can think of.

Re: It's not just a data integrity issue.

Recluse

Might I nominate HMRC who seem to be claimant/judge/jury and executioner all rolled into one.

Naturally they are infallible (which is why their latest power grab is an ability to help themselves directly from your bank account without further ado)

Oh and they will only speak to you on their terms (and provided they are not on a WFH jolly day)

Re: It's not just a data integrity issue.

steven_t

The integrity of the data extracted is also pretty questionable when the data is:

a) stored by software clearly labelled as "beta", such as HMRC's own online Self-Assessment return software or

b) regularly corrupted by poor internal systems, such as those used for processing Real Time Information (for PAYE)

Re: It's not just a data integrity issue.

ComputerSays_noAbsolutelyNo

I bet, somewhere in Horizon's EULA, it says that one cannot expect the software to work as expected.

Case closed

Re: It's not just a data integrity issue.

Martin Gregorie

Indeed. From the material I've seen about Horizon, it appears that:

The Post Office never issued even a Provisional System Requirement for Horizon, let alone a definitive System Requirements document.

In fact the material I've seen seems to indicate that the only system requirement definition was a single Post Office Sales Terminal implementation that was shown to Royal Mail executives at the beginning of the project. This was apparently never documented or used to generate system documentation for the central financial and stock control databases.

Similarly, no Acceptance Test scripts seem to have been written or used before Horizon was released for live operation.

In short, it's difficult to see how anybody could have expected Horizon to be bug-free, since almost every one of the generally accepted rules of computer systems design, documentation, implementation and testing seems to have been ignored.

Thorough legal review

Guy de Loimbard

Has been required for a while.

Anything that can clarify how to validate digital forensic evidence, particularly for use in Court proceedings, will be a huge step forward.

If we can define a set of rules to define integrity of evidence, or at least some sort of playbook for this field, it will go a long way to presenting solid evidence in court.

Of course, the very nature of technological complexity will not make this an easy thing to achieve, but the fact it's being looked at is a step in the right direction.

Brewster's Angle Grinder

Good article.

Mmm

Anonymous Coward

Frankly, for something as complicated as Horizon, merely snapshotting the data doesn't really get you very far. If you're just taking screenshots from a complex line-of-business application then we're in Alice in Wonderland territory, because it can mean whatever I say it means.

Without requirements that describe what something is supposed to mean in context (and independent verification that what the requirements say is actually backed up by what the code actually does) I wouldn't be keen on expressing an opinion as a juror.

tiggity

We have always known digital data is potentially flawed; sadly, most of the public don't. Anyone working in IT would never have agreed that computer data should always be regarded as reliable – as the Horizon case proved, software bugs meant the data was in places very unreliable, and as we all know, bugs do occur. You cannot rely on most software's inbuilt auditing either. I have worked with a variety of pieces of commercial software, and in all those I investigated, auditing could be disabled, either via the software itself (if you had appropriate rights) or on the underlying computer system. Beyond that, default audit data retention was quite low – I even saw some with automatic date-based purges that cleared financial data well before the 6/7 year rule. (Yes, people should be doing backups so audit data shouldn't be lost, but we all know of instances of backups not being done, or of the restore/retrieval process never being tested, so faults aren't found until it's too late.)

...though I would expect the public to be well aware of bias in evidence that is presented. The UK has a dubious record (in various police forces, not just one) of hiding evidence that could at least give reasonable doubt, or even exonerate defendants – the Birmingham Six being a classic example.

Doctor Syntax

Horizon seems to have been a case of someone too close to the day-to-day operation becoming blind to its limitations, especially if there was an incentive to be so. It takes a fresh pair of eyes to see the problems. That was how I conceived my role back in the day – a fresh pair of eyes. The police investigate; they bring in bags of potential exhibits, the statements, their hypothesis. Can I, without having been enrolled in any group-think that might have happened, find evidence that contradicts the hypothesis? If I can, it avoids a miscarriage of justice; if I look hard and fail, it strengthens the hypothesis – but again, it's up to the court to be the final arbiter of fact.

There were two sides: the gathering of evidence – in my day, in conventional forensic science, often delegated to police SOCOs – and the testing. I'm not sure this is brought out strongly enough; presentation and testing are not the same thing.

Not over yet

Eclectic Man

Re: Post Office Horizon system landmark conclusion of the long-running scandal

I think you will find that this is still far from a conclusion. Project Horizon is still in use, Fujitsu is still expected to fix 'bugs', and many of the innocent Post Office staff have yet to agree, let alone receive, compensation.

Re: Not over yet

Doctor Syntax

"many of the innocent Post office staff have yet to agree, let alone receive, compensation"

It's difficult to not believe that TPTB are hoping that more of them die before a payout happens.

Furthermore ...

Mike 137

In this context, every book on digital forensics that has crossed my desk (as a reviewer or as study material) over the last couple of decades has concentrated on the technicalities of extracting data from devices, with little or no reference to actual forensics -- how to deliver evidence acceptable in a court of law. Even the admittedly pre-digital ACPO guidelines did better -- for example stressing documented chain of custody. And it's clear that, given Prof. Sommer's comments on idiosyncratic methods, this should be extended to include a clear description of any post-extraction processing performed (which should of course be made available to the court).

rg287

Emphasis on the word "forensics".

Present traditional forensic evidence to a court, and the forensic officers (or lab staff) will be able to deliver – on demand – a list of the techniques they used and the chemical and analytical processes involved (for DNA analysis, etc), and show that the mass spectrometer was properly calibrated when it determined that the chip of paint came from the scene of the crime.

They have to do this because a lot of traditional forensics, like DNA, is effectively a black box, and courts have always said "well, go on... how does it work, if you're magically telling us this suspect was definitely at the scene of the crime?"

As the article alludes to... "digital forensics" can be as rudimentary as screenshotting social media. Which is not necessarily a problem - scene of crime officers have long photographed rooms, scenes, etc.

But it needs to be regulated, with proper chain-of-custody and robust data gathering procedures in place.

I suspect the likes of Bellingcat 1 have more robust audit trails than most Police forces (at least in terms of "casual" digital forensics conducted by non-specialist officers or detectives - not necessarily if they're shipping a device to a professional forensics provider).

1. Bellingcat have a home-brew audit package that basically logs the investigator's entire browsing history, shows exactly how they arrived at a page or document, and archives it as they go – Recall-like – in case a page or document is taken down or changed (for dynamic content), so you never end up in a position where you say "can't find it again now". That's particularly the case for social media, where a timeline reload could whip away a post, never to be seen again, because the algorithm determines it's no longer of interest to you.
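An audit trail of that kind needs surprisingly little machinery: log each capture with the URL, a timestamp, and a content hash, so a later takedown or edit to the live page is provable. A rough Python sketch – the storage layout is invented for illustration and is not Bellingcat's actual tooling:

```python
import hashlib
import json
import time

def capture(url: str, content: bytes, log: list) -> dict:
    """Record a page capture: where it came from, when it was seen, and a
    hash of exactly what was seen at the time."""
    entry = {
        "url": url,
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "sha256": hashlib.sha256(content).hexdigest(),
        "bytes": len(content),
    }
    log.append(entry)
    return entry

audit_log: list = []
capture("https://example.com/post/123", b"<html>original post</html>", audit_log)

# The log serialises to JSON for disclosure alongside the archived copies.
print(json.dumps(audit_log, indent=2))
```

In practice the archived page body would be stored alongside each log entry; the hash alone is enough to prove a live page later diverged from what was captured.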

Doctor Syntax

It's painful to read of DNA being called traditional, as it came into use after I left, although the Jeffreys paper was out. I think of blood grouping and enzyme polymorphisms as traditional.

Doctor Syntax

I'd have changed the order of the bullet points in his last list. The expert witness should come first as it's his or her job to prove the others.

jdiebdhidbsusbvwbsidnsoskebid

None of this seems very different from the issue of trust and reliability in traditional forms of evidence such as witness statements, interviews, photographs etc. Even without the issue of deliberately altered or faked material.

It seems like we just need to move ourselves away from the "the computer is always right" mentality. I'm happy with "the computer is always precise"* but always remember that precision and accuracy are different things.

*Unless using a quantum computer of course.
