Retired West Midlands Police chief mocked by former force over AI blunder
Nottinghamshire Police appear to have taken a swipe at neighbouring West Midlands Police and the force's former boss, after a high-profile scandal in which the Chief Constable retired amid a series of serious mistakes, including the use of faulty AI-generated evidence.
Craig Guildford, the former head of West Midlands Police, stepped down on Friday after the force used incorrect information in evidence shared with Birmingham City Council’s Safety Advisory Group in the run-up to a Europa League fixture involving Aston Villa and Maccabi Tel Aviv.
That evidence was then presented to the Home Affairs Select Committee to justify banning Maccabi Tel Aviv supporters from attending the match on safety grounds.
However, it later emerged that part of the intelligence relied upon was entirely false.
At the centre of the controversy was the use of Microsoft Copilot, an artificial intelligence tool, which generated details of a football match that never happened.
The AI model claimed Maccabi Tel Aviv had previously played West Ham in a fixture that raised security concerns. No such game ever took place, and the error was later confirmed to be an example of an “AI hallucination”.
AI hallucinations occur when systems generate information that appears plausible but is factually incorrect. Experts say this can be caused by insufficient or biased training data, flawed assumptions made by the model, or the way prompts are interpreted.
In the aftermath of the scandal, Rushcliffe Police, a neighbourhood policing team within Nottinghamshire Police, posted a pointed message on social media that appeared to be a dig at West Midlands Police.
In a Facebook post addressing their own use of artificial intelligence, Rushcliffe Police wrote: “To avoid fictional football matches or made up case law being quoted officers and staff will be reviewing AI-generated material for accuracy.”
They added: “Due to recent events and to maintain transparency in our press releases and social media posts, if officers or staff have used Microsoft Copilot, they will be adding comments to show that.”
A Rushcliffe resident, who wished to remain anonymous, told LBC: “This is a very snide comment from Rushcliffe Police… especially the reference to the made-up football matches is probably a bit too soon after last week's resignation!”
West Midlands Police’s errors generated debate around the use of artificial intelligence within policing and the criminal justice system, particularly around accountability, verification, and transparency.
The National Police Chiefs' Council said, on behalf of forces across the UK: “Our use of AI will always be responsible, transparent and explainable, and this is why we have all signed up to the first-ever AI covenant.”
Announcing Craig Guildford’s retirement on Friday to the press outside West Midlands Police’s headquarters, Simon Foster, the Police and Crime Commissioner, said: “I am pleased this outcome has been reached having regard to due process and the law. That has prevented what might otherwise have been a complex procedure, that would have caused significant distraction, impact and cost to West Midlands Police and the wider West Midlands. It was important this matter was resolved in a balanced, calm, fair, measured and respectful manner.”
Subsequently, PCC Foster made a voluntary referral of the former Chief Constable to the Independent Office for Police Conduct to examine the circumstances around the decision that led to Maccabi fans being banned from the fixture.
West Midlands Police have since acknowledged the errors and said changes have been made to prevent a repeat, while the Home Office and policing watchdogs are understood to be monitoring the wider use of AI across forces in England and Wales.
A spokesperson for Nottinghamshire Police told LBC: “This post was published by an individual officer not on behalf of Nottinghamshire Police, without knowledge or approval on a local neighbourhood Facebook page.
“We do not use AI for our press releases or social media posts as the post suggests. The post has been removed, and the officer’s supervisor will be speaking to him.”