Fabricating evidence is the one thing cops are good at.
Even RAG is terrible at accuracy and avoiding hallucinations.
Sounds like a great way to get evidence thrown out of court.
In a just judicial system, yes. But that’s not what we have in the US.
Mission is going according to plan
Ha ha ha, good quality headline from The Onion!
sees source
…well fuck.
A month or two ago, there were a few articles involving Jim Jordan that were peak Not The Onion material.
The fourth amendment implications are on point here, but this tool isn’t “hallucinating” evidence. It’s a shitty LLM that lazy investigators can use mostly to find links between different device artifacts.
Cellebrite is dumping money into this because it’s the industry buzz right now. They just want more of that sweet government contract money. Its usefulness (and even invasiveness in some cases) is pretty overstated.
Even less shitty LLMs tend to hallucinate.
While it being junk is all well and good, how do you convince a judge or a jury that their “evidence” is garbage?
Ironically, the number of inaccuracies and half truths this article contains makes me think it was written by AI.
Got a lot of people to click on it while raging, though, so it served its purpose.
In case anyone’s interested in the source material, here’s the press release it’s going on about. The AI is about searching and analyzing evidence, it isn’t fabricating anything that’ll actually be used in court.
I’m not holding my breath.
The problem is those of us not in digital forensics believe this BS. It fuels anti-law enforcement sentiment unjustly. Hate LE if you want, just make sure it’s based on truth, not shite like this.
This article is written with some wild speculations by both the author of the article and the source they are quoting. When cell phones are cracked for evidence, they have to use write blockers when they copy the phone. They do the analysis on the copy. The original is then re-copied in court to show what was found. This way the original is never tampered with and made inadmissible, and whatever analysis bullshit you did isn’t mixed in with your courtroom copy. What this also means is that your AI can hallucinate all it wants and make up any evidence you can imagine all day long, but when you get into the courtroom and have to point to where the conclusions came from and you can’t, you will be standing there with a dick on your forehead and a case being tossed out.
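The integrity check behind that workflow usually comes down to cryptographic hashing: the image taken behind the write blocker gets a digest at acquisition, and any working copy has to produce the same digest before its analysis means anything. A minimal sketch of that idea in Python (the file paths in the comment are hypothetical, and real tools like FTK Imager record this for you):

```python
import hashlib

def image_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a disk image from disk and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so multi-gigabyte images don't blow up memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: the digest recorded at acquisition (behind the
# write blocker) must match the working copy's digest before analysis:
#   assert image_digest("acquired.img") == image_digest("working_copy.img")
```

If an AI tool (or anyone else) altered the working copy, the digests diverge and the analysis is demonstrably not tied to the preserved original — which is exactly why hallucinated “evidence” can’t survive that pointing-to-the-source moment in court.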