EXCLUSIVE

Washington state judge blocks use of AI-enhanced video as evidence in possible first-of-its-kind ruling

Lawyers for a man charged with murder in a triple homicide had sought to introduce cellphone video enhanced by machine-learning software.
The Norm Maleng Regional Justice Center, which houses the King County Superior Court, in Kent, Wash. (Ian Dewar / Shutterstock file)

A Washington state judge overseeing a triple murder case barred the use of video enhanced by artificial intelligence as evidence, in a ruling that experts said may be the first of its kind in a United States criminal court.

The ruling, signed Friday by King County Superior Court Judge Leroy McCullough and first reported by NBC News, described the technology as novel and said it relies on "opaque methods to represent what the AI model 'thinks' should be shown."

"This Court finds that admission of this Al-enhanced evidence would lead to a confusion of the issues and a muddling of eyewitness testimony, and could lead to a time-consuming trial within a trial about the non-peer-reviewable-process used by the AI model," the judge wrote in the ruling that was posted to the docket Monday.

The ruling comes as artificial intelligence and its uses — including the proliferation of deepfakes on social media and in political campaigns — quickly evolve, and as state and federal lawmakers grapple with the potential dangers posed by the technology.

Lawyers for a man accused of opening fire outside a Seattle-area bar in 2021, killing three people and wounding two, had sought to introduce cellphone video enhanced by machine-learning software, court filings show. Machine learning is a subfield of artificial intelligence that has risen to prominence in recent years as the underpinning of most modern AI systems.

Prosecutors in the case said there appeared to be no legal precedent allowing the technology in a U.S. criminal court, according to a February filing in King County Superior Court. Jonathan Hak, a solicitor and barrister in Canada and an expert on image-based evidence in the United States and elsewhere, said this was the first case he was aware of where a criminal court had weighed in on the matter.

The defendant, Joshua Puloka, 46, has claimed self-defense in the Sept. 26 killings, with his lawyers saying in a court filing in February that he had been trying to de-escalate a violent situation when he was assaulted and gunfire erupted.

Puloka returned fire, fatally striking innocent bystanders, the filing says. The man accused of assaulting Puloka was also fatally shot, a probable cause statement shows.

The deadly confrontation was captured in the cellphone video. To enhance the video, Puloka’s lawyers turned to a man who had not previously handled a criminal case but had a background in creative video production and editing, according to the prosecutors' filing.

The software he used was developed by Texas-based Topaz Labs, which says it is used by film studios and other creative professionals to "supercharge" video, according to the filing.

Puloka’s lawyers did not respond to requests for comment. In a statement, a spokesperson for Topaz Labs said the company "strongly" recommends against using its AI technology for forensic or legal applications.

The prosecutor’s office said the enhanced video predicted images rather than reflected the size, shape, edges and color captured in the original video. The enhanced images were “inaccurate, misleading and unreliable,” the filing says.

In a declaration for the prosecution included in the filing, a forensic video analyst who reviewed the original and enhanced recordings said the enhanced version contained visual data that was not in the original. Data had also been removed from the enhanced version, according to the expert, Grant Fredericks.

Every pixel “in the AI-generated video is new, resulting in a video that may appear more pleasing to the eye of a lay observer, but which contains the illusion of clarity and increased image resolution that does not accurately represent the events of the original scene,” Fredericks wrote in the declaration.

In a separate filing, Puloka’s lawyers countered that such claims were “exaggerated and overblown.” A comparison of the two videos shows the enhanced version is a “faithful depiction of the original,” the filing says. “And that is what matters.”

In his declaration, Fredericks, who has taught for the FBI and has worked as a video analyst for 30 years, said he was unaware of peer-reviewed publications that establish an accepted methodology for AI video enhancements. The FBI includes nothing on the subject in its best practices for handling forensic video, he said.

George Reis, a former crime scene investigator and longtime forensic video analyst in Southern California, said he was aware of a handful of examples of artificial intelligence being used as a potential investigative tool to clarify license plate images.

One of the companies that has developed such software, Amped, said in a post in February that artificial intelligence is not reliable enough to use for image enhancement in a legal setting. The company pointed to the technology’s opaque results and potentially biased outcomes.

“It’s a novel science,” Reis said. “There should be research done on it before someone uses it in actual case work. It should be peer-reviewed. I’m not certain what level is going to be appropriate at some time in the future for the use of AI in actually doing a clarification of a still photograph or a video, but at this particular point it’s premature.”