Can technology bring Vladimir Putin to justice?


The Russian invasion of Ukraine has in many ways become one of the world’s first digital wars, with combatants on both sides fighting for advantage on social media, Western supporters raising cryptocurrency for Ukraine, and a Ukrainian minister taking to Twitter to persuade global companies to intervene digitally.

Now, there’s a new frontier. To bolster the kind of war-crimes evidence that has not always proved easy to admit to international courts, a group is looking to the technology behind cryptocurrency and non-fungible tokens, or NFTs.

The project, backed by Stanford University and the University of Southern California and run out of the Starling Lab, is using decentralized technologies to ensure that the visual evidence being gathered and uploaded in Ukraine doesn’t fall victim to the evidence-collection missteps of war crimes past. The project — which counts human rights experts and former government officials among its leaders — hopes to use blockchain technology, along with other tools, to ensure that evidence isn’t lost, challenged or corrupted by those who want the alleged crimes of the Russian invasion force covered up.

“Technology offers us so many more tools to go after perpetrators than we’ve ever had before,” said Jonathan Dotan, a professor at Stanford’s Graduate School of Business and a writer on the streaming show “Silicon Valley” who co-founded the Starling Lab. “Unfortunately, the perpetrators have much better tools too. So we need to fight back as hard as we can.”

Joining Dotan is John Jaeger, a former State Department employee who founded Hala, a private company funded by the U.S. government that uses artificial intelligence to gather unencrypted intelligence in war zones; Graham Brookie, who runs the Atlantic Council’s tech-minded Digital Forensic Research Lab; and Stephen Smith, a British genocide scholar and the former executive director of USC’s Shoah Foundation, who has pioneered holographic testimony that he now oversees at a company called Storyfile.

Their hope is that some of the same technologies that power cryptocurrency will make it harder for Russian President Vladimir Putin and his aides to fog up prosecutions with misinformation on social media platforms, as they’ve tried to do with the images of Ukraine’s Bucha massacre that recently surfaced.

For many Americans, the blockchain is incomprehensible; for others it is simply a way to power an NFT bubble or a widespread cryptoscam. But war-crimes prosecution may provide an unequivocally positive use.

Together, Dotan, Smith, Brookie and Jaeger have spent the past five weeks building a team of engineers and legal experts in Ukraine and the United States in an effort to make the images and video uploaded to Telegram, TikTok and other platforms more airtight against war-criminal defenses.

“Social media images as they currently exist are just not going to slow down or prevent or ultimately indict war criminals. They’re just too susceptible to manipulation,” Smith said. “If we’re going to do justice to these people’s lives and stories, we’re going to need to have a higher standard.”

The complex effort involves several points on a timeline, all using some form of decentralized technology.

First comes registration — or “hashing” — the process by which an image is first noted online. To register an image, recordkeepers compute a cryptographic fingerprint, or hash, of the file and log it alongside the image’s metadata, the record of when, where and by whom it was captured. Starling deploys various technologies on the blockchain (the public digital ledger where the hash and metadata are kept) that make it very difficult for that data to be changed without triggering a digital alarm.
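
In rough terms, the registration step boils down to computing a cryptographic fingerprint of the file. Below is a minimal sketch in Python, using only the standard library, of what that fingerprinting looks like; the file name is a hypothetical stand-in, and this is an illustration of the idea rather than Starling’s actual tooling:

    import hashlib

    def image_fingerprint(path: str) -> str:
        """Compute a SHA-256 fingerprint of an image file.

        Once this fingerprint is written to a public ledger alongside the
        image's metadata (when, where and by whom it was captured), any
        later edit to the file yields a different fingerprint -- the
        "digital alarm" described above.
        """
        sha256 = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                sha256.update(chunk)
        return sha256.hexdigest()

    # Hypothetical example: this value is what would be registered on the ledger.
    print(image_fingerprint("bucha_street.jpg"))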

“The goal isn’t a single ledger of truth,” Dotan said. “It’s to have it exist in as many places as possible so we can have a consensus of trust.”

Then there’s preservation — making sure the image or video is not changed along the way. The media itself can’t be stored on a blockchain, which is built to hold short records such as hashes rather than large files. Instead, the images are stored with another decentralized method.

For decades — including very recently in the Web 2.0 era — the goal was to get material to a single high-security physical or digital location where no one could touch it. That makes sense: If you want to preserve the family diamond ring, you put it in the best bank-storage vault money can buy. But in the emerging Web3 era, the goal is to store the images in as many places as possible, as far apart as possible.

Starling is using Filecoin, a leading decentralized storage network that has a higher threshold for authentication than even a cryptocurrency like bitcoin. The Starling Lab has secured 2 petabytes of storage — 2 million gigabytes, enough to fill 8,000 laptops — to ensure that any single image can be kept in dozens of storage lockers or more, preventing anyone from corrupting it.
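
Verifying the preserved copies then follows naturally from the registered fingerprint: recompute the hash of each stored replica and compare. The sketch below, again in standard-library Python with hypothetical replica paths, shows the principle; it is not the Filecoin protocol itself:

    import hashlib

    def sha256_of(path: str) -> str:
        """SHA-256 fingerprint of one stored copy."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def replicas_agree(replica_paths: list[str], registered_hash: str) -> bool:
        """True if a majority of stored copies still match the hash that was
        registered on the ledger; a copy that no longer matches points to
        corruption or tampering at that storage location."""
        matches = sum(1 for p in replica_paths if sha256_of(p) == registered_hash)
        return matches > len(replica_paths) // 2

    # Hypothetical replica locations standing in for decentralized storage nodes.
    replicas = ["node_a/bucha_street.jpg",
                "node_b/bucha_street.jpg",
                "node_c/bucha_street.jpg"]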

Dotan compares the two stages to a real estate transaction in the analog world. First comes a notarized signature on an agreement, certifying that the person who signed it is who they say they are. Then comes the sending of the binders of the agreement itself to numerous parties, making it difficult for any one person to change a binder without it being evident exactly what they’d done.

At the end of this process come NFTs: the unique digital tokens will be given to investigators so that they, and only they, can access the material.
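
In principle, that kind of token-gated access amounts to an ownership check before the material is released. The sketch below is purely illustrative: a hypothetical in-memory allowlist stands in for an on-chain lookup of token holders, a mechanism the project has not publicly detailed:

    # Illustrative only: a hypothetical in-memory allowlist stands in for an
    # on-chain lookup of who holds the access token for a piece of evidence.
    ACCESS_TOKENS = {
        "evidence-0042": {"investigator-alpha", "investigator-bravo"},
    }

    def may_access(evidence_id: str, holder_id: str) -> bool:
        """Grant access only to holders of the token tied to this evidence."""
        return holder_id in ACCESS_TOKENS.get(evidence_id, set())

    assert may_access("evidence-0042", "investigator-alpha")
    assert not may_access("evidence-0042", "unknown-party")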

“What we’re always thinking about is future-proofing,” Jaeger said. “The technology is very possible. But it’s easy to get it wrong and easy to do inefficiently, so we need to do our jobs right so it doesn’t come back to hurt us later in a case.”

The push comes alongside another initiative, the British-based effort Arweave, which also aims to collect data on open-source networks, though it takes a far less moderated approach, making itself available to anyone who wishes to upload.

These new technologies may be necessary because of the nature of the combatants. The most successful war-crimes prosecutions, going all the way back to Nuremberg and including the heaps of evidence brought against former Serbian president Slobodan Milosevic 20 years ago, happened because the alleged attackers kept verifiably meticulous records that could later be used against them. This is less likely to happen with Russia in Ukraine.

Activists can recount a long list of failed prosecutions by the International Criminal Court because of a lack of official material. In 2014, for instance, prosecutors at The Hague court withdrew charges against Kenyan President Uhuru Kenyatta because the Kenyan government was withholding evidence prosecutors said they needed to move forward.

Social media turns out to be not much better. Dotan noted that much of the evidence of crimes from the Syrian civil war has been permanently lost because it was stored on servers of social media companies like Twitter and YouTube — where an algorithm would automatically take the images down for graphic-content reasons or a human staffer would bow to a challenge initiated by someone trying to cover it up.

Human Rights Watch said in a September 2020 report that a review of Facebook, Twitter and YouTube showed that fully 11 percent of the videos and images the group cited in its 4,739 reports of abuse over the previous 13 years had since been taken down by the sites, essentially making the evidence disappear.

“Social media platforms have been taking down online content more often and more quickly, often in response to the demands of governments, in a way that prevents the use of that content to investigate people suspected of involvement in serious crimes, including war crimes,” HRW said. “They are not currently archiving this material in a manner that is accessible for investigators and researchers to help hold perpetrators to account.”

Still, the strongest system for cataloguing images won’t prevent false flags or fabricated images in the first place, and that gives human rights experts pause about the effectiveness of even Starling’s precautions.

“I don’t want to say it’s not important to this. It’s an additional layer of information that can help,” said Sam Gregory, director of programs, strategy and innovation at the tech-focused human rights group Witness. “But unless you’re getting the complete arc from the moment you capture the video to the moment you store and share it, you still have a lot of questions. ‘Was this video staged?’ ‘Did things happen that are not captured on camera?’ ‘Can we trust what was initially hashed?’”

Equally important is that even the most certified images still have to convince judges in the real world. It’s famously difficult to get a conviction at the ICC to begin with; arcane technical processes may not move the needle much.

“I’m not sure it will be a game-changer,” said Alexa Koenig, executive director at the UC Berkeley School of Law’s Human Rights Center, who is regarded as one of the foremost experts on technology and human rights. “I think it will be really valuable at poking around the edges of what has been one of the biggest problems, which is chain of custody,” she said, using the formal term for knowing who had access to a piece of evidence. “The challenge will be on the admissibility, on convincing judges this is something they should be allowing or heavily weighting.”

Starling principals say they recognize the challenges.

“We’re just entering this era, and it could be a while before this stuff is really established as part of good jurisprudence,” Jaeger said.

But he said the alternative was to continue a track record of non-accountability, a travesty in his eyes in this age of sharp tech tools.

“Yes, admissibility for any kind of digital evidence will be tough. But we need to make sure it’s as strong as possible.

“Because otherwise,” he added, “when the time comes, we won’t have anything to admit.”

