Saturday, July 11, 2020

Former Google unit’s latest tool to help journalists spot fake images

“We observed an evolution in how disinformation was being used to manipulate elections, wage war and disrupt civil society,” Jared Cohen, Jigsaw’s chief executive, wrote in a blog post about Assembler. “But as the tactics of disinformation were evolving, so too were the technologies used to detect and ultimately stop disinformation.”

The tool is meant to verify the authenticity of images, or show where they may have been altered. Reporters can feed images into Assembler, which has seven “detectors,” each one built to spot a specific type of photo-manipulation technique.

When an image has been manipulated (for instance, two images merged together or an object erased from the background), traces of the changes may be left behind. Assembler's detectors are trained on example after example of each manipulation type, so the tool can analyse an image and highlight where it thinks those traces are.

Five of Assembler's image detectors were developed by research teams at universities, including the University of California, Berkeley; the University of Naples Federico II in Italy; and the University of Maryland. The models can detect things like colour pattern anomalies, areas of an image that have been copied and pasted several times over, and whether more than one camera model was used to create an image.
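The copy-and-paste detection described above can be illustrated with a toy version of classic copy-move forgery detection: divide the image into blocks and flag any block that appears in more than one place. This is a simplified sketch of the general technique, not Assembler's actual algorithm; production detectors compare robust features (such as DCT coefficients or keypoints) rather than raw pixels, so they survive re-compression and small edits.

```python
# Minimal sketch of block-based copy-move detection.
# Exact pixel matching is for illustration only.
from collections import defaultdict

def find_duplicate_blocks(image, block=8):
    """Return groups of (row, col) offsets whose blocks match exactly.

    `image` is a 2-D list of grayscale values.
    """
    h, w = len(image), len(image[0])
    seen = defaultdict(list)
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            key = tuple(image[r + i][c + j]
                        for i in range(block) for j in range(block))
            seen[key].append((r, c))
    # Any block appearing at two or more locations is a copy-move suspect.
    return [locs for locs in seen.values() if len(locs) > 1]

# Tiny synthetic example: a 16x16 "image" where the top-left 8x8 patch
# has been pasted over the bottom-right corner.
img = [[(r * 16 + c) % 251 for c in range(16)] for r in range(16)]
for i in range(8):
    for j in range(8):
        img[8 + i][8 + j] = img[i][j]

print(find_duplicate_blocks(img))  # flags the two matching 8x8 regions
```

A real detector would then highlight the flagged regions as a heat map over the image, which is roughly what Assembler presents to reporters.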

Santiago Andrigo, left, and Andrew Gully, two of Jigsaw's researchers. Credit: Justin Kaneps / The New York Times

“These detectors cannot completely solve the problem, but they represent an important tool to fight disinformation,” said Luisa Verdoliva, a professor at the Naples university and a visiting scholar at Google AI.

The other two detectors were developed by Jigsaw. One was designed to identify “deepfakes,” realistic images that have been heavily manipulated by artificial intelligence in ways meant to mislead an audience.

Santiago Andrigo, a Jigsaw product manager, said Assembler might be “most helpful in a situation where a journalist from a large news organisation receives a scandalous image and is under pressure to break the news.” It could also be used to verify an image that has gone viral, he said.

Jigsaw also announced an interactive platform showing coordinated disinformation campaigns from around the world over the past decade. They include Ukrainian soldiers receiving targeted disinformation encouraging them to defect during the 2014 Russian annexation of Crimea; associates of President Rodrigo Duterte of the Philippines hiring “click armies” to write pro-Duterte comments and stories online; and a small-town California hospital hiring a private firm, Psy-Group, to influence public opinion about a contested seat on the hospital board.

The database describes the players involved in influence operations, the common tactics used and how the falsehoods were spread on social media platforms. Jigsaw worked with the Atlantic Council's Digital Forensic Research Lab to organise the set of around 60 disinformation cases, culled from over 700 investigations, articles and reports the lab published over the last five years.


Emerson Brooking, a resident fellow at the lab, said the goal was not to build an encyclopedic list of disinformation campaigns but to create a foundation for “a shared language” to describe the various efforts. That way, they could develop a taxonomy that could help other media outlets and groups studying disinformation, he said.

The two projects, Assembler and the disinformation interactive platform, were announced in Jigsaw's new research publication, The Current.

The New York Times
