Special Team Helps Amnesty Identify ‘Deepfake’ Videos


This image, taken from a fake video featuring former President Barack Obama, shows elements of facial mapping used in new technology that lets anyone make videos of real people appearing to say things they never said. (AP Photo)

Human rights groups have long depended on pictures and video as evidence during investigations. Images help confirm reports of abuse that cannot be proved through other methods.

Amnesty International is one of the world’s largest human rights organizations. It carries out investigations across the world in an effort to identify human rights abuses. The investigations seek to inform the public and influence governments to take action to protect human rights.

Amnesty often uses video or photos in its evidence collection process. In many cases, this material comes from social media services.

Earlier this year, Amnesty launched an investigation into police abuses against Russian protesters. The research methods included collecting and verifying videos posted on social media from across Russia since 2012.

Denis Krivosheyev is deputy director of Amnesty's Eastern Europe and Central Asia department. He spoke to the RFE/RL news organization. He said images posted on social media can greatly strengthen human rights investigations if they can be confirmed as true.

Police officers chase protesters during a rally against planned increases to the nationwide pension age in Moscow, Russia, September 9, 2018. REUTERS

The problem, Krivosheyev says, is that images found on the Internet can be misrepresented for propaganda purposes. They can also be changed in an attempt to play a trick on the public.

There is an increasing number of tools people can use to create false photos and video. Some of these tools have been used to make videos known as “deepfakes.” A deepfake video is one that looks real, but was electronically changed. Such videos can make people appear to say or do things that never actually happened.

The name deepfake comes from the process of deep learning, a form of artificial intelligence, or AI. Deepfake videos can be created with computer programs available for sale or on the internet. The technology uses face mapping and AI to produce false video images that look very real.

To help fight this problem, Amnesty created a special team to identify and verify photos and video found on the internet. The team is called the Digital Verification Corps. The group includes about 100 student volunteers from six universities around the world.

The team is trained to use the latest technology tools and methods to discover false videos and pictures. In some cases, images are real but do not show events at the time and place that is claimed.

Police detain a demonstrator during a rally protesting retirement age hikes in Moscow, Russia, Sunday, Sept. 9, 2018. (AP Photo/Dmitry Serebryakov)

Krivosheyev says the flood of videos and photos being shared on social media has been a complete “game changer” for human rights investigations.

“There is a major difference in the level of confidence with which we can speak about things as ‘fact,’ as opposed to ‘allegations,’” he said.

He added that Amnesty has discovered photographic and video evidence to document many different human rights violations. These have included incidents of people being denied the right to demonstrate, as well as severe cases of torture.

Krivosheyev says this is why the Digital Verification Corps is so important to the organization’s work. The team can examine every piece of information Amnesty finds or is given.

"We must be able to speak confidently about things that happened as facts rather than merely quoting reports, some of which are not entirely accurate."

I’m Bryan Lynn.

Ron Synovitz reported this story for RFE/RL. Bryan Lynn adapted his report for VOA Learning English. RFE/RL and VOA are both part of the U.S. Agency for Global Media (USAGM), formerly called the Broadcasting Board of Governors. Caty Weaver was the editor.

We want to hear from you. Write to us in the Comments section, and visit our Facebook page.

_____________________________________________________________

Words in This Story

verify v. to prove something is true, or do something to discover if it is true

artificial intelligence n. ability of a machine to use and analyze data in an attempt to reproduce human behavior

confidence n. feeling or belief that you can do something well or succeed at something

allegation n. statement that someone has done something wrong or illegal without actual proof that it is true

quote v. to repeat what someone has said or written

entirely adv. fully or completely

accurate adj. true or correct
