Facebook to tackle misinformation in Armenian

Facebook today announced that it will expand its partnership with GRASS's FactCheck Georgia as an independent fact-checker to help tackle the spread of misinformation on Facebook and Instagram in Armenian. FactCheck Georgia is certified by the non-partisan International Fact-Checking Network (IFCN) and, over the next two months (June-July 2021), will cooperate with the local Armenian NGO Media Initiatives Center (MIC) to fact-check content posted on Facebook and Instagram in Armenian.

Facebook’s fact-checking programme includes more than 80 partners around the world checking content in over 60 different languages. These partners play a key role in identifying viral misinformation. When a fact-checker rates a story as false, Facebook adds a warning label to the content and significantly reduces its distribution. Pages and domains that repeatedly share false news will also see their distribution reduced and their ability to monetize and advertise removed.

Facebook works only with IFCN-certified fact-checkers and prioritizes places with upcoming elections, where misinformation can pose real risks on the ground. For these reasons, it decided to extend its cooperation with FactCheck Georgia, an important partner with experienced journalists, to Armenia.

“We’re very happy to be able to expand the partnership with FactCheck Georgia to Armenia. Before this announcement we took into account several factors that we analyzed prior to the launch of the fact-checking programme. Working together with Media Initiatives Center (MIC), a local partner in Armenia, FactCheck Georgia’s vital work will help us reduce the spread of Armenian-language misinformation on our platforms,” said Sophie Eyears, Strategic Partner Development Manager at Facebook.

Facebook has a three-part strategy for tackling misinformation:

● inform people by giving them more context on the posts they see

● reduce visibility of false news and inauthentic content

● remove content that violates Facebook’s and Instagram’s Community Standards or ad policies

This strategy has been particularly important in tackling health-related misinformation. Since the start of the pandemic, Facebook has removed more than 18 million pieces of content for breaking its rules on COVID-19 and vaccine misinformation. Warning labels have also been applied to more than 167 million pieces of content. When a warning label is placed on a post, 95% of the time people don’t click to view it.

Additional information on the fact-checking programme is available here. More information on the International Fact-Checking Network and its code of principles is available here and here.
