How Facebook Is Tackling Misinformation, Fake Accounts, and Deceptive Behaviour Across Its Apps
March 27, Washington D.C.: The House Energy and Commerce Committee will examine how technology platforms like Facebook are tackling misinformation online. It is tempting to think about misinformation as a single challenge that can be solved with a single solution.
But unfortunately, that’s not the case, and thinking of it that way misses the opportunity to address it comprehensively. Tackling misinformation actually requires addressing several distinct challenges, including fake accounts, deceptive behaviour, and misleading and harmful content. Guy Rosen, VP of Integrity and the person responsible for the integrity of Facebook’s products, provided an update on how the company approaches each of them.
Facebook takes a hard line against fake accounts and blocks millions of them each day, most at the time of creation. Between October and December of 2020, it disabled more than 1.3 billion of them. Facebook also investigates and takes down covert foreign and domestic influence operations that rely on fake accounts. Over the past three years, it has removed over 100 networks of coordinated inauthentic behaviour (CIB) from the platform and keeps the public informed about these efforts through monthly CIB reports.
Facebook also cracks down on deceptive behaviour, having found that one of the best ways to fight it is by disrupting the economic incentive structure behind it. The company has built teams and systems to detect and enforce against the inauthentic behaviour tactics behind much of the clickbait online, and it uses artificial intelligence to help detect fraud and enforce its policies against inauthentic spam accounts.
Misinformation can also be posted by real people, even in good faith. To address this challenge, Facebook has built a global network of more than 80 independent fact-checkers, who review content in more than 60 languages. When they rate something as false, Facebook reduces its distribution so fewer people see it and adds a warning label with more information for anyone who does. Facebook’s data shows that when a warning screen is placed on a post, 95% of the time people don’t click through to view it. The company also notifies the person who posted it, and it reduces the distribution of Pages, Groups, and domains that repeatedly share misinformation. For the most serious kinds of misinformation, such as false claims about COVID-19 and vaccines or content intended to suppress voting, Facebook removes the content outright.
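The graduated enforcement flow described above can be sketched in a few lines of code. This is a minimal illustrative model only, under assumed values: the names (`Post`, `enforce_rating`), the demotion factor, and the topic categories are all hypothetical, not Facebook’s actual systems or numbers.

```python
from dataclasses import dataclass

# Hypothetical constants, assumed for illustration only.
DEMOTION_FACTOR = 0.2  # demoted posts get a fraction of their normal reach

@dataclass
class Post:
    author: str
    topic: str
    distribution: float = 1.0   # relative reach multiplier
    warning_label: bool = False
    removed: bool = False

# Track repeat sharers so Pages/domains that repeatedly share
# misinformation can themselves be demoted.
strikes: dict[str, int] = {}

def enforce_rating(post: Post, rating: str) -> None:
    """Apply the graduated responses described above to a fact-check rating."""
    if rating != "false":
        return
    # The most serious categories are removed outright.
    if post.topic in {"covid_vaccine_claim", "voter_suppression"}:
        post.removed = True
        return
    # Otherwise: reduce distribution and attach a warning label.
    post.distribution *= DEMOTION_FACTOR
    post.warning_label = True
    strikes[post.author] = strikes.get(post.author, 0) + 1
```

For example, an ordinary post rated "false" keeps circulating at reduced reach behind a warning screen, while a false vaccine claim is removed entirely.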
Over the past several years, Facebook has invested in protecting its community and now has over 35,000 people working on these challenges. It has made progress thanks to significant investments in both people and technology such as artificial intelligence. Since the pandemic began, Facebook has used AI systems to take down COVID-19-related material that global health experts have flagged as misinformation, and then to detect copies when someone tries to share them. As a result, it has removed more than 12 million pieces of content about COVID-19 and vaccines.
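One simple way to detect re-shares of already-flagged content is to normalize the text and compare fingerprints, as in the sketch below. This is a toy assumption for illustration: production systems rely on far more robust ML-based similarity matching, and the `fingerprint` helper here is hypothetical.

```python
import hashlib

def fingerprint(text: str) -> str:
    """Normalize case, punctuation, and whitespace, then hash the result,
    so trivially altered copies map to the same fingerprint."""
    normalized = "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())
    normalized = " ".join(normalized.split())
    return hashlib.sha256(normalized.encode()).hexdigest()

# Fingerprints of content already flagged by health experts (example data).
flagged = {fingerprint("Example debunked claim about vaccines!")}

def is_known_copy(text: str) -> bool:
    """Return True if the text matches previously flagged content."""
    return fingerprint(text) in flagged
```

Exact-hash matching like this only catches near-verbatim copies; that limitation is why learned similarity models are used in practice.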
But it’s not enough simply to limit the misinformation people might see. Facebook also connects people to reliable information from trusted experts. It does this through centralized hubs like the COVID-19 Information Center, the Climate Science Information Center, and the US 2020 Voting Information Center; through labels attached to certain posts with reliable information from experts; and through notifications run in people’s feeds on both Facebook and Instagram.
Despite all of these efforts, some believe that Facebook has a financial interest in turning a blind eye to misinformation. The opposite is true: Facebook has every motivation to keep misinformation off its apps and has taken many steps to do so, even at the expense of user growth and engagement.
For example, in 2018 Facebook changed its News Feed ranking system to connect people to meaningful posts from their friends and family. It made this change knowing that it would reduce some of the most engaging forms of content, like short-form video, and lead to people spending less time on Facebook — which is exactly what happened. The amount of time people spent on Facebook decreased by roughly 5% in the quarter the change was made.
As with every integrity challenge, enforcement will never be perfect, even though it is improving all the time. While nobody can eliminate misinformation from the internet entirely, Facebook continues to use research, teams, and technology to tackle it in the most comprehensive and effective way possible.