Image courtesy of Mike MacKenzie, licensed under Creative Commons (CC BY 2.0)
This year has seen the proliferation of fake news across the internet. It comes in many shapes and forms, such as memes, links, and fabricated news stories, and it spans a wide range of topics, from pop culture and politics to serious issues such as natural disasters. As one of the major sources of news and a powerful medium for sharing, Facebook has taken part of the blame for the widespread circulation of fake news. In response, since December last year Facebook has taken steps to curb its spread, from redesigning News Feed and partnering with third-party fact checkers to adding red flags to potentially false stories. Below are the changes made in News Feed to identify and reduce the spread of false news:
- easy access for users to report stories that look like fake news
- use of machine learning to identify potential false news (see the sketch after this list)
- third-party partners to fact-check these articles
- reduced distribution of articles disputed by fact-checkers
- several features that notify people who are about to share or have shared fake news (so they can take corrective or preventive actions)
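Facebook has not published how its machine-learning step works, so the following is only a rough, hypothetical illustration of the general idea: a simple text classifier that scores stories and routes likely hoaxes to human fact checkers. The toy training data, function names, and threshold are all assumptions, not Facebook's system.

```python
# Hypothetical sketch: flag stories that *might* be false so human
# fact checkers can review them. This is NOT Facebook's model; the
# training data and threshold are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled headlines: 1 = previously debunked, 0 = legitimate.
headlines = [
    "Celebrity endorses miracle cure doctors don't want you to know",
    "Government secretly replaces moon with hologram",
    "City council approves new budget for road repairs",
    "Local hospital opens new pediatric wing",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, labels)

def needs_fact_check(headline: str, threshold: float = 0.5) -> bool:
    """Return True if the story looks suspicious enough for human review."""
    prob_false = model.predict_proba([headline])[0][1]
    return prob_false >= threshold

print(needs_fact_check("Miracle cure revealed by anonymous celebrity"))
```

In practice a system like this would only prioritise stories for review; the final judgement still rests with the third-party fact checkers mentioned above.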
None of us likes to share fake news with our friends or family. Yet at some point we may have fallen for this bait: hoaxes crafted to damage an organisation, to advance a political or financial agenda, or simply to drive traffic through sensationalism. Sometimes it is just not easy to tell real news from false news.
Facebook's own research found that misinformation is a major concern for people across many backgrounds, and the company has since made addressing it a priority.
After a year of rolling out this strategy, Facebook did not only try to identify and reduce the spread of fake news; it also studied how people react to the changes, for example, what happens when a red flag shows up. They learned several things:
- The strong visual of a disputed flag can backfire: it can put the false story in the spotlight and further entrench someone's existing beliefs.
- It takes several extra clicks and actions before people can see what fact checkers said about a disputed story.
- Because two third-party fact checkers must rate a story false before the flag is added, it takes longer for Facebook to surface related articles, which partly defeats the purpose: the false news may have already spread before legitimate articles can mitigate it.
- The disputed flag only worked for "false" ratings, yet fact checkers also use labels such as "unproven", "partly false", or "true", and people want more context regardless of the rating.
To address these limitations, Facebook tested an improved version of Related Articles in April. Related Articles appear in News Feed before someone clicks on an article, giving people more context on the topic and enabling them to make a more informed decision about what they read, trust, or share.
In August, they found progress: there were fewer shares of hoax articles, though click-through rates had not changed much. Facebook also received positive feedback from people who noticed the change: the added context is easier to find, it works for other ratings as well, and it does not provoke the negative reactions that a red flag can.
Unlike the disputed flag, this approach applies to articles reviewed by just one fact checker, which means crucial context reaches people much faster: as soon as an article is marked false by at least one fact checker. To strengthen the drive against false news, Facebook also added a badge for fact checkers so people can easily recognise them. On top of these changes, the strong parts of the original experience were kept. As before, when an article is disputed by fact checkers, people who previously shared it are notified, and if someone goes to share the content, a pop-up message lets them know there is additional reporting on the story. Facebook believes that using unbiased, non-judgmental language makes the experience more inclusive of people with diverse perspectives.
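The workflow described in this paragraph, a single fact check triggers Related Articles, notifies previous sharers, and warns new sharers, can be sketched roughly as below. The data model and function names are assumptions made purely for illustration; they are not Facebook's implementation.

```python
# Hypothetical sketch of the sharing workflow described above: once at
# least one fact checker reviews a story, related articles are attached,
# previous sharers are notified, and new sharers see a pop-up message.
from dataclasses import dataclass, field

@dataclass
class Story:
    url: str
    fact_checks: list = field(default_factory=list)      # e.g. [("CheckerName", "false")]
    related_articles: list = field(default_factory=list)  # fact-check article URLs
    sharers: list = field(default_factory=list)           # users who already shared it

def notify(user: str, message: str):
    """Stand-in for whatever notification channel the platform uses."""
    print(f"[to {user}] {message}")

def record_fact_check(story: Story, checker: str, rating: str, article_url: str):
    """A single fact check is enough to attach context and notify past sharers."""
    story.fact_checks.append((checker, rating))
    story.related_articles.append(article_url)
    for user in story.sharers:
        notify(user, f"{checker} has rated a story you shared as '{rating}'.")

def attempt_share(user: str, story: Story):
    """Before sharing, show a pop-up if the story has additional reporting."""
    if story.fact_checks:
        notify(user, "There is additional reporting on this story. You can still share it.")
    story.sharers.append(user)
```

Note how the pop-up uses neutral wording and does not block the share, mirroring the non-judgmental approach the article describes.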