In the aftermath of the US elections held on the 8th of this month, many blamed Facebook for the way the election turned out, and if you’ve been watching TV lately, you’ve probably heard a lot about it. In light of this, someone developed a Chrome extension that helps check fake news on Facebook by flagging it, and just days after that report, Facebook is now officially and finally moving to tackle the issue.
While the Facebook CEO insists that the amount of fake news on the platform is still relatively small, the company feels it needs to act on recent reports that it played a role in the election. Some months ago, it was reported that Facebook had fired the human team responsible for identifying and fixing fake news issues on the site, only for us to hear months later that the algorithm it was banking on to do the job has not been all that effective.
Mr. Zuckerberg has now identified seven ways he thinks the issue of fake news can be dealt with once and for all:
- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.
About 62 percent of adults in the United States rely on Facebook for news, with around 18 percent of them doing so frequently. This is significant because just four years ago, only about 49 percent of American adults said they relied on news they read on Facebook. You can understand why it matters that what people read on Facebook is actually authentic and not just some form of hoax. In June, the Washington Post reported that about 6 in 10 people will share links without even reading what the linked report contains. That’s a whopping 60 percent of people, and that’s big.
The Post’s piece led with a satirical headline, “Study: 70% of Facebook users only read the headline of science stories before commenting,” and went on to note: “Nearly 46,000 people shared the post, some of them quite earnestly — an inadvertent example, perhaps, of life imitating comedy. Now, as if it needed further proof, the satirical headline’s been validated once again: According to a new study by computer scientists at Columbia University and the French National Institute, 59 percent of links shared on social media have never actually been clicked. In other words, most people appear to retweet news without ever reading it.”
In another Washington Post report, Facebook fake news writer Paul Horner said he makes about $10,000 a month publishing content he knows appeals to people’s beliefs and ways of life. By promoting such stories to targeted groups of people on Facebook, he’s able to connect with the users most interested in what he writes. So, given that 62 percent of adults in the US rely on Facebook for news and around 60 percent will share such stories without actually reading them first, you can see how viral Paul Horner’s headlines on Facebook can go.
So a headline like “Hillary Clinton likely to be indicted by the FBI-sources”, from a reporter who claimed he had spoken to sources within the FBI but later apologised and retracted the story, is capable of changing a lot of minds. After all, who wants a president who is under FBI investigation, right? Days later, the FBI dismissed the claims and ultimately closed the case, so it turned out this was yet another fake news story that may have impacted the election in ways we may never know.
But publishers sometimes partner with the likes of Taboola and Outbrain to make up for revenue lost in digital advertising, partly because many people now use ad blocking services, which are projected to cost publishers about $35b annually by 2020. The Wall Street Journal reports that Taboola and Outbrain generate close to $500m in revenue by providing “Recommended” links at the bottom of partner sites. By now we know that some of the sites those links lead to are not exactly authentic. What plans Facebook has for such sites, and whether it will re-hire human curators to work alongside its algorithm to fight fake stories on its platform, remains to be seen.