Since the Coronavirus outbreak, there has been a steady stream of misinformation and conspiracy theories about a possible vaccine to eradicate the virus. YouTube announced this week that it will take down videos containing false information about COVID-19.
The greatest challenge facing social networking sites is misinformation. News spreads faster on social media than through traditional media, and isolating the effects of misinformation can be very difficult. Users rarely share both accurate and inaccurate information about the same event, which is why social networking sites must stay alert and take down false information without trampling on users’ freedom of speech. Human biases often play a major role here: we are naturally more likely to react to content that resonates with our existing grievances and beliefs. A user who already doubts the virus is more likely to believe and share a video claiming that a Coronavirus vaccine can cause infertility.
Facebook, for its part, banned ads that sought to misinform users back in February.
Some videos on the video-sharing platform promote claims about COVID-19 that contradict the consensus of local health authorities or the World Health Organisation. In an email, YouTube said it would remove claims that the vaccine could cause infertility or that microchips would be implanted in people who receive it.
In recent years, vaccines have become unpopular in many countries amid claims that they can cause other avoidable ailments. Overall, however, vaccines are known to have gradually slowed diseases such as polio, which was once widespread in Africa.
A YouTube spokesperson said that general discussions about the virus and the vaccine would remain on the platform. Some users have shared claims disputing the existence or transmission of the Coronavirus and promoted medically unsubstantiated treatments, discouraging people from seeking medical attention when they begin to exhibit symptoms similar to those of carriers of the virus.
Professionals advise self-isolation and social distancing, yet a series of videos has claimed that certain medications can counter the virus. YouTube said it had removed over 20,000 videos containing dangerous or misleading COVID-19 information since early February.
The World Health Organisation’s manager of digital solutions, Andy, said the WHO plans to meet weekly with YouTube’s policy team to discuss content trends and problematic videos. He said he was pleased by the platform’s announcement on COVID-19 misinformation; according to him, it shows that the video-sharing platform understands its huge influence and is working to curb the spread of misinformation.
YouTube also said it had been restricting some borderline videos on its platform, but did not cite examples. In April, the platform banned videos pushing conspiracy theories that linked 5G technology to the virus.
The Coronavirus took the world by surprise early this year and has since killed more than a million people, crippled the global economy and infected over 38 million people. Drug makers and researchers are currently working on various treatments and vaccines to stop the deadly virus.