The war against #FakeNews: How misinformation about COVID-19 can spread quicker than the virus itself
Social media allows us to connect with others, share our opinions and tell our stories, but what if that story turns out to be #FakeNews? How easy is it to distinguish false claims from factual information – especially when you trust the person who has shared the content?
With the power of social media, misinformation can spread quicker than COVID-19 itself. The Director-General of the World Health Organization (WHO) declared that the COVID-19 epidemic is accompanied by an “infodemic” of misinformation. Previously, social media was accused of contributing to a measles outbreak in Washington by failing to stop the spread of fictitious reports about the dangers of vaccines. Health organisations are concerned that history could repeat itself and that misinformation “super spreaders” could cause a decrease in vaccination figures. A study published in Nature Human Behaviour found that, after people were shown online misinformation, the proportion who originally stated they would “definitely” accept a vaccine dropped by 6.2%.
Viewing this as a potential crisis, in November 2020 Labour MPs called for new emergency laws to “stamp out dangerous” anti-vaccine content, criticising social media platforms for simply not doing enough. Hashtags like #VaccinesAreDangerous have been seen trending across Twitter and TikTok.
How are social media giants responding?
Facebook
Nature Human Behaviour found that Facebook spreads fake news faster than any other social media platform. In light of this, Facebook have recognised the urgent need to promote news that is factual.
“False news is harmful to our community, it makes the world less informed, and it erodes trust. It’s not a new phenomenon, and all of us — tech companies, media companies, newsrooms, teachers — have a responsibility to do our part in addressing it.” – Facebook
In order to fight back, the social media giant has brought in a “viral content review system” to act as a circuit breaker. This means that when content is in its viral infancy, Facebook can restrict its spread while investigating its accuracy, preventing what could be a tsunami of misleading news.
As well as expanding its COVID-19 Information Centre to Instagram, Facebook have allocated millions in advertising credits to help health ministries, NGOs and UN agencies reach wider audiences.
Now that vaccine rollouts are becoming a priority in many countries, a new component has been added in the US to alert people when they are eligible to be vaccinated and point them to where they can go.
Twitter
Twitter has been partnering with health experts to help people find credible information and to encourage healthy conversation. For example, in partnership with Team Halo, UNICEF, the NHS and the Vaccine Confidence Project, Twitter launched the emoji hashtag #vaccinated to encourage people to get the vaccine.
More recently, they’ve accelerated efforts to stop the spread of inaccurate claims around the COVID-19 vaccine with a new strike system to suspend offenders. Accounts that continue to spread false or inaccurate claims may even be permanently banned.
In addition, they’ve introduced new warning labels for tweets identified as containing false information, and users will be prompted to ‘find out more’ when attempting to share content that’s potentially inaccurate.
YouTube
A study published in BMJ Global Health found that 1 in 4 of the most viewed English-language COVID-19 videos on YouTube contained misleading or inaccurate information. To combat this, YouTube have created a ‘COVID-19 medical misinformation policy’, which sets out examples of content that will be removed if posted – for instance, claims that the COVID-19 vaccine contains a microchip or tracking device, or that COVID-19 is caused by radiation from 5G networks.
Similarly to Twitter, YouTube have also implemented a strike system: after a first warning, channels receive a strike each time misinformation is found to have been posted, with channel termination after the third strike.
Is it too late?
Following extensive research showing that social media content can sway people’s opinions of the COVID-19 vaccine, the Government have recently stepped up their efforts to spread positive messaging about it. Mirroring the approach that many brands take when looking to increase their reach, they are recruiting influencers – or in this case famous pop culture figures, such as Elton John and Sir Michael Caine – to amplify their campaign and minimise any vaccine doubts.
Overall, it’s great to see social media companies stepping up in the fight against #FakeNews. Some of the tougher measures may seem harsh, but hopefully they will prompt users to reconsider what they’re sharing. Social media platforms will continue to argue that their channels were created to voice opinions and express feelings, but when it comes to a worldwide pandemic with such devastating impacts, preventing false information from spreading is critical – and platforms must find rapid, robust solutions to tackle it.