In my view: To tackle disinformation, we must uphold freedom of opinion and expression

This article is part of a series in which OECD experts and thought leaders — from around the world and all parts of society — address the COVID-19 crisis, discussing and developing solutions now and for the future. Aiming to foster the fruitful exchange of expertise and perspectives across fields to help us rise to this critical challenge, opinions expressed do not necessarily represent the views of the OECD.

This piece was first published in the OECD's Development Co-operation Report 2021: Shaping a Just Digital Transformation

Digital technology has transformed communications, creating unprecedented opportunities for people to exercise their right to information, voice their views and participate in democratic and development processes in multiple ways. Social media platforms have enabled marginalised groups to build solidarity networks, journalists to expose corruption and abuse of power, and human rights defenders to mobilise for change in real time. From online work to home schooling, from family communications to medical advice, access to the internet has been a game changer and a life saver during the pandemic.

At the same time, digital technology enables new pathways for disinformation: harmful, false or manipulated information created, disseminated and amplified for political, ideological or commercial motives at a scale, speed and reach never known before. Algorithms, targeted advertising and data harvesting on social media drive users towards extremist content in ways that feed and intensify disinformation, robbing individuals of their autonomy to freely select information and develop their own views.

Disinformation online exploits political, economic and social grievances in the real world, and contributes to polarising public debate, eroding public trust in factual, scientific information, inciting violence and hatred against minorities, women and vulnerable groups, threatening human rights, and disrupting democratic and development processes.

While disinformation is problematic, so too are the responses of many states. Several governments have sought to filter, throttle or block digital traffic and shut down websites. Many have introduced “false news” laws to criminalise and censor legitimate online content, or prosecute political opponents, journalists and human rights defenders. Not only are such actions disproportionate and incompatible with international human rights law, they are also short-sighted and counter-productive. By discouraging diverse sources of information, they hamper fact-finding, feed rumours, amplify misperceptions and undermine trust in public information.

Freedom of expression is not part of the problem. It is the primary means of combatting disinformation. For instance, people’s trust in vaccines is built not through censorship, but through access to facts and open debate among journalists, civil society, policy makers and experts discussing alternative viewpoints and challenging falsehoods and conspiracy theories.

Ensuring the benefits of technology to advance development and democracy while mitigating the risks of disinformation requires a partnership of states, companies, development partners and civil society to uphold human rights.

What does that mean? 

First, states should enhance their own transparency, proactively disclosing official data and ensuring that state institutions and political leaders do not spread or sponsor false information. Speech should not be criminalised except in the most egregious circumstances of incitement to violence or hatred. Any restriction of freedom of expression should be strictly in accordance with international human rights standards of legality, necessity, proportionality and legitimate aim.

Second, evidence shows that diverse sources of information, robust public information regimes and independent journalism are strong antidotes to disinformation. States should promote the independence, diversity and pluralism of the media, and ensure the safety of journalists and human rights defenders.


Read more: Turning the Tables: Using BigTech community standards as friction strategies by Vincent F. Hendricks, Professor of Formal Philosophy; Director of Center for Information and Bubble Studies, University of Copenhagen


Third, media, information and digital literacy should be part of national school curricula and adult education programmes to empower people and build their resilience against disinformation and misinformation.

Fourth, more investment must be made to close the digital divide so that people in developing countries can have meaningful, free, open, interoperable, reliable and secure access to the internet. Disparities in internet access are rooted in economic, social, political and cultural inequalities, including gender inequality. There is not just one divide but multiple divides to be overcome, and that requires a holistic, human rights-based approach to development.

Fifth, data protection is key to reorienting the advertisement-driven business model of the digital economy, which drives disinformation. States should adopt and enforce strong data protection laws.

Sixth and finally, the policies, practices and business models of digital platforms must be human rights-compliant. States should not compel companies to remove or block content that is legitimate under international law. Instead, they should focus on “smart” regulation that enforces transparency, accountability and human rights due diligence by companies, in line with the UN Guiding Principles on Business and Human Rights. For their part, companies should be more transparent about content moderation, including algorithmic transparency, ensure proper recourse for users, and align their business models, operations, policies and practices with the Guiding Principles.

Fighting disinformation online is ultimately about restoring public trust in the integrity of the information order. The best way to do that is by strengthening the right to freedom of opinion and expression.

Join us to discuss the findings of the OECD Development Co-operation Report online on 8 April 2022.

Read the OECD's Development Co-operation Report 2021: Shaping a Just Digital Transformation
