The Forum Network is a space for experts and thought leaders—from around the world and all parts of society—to discuss and develop solutions now and for the future. It aims to foster the fruitful exchange of expertise and perspectives across fields, and opinions expressed do not necessarily represent the views of the OECD.
Calls are growing louder to regulate artificial intelligence and social media and to counter disinformation. But how can democracies govern the information environment if they don’t know how it affects people’s thinking and behaviour?
The information environment is the space where people receive and process information to make sense of the world. To do that, we use our own brains, but we also build tools, from alphabets to AI, to process information into artefacts that can be shared in forms like the written word, holograms and everything in between. The information environment is the sum of people as agents, the means we create to produce and process information, the artefacts created from that process and the interrelationships between them all. Confusing this complex system with one or two new technologies or apps will not lead to useful regulations or measures to counter disinformation.
To do better, we need better research. Currently, our “knowledge” of the information environment is based mostly on case studies and experiments.
Case studies document seemingly endless examples of threats in the information environment, like disinformation. They also focus on what’s hot in the media, like elections, wars, or the latest technology, or on what’s available to study, as Twitter data used to be. But that’s like trying to understand a forest by studying just the commercially viable trees, perhaps balsam firs and the spruce budworms that kill them, in one season rather than studying the whole forest for years.
The other leading type of research involves experiments to assess the effects of things like social media or test the efficacy of interventions like fact-checking, warning labels, media literacy or deplatforming offending users from platforms. While this work is important, much of it can’t be replicated. Moreover, these experiments are often conducted in isolation from the wider system where they occur – the information environment. It’s a bit like using a pesticide like DDT to eradicate the spruce budworm, while ignoring the fact that the pest self-regulates its population without destroying all the firs. The DDT might work at getting rid of the budworms, but at what cost to the system?
Relying on experiments and case studies alone will rarely tell us exactly what is causing problems or how to redress them. Here, there is much to learn from the study of nature – and from how the effects of our interventions within it may extend beyond the obvious.
In the middle of the last century, ecologists discovered that the eggshells of peregrine falcons were thinning, and fewer than one-fifth of breeding pairs successfully raised young. Thanks to historic egg collections, researchers could connect the start of the thinning to the introduction of pesticides like DDT. To test whether DDT was the cause, researchers fed it to falcons in lab experiments – but it didn’t lead to egg thinning. That’s because birds of prey rarely consume DDT directly; they ingest its breakdown product, DDE, which accumulates up the food chain. Sure enough, DDE was found in falcon eggshells collected in the wild, and subsequent experiments using DDE produced very different results. Lab experiments alone weren’t enough; researchers needed historical collections of eggs to compare against and an understanding of how the ecosystem worked to deduce how falcons were actually exposed to DDT. To understand a complex system, one needs not just experiments but consistent measurements of the same things over time, structured field observations, analysis of processes, and a systematization of research covering a variety of methods to bring it all together.
Currently, research on the information environment looks a lot like the lab experiment that fed falcons DDT directly. It aims to prove that the new thing introduced into the information environment is problematic. Take, for example, experiments aiming to prove that social media causes polarization. With challenges in replicability and a lack of understanding of how the information environment works, these experiments end up cancelling each other out: some claim social media causes polarization, others refute this. The lack of consensus opens the door to doubt, which some actors – usually those accused of causing the problem – exploit to their benefit. Indeed, some groups still point to the initial failed lab experiments on DDT to argue for reintroducing the pesticide.
This situation is a policy nightmare.
What’s needed is a field to study the information environment – information ecology.
Even if policymakers have the will to tackle problems in the information environment, do they really know what needs to be done? They are pressed to come up with practical solutions in real time. They are unlikely to have the time to sort through reams of studies to find consensus, nor do they necessarily have the scientific training to distinguish sound results from flawed ones. Moreover, continuing to rely on case studies and experiments alone leaves a gaping hole in our knowledge of what makes a healthy information environment in a democracy, or how that can be measured. Here again, much can be learned from how the physical world is studied.
Scientists have been studying the physical environment and the ecosystems within it since the 1800s. For decades, and across the globe, climate scientists have been measuring temperatures, water levels, plant growth patterns, air quality and more. What if the answer to unlocking the mysteries of the information environment is more straightforward than assessing the impact of algorithms on election choices – at least to start? Maybe it begins in much the same way ecology did: observing and measuring key factors within ecosystems, then cataloguing and structuring that data over time to identify patterns and changes. The question then becomes what can be measured consistently over time. This might be as simple as measuring demographics, literacy and numeracy rates, and cataloguing the types of means available to process and share information and who controls them. Taken together, building up data on categories of factors over time offers a way to study and compare different types of information ecosystems.
Before democracies can talk about governing the information environment, it must first be understood. That means studying it like the system it is. This will take consistent observations and measurements of the same things over time and the identification of factors that differentiate types of information ecosystems. Ultimately, it means finding the information environment’s equivalents to temperature, air quality, and plant growth, and systematizing measurement of those conditions over time. That level of consistency is needed to produce the evidence for policymaking, helping put experiments into context and informing how they are designed. The scale of that effort shouldn’t be underestimated – especially if democratic societies want answers as quickly as possible. Fast-tracking insights requires the same level of coordination and investment that went into unlocking the mysteries of the physical world at CERN, the European Organization for Nuclear Research. Establishing an independent, multinational research facility would enable multistakeholder collaboration to advance our understanding of the information environment.
Youyou, Wu, Yang Yang, and Brian Uzzi, "A discipline-wide investigation of the replicability of Psychology papers over the past two decades," PNAS 120 (6) e2208863120 (January 30, 2023) https://doi.org/10.1073/pnas.2208863120; Samuel, Sigal, "Lots of bad science still gets published. Here’s how we can change that," Vox (December 6, 2022) https://www.vox.com/future-perfect/23489211/replication-crisis-project-meta-science-psychology; Baker, Monya, "Over half of psychology studies fail reproducibility test," Nature (August 27, 2015) https://www.nature.com/articles/nature.2015.18248
Learn more about the OECD event: Tackling disinformation: Strengthening democracy through information integrity
This conference will bring together representatives from government, digital platforms, media, academia and civil society to identify effective policy responses to the urgent challenges our democracies face in the information space. Conversations will focus on government architecture and coordination mechanisms, efforts to build societal resilience, and regulatory responses. By focusing on tangible questions around what works and why, the conference will also set the stage for working toward OECD guidelines on the issue and provide the occasion to discuss expanding the OECD DIS/MIS Resource Hub’s engagement and reach.