Technology, Privacy and Market Power: What policy makers must do next

We rely more than ever on digital tools, which we often pay for with our personal data. Policy makers are already making progress on privacy, cybersecurity and concentrations of market power – but with greater international co-operation, they can be even more effective.


This article is part of a series in which OECD experts and thought leaders from around the world and all parts of society address the COVID-19 crisis, discussing and developing solutions now and for the future.

To keep updated on all of the OECD's work supporting the fight against COVID-19, visit our Digital Content Hub.


COVID-19 uprooted many things in 2020, not least how we live our lives as workers, as citizens and as consumers. As the pandemic took hold, the rapid adoption of the digital tools available to us supercharged the changes we were already going through: greater flexibility in how we work, different approaches for governments to connect with citizens, and new ways for companies to reach their customers.

With take-up accelerating, so have the challenges. Digital technologies allow for business continuity, but give malicious actors new channels to target their attacks – allowing, for example, for the deployment of ransomware to take hospitals hostage. And while surveillance cameras, geolocation data and credit card purchase records can be useful for establishing COVID-19 transmission chains, they also raise significant privacy risks.

Privacy and cybersecurity risks are nothing new, but the digital acceleration has greatly intensified them. The “infodemic” problem – the flooding of information both accurate and inaccurate into the public sphere, which makes it harder to understand the essential facts – is not new either. However, coming up with appropriate remedies for our informational environment may prove a far more complex and long-lasting undertaking than tackling COVID-19 itself.

How should we think about the way we are harnessing the power of artificial intelligence (AI) and digital technology? What should policy makers consider when scrutinising the dangers? Why might the lack of an appropriate regulatory and legal framework around the harvesting and use of personal data be the biggest risk of all? To answer these questions and many more – and on the occasion of the 60th anniversary of the signature of the Convention that established the OECD – I had the honour of hosting a distinguished panel of some of the most prominent voices in the debate to discuss the challenges ahead.

The “attention economy” and the commodification of personal data

Opening proceedings, Dr Shoshana Zuboff, best known for her critique of Big Tech in her landmark book, The Age of Surveillance Capitalism, drew our attention to a simple fact: companies have an incentive to commodify and monetise personal data for the simple reason that it is very profitable. “We must first understand better the deep-rooted problem of an economic framework that has identified human attention as a scarce commodity, the so-called ‘attention economy’,” she explained.

There is a need for an upgrade to the ‘software of democracy’ that allows regulators to act with greater speed.

“We unknowingly and unwittingly allow technology companies to monitor every click, every move and every gesture, down to our use of exclamation marks, the stoop of our shoulders and our micro expressions,” she argued. Through access to millions of data points, they are able to harvest an almost infinite crop of information about us, about our preferences and our proclivities, from which “they can produce a great smorgasbord of predictive behavioural signals and targeting mechanisms that include subliminal cues and psychological micro targeting,” she continued. “It should therefore not come as a surprise that such predictive insights have conferred tremendous commercial and political advantage, whose effect on public discourse we are only just beginning to grasp”.

Listen to “What data and digitalisation could mean for your democratic future” with Shoshana Zuboff and check out more OECD Podcasts

What can policy makers and governments do? The furious pace of technological change risks leaving the policy-making world constantly playing catch-up. Alessandro Fusacchia, Italian MP and Chair of the All-Party Group on Artificial Intelligence in the Italian Chamber of Deputies, underlined the need for an upgrade to the “software of democracy” – existing methods and timeframes, more suited to the twentieth century, need to be adapted to allow regulators to act with greater speed and effectiveness. Alessandro Curioni, Vice President of IBM Europe and Africa, agreed: “the dynamism of transformational technology requires dynamic regulation,” he said.

Three key elements emerged from our deliberations:

  • the need to better understand market power in the twenty-first century

  • the need to bridge digital divides, notably to ensure access to education

  • the need for greater international co-operation to better understand AI and best practices in data governance

First, on the question of market power, big technology companies have emerged stronger from the pandemic, with a greater concentration of market power than before, exacerbating already stark differences in digital diffusion, as we have laid out in our OECD Digital Economy Outlook 2020. If not addressed, this could further widen the gap in performance between digital adopters and digital laggards – and further entrench market power.

Read the OECD Digital Economy Outlook 2020 and see all of our publications on the OECD iLibrary

“This is inextricably linked to the ownership of intangible assets, which in turn risks a post-COVID recovery focused on fewer jobs,” according to Financial Times Associate Editor and author of Don’t Be Evil, Rana Foroohar. Nonetheless, recent moves from regulators suggest they are beginning to take this seriously – and can, as Rana mentioned, “move fast and break things” too. In the US, a congressional probe has called for an overhaul of antitrust laws, while the Federal Trade Commission’s recent action against a number of Big Tech firms for anti-competitive practices indicates a shift in world view. Meanwhile, the European Commission’s proposed Digital Services Act and Digital Markets Act seek to update regulators’ toolboxes and clarify the responsibilities of digital platforms.

Second, policy makers must work hard to ensure that the digital acceleration does not further widen existing gaps in digital access. In particular, the pandemic underlined the importance of digital learning for children, many of whom risked being left behind while schools were closed. As Alessandro Fusacchia put it, “remote learning applications must respect the privacy of children, but it is also paramount they all have access to the opportunities digital tools bring.” This means addressing not only varying levels of connectivity, but also persistent skill gaps across demographic groups and countries: higher educational attainment or income levels typically mean greater access to knowledge, job opportunities, and health and education services. The pandemic has underscored the urgency of closing these digital divides. According to Frank Pasquale – who argues in his latest book, New Laws of Robotics, that we can harness digital technologies rather than fall captive to them, but only through wise regulation – the challenge for policy makers now is therefore to maximise inclusion, without doing so on terms that put it out to the highest bidder.

For many, technology is a Faustian bargain, privacy the price to be paid in exchange for its many conveniences.

Third, the question of closer international co-operation is an existential one for the OECD itself. We produced the first set of intergovernmental policy guidelines on AI, which went on to provide the basis for the G20 AI Principles, whose goal is to ensure that the design of AI systems is human-centred, robust, safe, trustworthy and fair. In addition, the OECD Global Parliamentary Network on Artificial Intelligence serves as a legislative learning hub, bringing together MPs from many countries and from across the political spectrum to share experiences and best practices, and to foster international legislative co-operation. The OECD’s Going Digital project helps countries formulate appropriate policy strategies for the digital age. Our focus in the next two years will be on strengthening data governance at national and global levels, on bolstering understanding of what “open”, “protected” and “controlled” mean, and on harnessing these efforts to achieve better social outcomes – while managing risks and stimulating innovation. These challenges have been magnified by the global nature of the COVID-19 pandemic and the corresponding need for international co-operation and cross-border exchanges of data.

The OECD has made efforts to overhaul international rules and frameworks on tax, notably towards establishing more coherent international rules that aim to reduce tax evasion and avoidance strategies, with big technology companies in particular under public scrutiny. With COVID-19 putting increasing pressure on public finances while driving forward the digitalisation of the economy – along with decreasing public tolerance for the tax-avoidance strategies of multinational enterprises – the stage is set for a political agreement on the G20/OECD Inclusive Framework on BEPS (base erosion and profit shifting).

Should the oil “stay in the ground”?

In looking for solutions, we must not lose sight of the problem we want to tackle – which brings us back to the question of private data and who should control it. Is data protection regulation enough? Declaring that “extraordinary times require extraordinary measures”, Dr Zuboff made the case for privileging individual sovereignty over personal experience – and for considering outlawing the collection of certain types of data in the first place. Drawing parallels with the battle against climate change, she posited that “if data is the new oil, the oil must stay in the ground.”

This debate is set to run and run. For many, technology is a Faustian bargain: they consider the cost a price worth paying in exchange for its many immediate conveniences. But this betrays the temporal challenge bedevilling economics, whereby costs accrue over the long term while rewards and gratification are instant. This has significant implications for how we understand the choices individuals make, and for policy not only in tech but also in climate, health, education and any number of other arenas. Psychologists and behavioural economists must bring their knowledge to bear here.

“We knew that a certain future was coming,” said Rana Foroohar, “but thought we had 20 or 30 years to deal with it.” We don’t: the future is already here. COVID-19 has accelerated the digital transformation. This, in turn, has brought its associated risks and challenges under intense scrutiny, providing fresh impetus for regulatory action. As Frank Pasquale puts it, we need to democratise the direction of technology; innovation itself can't just be a watchword to stop regulation. If we want to harness the power of the great digital acceleration, if we want to focus on positive social outcomes – on well-being and civil liberties – it is not enough to leave the driving to the machine. We must state our destination and grab the wheel. In this endeavour, the OECD will continue to work in close partnership with all stakeholders to chart the path ahead, to nurture and drive forward AI and innovation centred on people – to help deliver Better Policies for Better Lives.

Read more from Anthony Gooch on the Forum Network
