Blog

Securing Big Tech Disclosures in 2024's Year of Democracy 

By Alexandra Pardal, Interim Co-Executive Director of Digital Action

This year, more than 2 billion people are voting in over 65 elections across the world, making this a pivotal Year of Democracy in the largest megacycle of elections of the internet age. But the biggest online platforms and messaging services aren’t rising to the occasion. Instead, Google, Meta, X, and ByteDance are largely taking a business-as-usual approach.  

This is happening in the context of mass tech layoffs and more intense and serious threats to election integrity and human rights, including more prolific and deceptive disinformation campaigns waged with AI-generated content. Aside from Meta, which made a single reference to one new global election policy, the Google, YouTube, and TikTok blog posts on 2024 elections talked solely about this November's US presidential election. The dozens of elections taking place in some of their largest and most fragile markets were ignored. The latest commitments to work on AI show the reactive nature of policymaking at Big Tech companies, given that deepfakes have been a known threat to information integrity for years.

Yet as the companies’ influence on information ecosystems in Africa, Asia, and Latin America grows, their impacts have become more serious, more extensive, and more frequent. Companies are barely investing in harm assessment and mitigation outside the US and English-language content. In recent years, we’ve seen online hate speech fuelling genocide against the Rohingya in Myanmar and violence against communities in India, Ethiopia, and Kenya; orchestrated doxxing and disinformation in Tunisia’s slide to authoritarianism; and the online mobilisation of mobs storming Brazil’s state buildings, in scenes reminiscent of the US Capitol attack. All this is boosted by algorithms, design features, and content moderation systems within the control of the companies whose platforms we use, invest in, and advertise on.

Insufficient transparency and oversight tools are also undermining election integrity efforts in real time. YouTube’s latest failure to provide transparency around political ad spending in Pakistan’s election is preventing scrutiny of potential breaches of campaign finance laws in a highly contested election that international observers said was neither free nor fair.

These problems are multi-country and endemic to how the platforms function. Left unaddressed, they also pose immediate commercial risks, which are being borne by advertisers whose reputations are damaged by association with harmful or illegal content. All of these tech harms can be traced to the decisions and investments, or lack thereof, of the tech companies themselves, including insufficient investment in localised policies, content moderation, and enforcement. But the harms are also having a cumulative effect across the globe, as part of a long-term trend: the declining legitimacy of democratic governments and institutions and the rise of authoritarianism and illiberalism, which undermine the rules-based international order on which global businesses thrive.

2024 offers a once-in-a-generation opportunity to address the deficiencies in Big Tech business models and governance, strengthening the connection between ad tech design, democratic rights, and the prevention of violence. That’s why the investment community is so critical, in a context of low accountability for Big Tech companies in large parts of the world.

Investors can ask for disclosures to enable honest conversations on how Big Tech companies can do better to protect people and elections. The Global Coalition for Tech Justice of over 200 organisations and experts globally, convened by Digital Action, sent ten key asks to tech companies last July, to which none responded adequately. We’re calling on investors to request full disclosures of the steps the companies have taken this year, including:  

  • Risk assessment and mitigation relating to 2024 elections in markets where they have users.
  • Trust and safety spending globally, regionally, and nationally, including the number of content moderators per language/dialect/region in Africa, the Middle East, Asia, and Latin America.
  • Information on the provision of the full spectrum of harm mitigation tools and measures for all elections, not just the US, operational at each stage of the electoral process including post-ballot where risks of post-election violence loom.
  • Substantive information on the extent to which they have increased local expertise, stakeholder engagement, and the resourcing of partnerships with fact-checkers, independent media, civil society, and other bodies that protect electoral integrity.
  • Publication, where permitted by law, of all contacts with governments and of all government and institutional requests to suppress speech or conduct surveillance.
  • The existence of, and plans to eliminate, policy exemptions for politicians, along with measures to ensure fact-checking of political ads.
  • Transparency of political advertising for oversight and legal compliance.
  • Strengthening of public oversight and transparency, including free or truly affordable data access and training for researchers, civil society, independent media, and election monitors.

We believe that investors can play a crucial role this year by leveraging their influence in favour of equitable and effective platform safeguards. Two billion people are depending on it for the protection of their democratic and human rights in this #yearofdemocracy.

Bio:

Alexandra Pardal is the Interim Co-Executive Director of Digital Action, leading campaigns, programmes, partnerships, and fundraising. She has over 24 years of experience in international non-profits and politics, leading campaigns, investigations, and policy reforms. Her work has generated positive impacts, including reforms to a billion-euro EU trade and development programme and financial restitution to communities in Africa, and has been reported on by international media outlets such as The Times, The Guardian, Le Monde, El Pais, and Süddeutsche Zeitung. Prior to working in non-profits, Alexandra spent a decade working in European politics. She is a native speaker of English, Spanish, and French, and a barrister with training in international human rights law.