
Investor Asks on Digital Rights and AI Accountability

Under the UN Guiding Principles on Business and Human Rights, all companies are expected to conduct effective human rights due diligence to ensure that they are not causing, contributing to, or linked to human rights abuses through their direct or indirect actions. This includes actions within the digital space that negatively affect people's rights online, such as privacy, non-discrimination, and freedom of expression.

Investor Expectations on Digital Rights and AI Accountability

As set out in the Investor Statement on Corporate Accountability for Digital Rights, 176 institutional investors with over USD 9.2 trillion in assets under management urge tech companies to: 

Commit to and implement human rights due diligence mechanisms that identify how freedom of expression, privacy, and user rights may be affected by the company's full spectrum of operations through effective engagement with affected stakeholders;

Maximize transparency on how policies are implemented by disclosing comprehensive and systematic data that enables users - as well as investors, researchers, policymakers, civil society, and other third parties - to have a clear understanding of how platforms and services restrict or shape speech and how they assess, mitigate, and provide redress for risks to users;

Give users meaningful control over their data and data inferred about them, including providing clear options for users to decide how their data is collected and used, for what purpose, and how to access remedy when needed; and

Account for harms from artificial intelligence by disclosing the development and deployment of algorithmic systems and targeted advertising, publishing and updating policies that specify where these systems are used and what rules govern them, and releasing data relevant to the protection of digital rights in both areas.

If you are interested in participating, please sign on to this investor statement.