It’s our 10th anniversary, which means that RDR has been holding tech companies accountable for a decade. In honor of this occasion, the RDR team held its first-ever in-person retreat to set our new strategic priorities! It was a fantastic opportunity to get the team together, and also the first time some of our remote team members met in person. Our team arrived from Washington, D.C., Barcelona, Paris, and Montreal.

While we marked this milestone for RDR, we in the digital rights community at large also find ourselves at an inflection point: Almost every week, tech companies announce the integration of artificial intelligence into their services. Our research is more crucial today than ever. RDR has always kept abreast of trends in the field, integrating algorithmic transparency and targeted advertising standards into our methodology in 2020. Yet the speed of AI’s evolution today is unprecedented, and it calls for us to evolve like never before.

With this in mind, our objectives for the retreat were:

  1. Examine both our accomplishments and our challenges over the past decade. Based on this assessment, determine how we can have the most impact moving forward in a rapidly changing tech landscape.
  2. Determine ways to refine our methods and standards (to allow for more flexibility and to enable new products, including mini indexes!) and consider how we can best strengthen our relationships with civil society partners and the investor community.
  3. Determine upcoming dates for the release of our next Corporate Accountability Indexes.


The Retreat: RDR’s Strategic Priorities 

On day 1, we met in Amsterdam and headed toward our venue.

Over the next three days, we identified five strategic areas that RDR will prioritize moving forward:

  1. Setting the standard: Our standards should remain relevant and enhance our collaborations with all stakeholders, including benchmarks, regulators, policymakers, investors, and ESG rating agencies. This means making sure our standards are flexible and can be deployed to evaluate new and emerging technologies.
  2. Fostering actionable methodology: We identified the need to make our methodology more flexible and efficient, and therefore adaptable both to new technologies and to a wider array of companies. We also explored ways of providing more targeted information to inform specific policy issues, further extending the work we have begun with our Scorecard “Lenses.”
  3. Catering to investor needs: RDR’s work with investors has grown in recent years. Over the past two years, our data has been employed to great success in a growing number of investor proposals. RDR has directly engaged in many of these. Some of the most successful include:

    a) A proposal at Alphabet (Google) that informed the company’s decision to terminate its FLoC targeted advertising project.

    b) A proposal at Meta calling for a human rights impact assessment of its targeted advertising business model, which became the most successful topical shareholder proposal in Meta’s history.

    This year we are once again actively partnering with investors to bring human rights issues to the table at companies like Amazon, Google, and Meta through shareholder proposals at their upcoming annual meetings. In particular, we conceptualized and developed the first-ever proposal on transparency reporting filed at Amazon (and likely at any e-commerce company). The success of these proposals, and of our investor work more broadly, has made clear the value of an investor-first approach to our work. For this reason, we will begin mapping out investor needs and set out to address them as directly as possible in our upcoming indexes.

  4. Growing the movement: The number of civil society partners with whom we engage has also grown, with RDR currently involved in supporting 12 adaptations covering 34 countries and 122 companies. This means a slew of new data has become available, including on subsidiaries of the companies we rank. This data can also be useful for the investor community, among others. We will work to both expand and refine the release of such reports.
  5. Communicating smarter: RDR will work to ensure that the findings that emerge from our data translate easily into media-friendly stories. We will also make sure the media knows that RDR staff can speak to a range of topics and trends beyond our Scorecards, including investors, Big Tech and telcos, targeted advertising and algorithms, and other important questions related to emerging technologies like AI.

Looking Forward: What’s Next for RDR?

RDR will spend the remainder of 2023 publicly reflecting on a decade of lessons learned in tech-sector accountability for human rights. But to keep up with the rapid advances in new technology, we are also taking advantage of our non-Index year to develop new standards for artificial intelligence, which we will share publicly at the end of this summer. We’re also taking stock of how our methodology and standards can be refined to best serve the needs of our stakeholders, including investors, companies, and civil society, when our next Corporate Accountability Indexes are released.

With this in mind, RDR is happy to announce that our next Big Tech Scorecard will be out in the fall of 2024, and our Telco Giants Scorecard will follow in 2025. Also look out for new mini indexes on emerging technologies, coming soon!

Stay tuned!

The conference season is ramping up with two of the most important digital rights events coming up: EngageMedia’s second DRAPAC conference next week and Access Now’s 12th annual RightsCon at the beginning of June! And RDR will be there.

Our sessions this year will focus on three essential issues: 1) bringing the tech accountability movement to the Majority World, 2) the potential of ESG standards for digital rights, and 3) the perils of targeted advertising. Two of these represent key areas for expansion as RDR looks to the future during its 10th anniversary year. And RDR has long been at the forefront of ringing the alarm on the perils of targeted ads, a growing area of public concern.

Come check out our sessions to: 1) hear from us on these important topics, 2) chat with us in person, and 3) connect with us to join the tech accountability movement!

DRAPAC23

The Digital Rights in the Asia-Pacific 2023 Assembly will take place between May 22 and 26 in Chiang Mai, Thailand. It is hosted by EngageMedia, an Asia Pacific-based non-profit promoting digital rights and open technology. The conference will bring together academics, journalists, advocates, and many others, as well as representatives from marginalized communities, to discuss the most pressing issues impacting digital rights in the region.

Check out the following sessions at DRAPAC23 to hear from, and connect with, RDR staff:

  1. “Adapting global standards to local contexts: Uncovering the policies and practices of digital platforms and telecoms in Malaysia, South Korea, and Taiwan”

When: Monday, May 22 (12:15 – 14:00 local time)
Where: CMU Convention Center – Auditorium
What: The Digital Asia Hub (DAH), Open Culture Foundation (OCF), and OpenNet Korea (ONK), in association with RDR, are launching three new research reports, analyzing the policies and practices of tech companies in Malaysia, South Korea, and Taiwan.

The organizations are part of a pilot initiative in which partners around the world have adapted Ranking Digital Rights’ Corporate Accountability methodology to analyze company policies in their local contexts. RDR and our partners will discuss key findings from the studies and look at how each organization approached adapting the methodology to suit its needs. This is the perfect case study for anyone considering their own adaptation project.

  2. “Research Against the Tech Machine: Building a community of corporate accountability researchers & advocates with Ranking Digital Rights”

When: Tuesday, May 23 (13:15 – 15:15 local time)
Where: CMU Art Center – Exhibition Hall 2
What: This workshop will introduce participants to the RDR Corporate Accountability methodology and the resources that can be used to study the policies of tech companies across the Majority World, tailored to unique local contexts.

Join the RDR team as we showcase our new Research Lab, including our Knowledge Center and Scorecard Toolkit. This set of tools is making it easier than ever to adapt our standards to create individual projects.

During the workshop, we’ll explain how to use these tools and resources and what to keep in mind when adapting our methodology. Our aim is to grow the community of researchers who are able to study local and regional tech companies outside of the U.S. and EU.

  3. “Facing tech power: using evidence-based research to hold tech companies accountable in Asia-Pacific”

When: Wednesday, May 24 (13:00 – 14:45 local time)
Where: CMU Convention Center – Auditorium
What: This session will lead participants in a discussion of regional trends in corporate accountability, based on the experience of adapting the Ranking Digital Rights research methodology to study countries across the Asia-Pacific region, and other regions in the Majority World.

Attendees will discuss existing challenges in studying the region’s technology companies. We’ll also look at strategies for engaging with companies and other key stakeholders (including regulators and policymakers) using research data from RDR adaptations. Finally, we’ll identify opportunities to collaborate between organizations in the region (and globally) on advocacy campaigns aimed at confronting tech power.

RightsCon

RightsCon, which bills itself as the “world’s leading summit on human rights in the digital age,” will take place between June 5 and 8 in San José, Costa Rica. It’s organized by international human rights organization Access Now. The conference also brings together a wide range of stakeholders, from tech companies to human rights defenders, for a civil society-led conference aimed at building a “rights-respecting digital future.”

We’ll be hosting four sessions at RightsCon on a range of issues. Come join us for:

  1. “Deep pocket inspection: the promises and perils of ESG standards for digital rights”

When: Tuesday, June 6 (10:15 – 11:15 am local time)
Where: Room Tortuga
What: During this workshop, participants will learn from members of the investor community about the ESG ratings ecosystem, including benchmarks like RDR. Facilitators will walk participants through a brainstorming session examining opportunities and challenges in this space. Together, participants will then develop a call to action aimed at decolonizing ESG.

  2. “Too small to count, too big to ignore: scrutinizing tech companies in the Majority World”

When: Tuesday, June 6 (16:30 – 17:30 local time)
Where: Room Guanacaste 2
What: While much of the digital rights community is focused on the impacts of big, international tech companies, smaller, local, and home-grown tech companies have faced inadequate scrutiny, effectively getting a free pass.

The Greater Internet Freedom Project, in collaboration with Ranking Digital Rights, has been conducting adaptations of the RDR Corporate Accountability Index methodology to assess and rank the performance of small and local companies around the world.

During this dialogue session, we’ll share trends and patterns from adaptations and assessments of almost 60 small and local tech companies in 17 countries across Central Asia, Sub-Saharan Africa, Latin America, and Eastern Europe, all conducted using RDR’s methodology. We’ll discuss avenues for developing advocacy strategies to hold tech companies, from small and local firms to those with an established regional business, accountable to international human rights standards. If you’re looking to hold tech power accountable at the local or regional level, we know you’ll learn a lot from joining RDR at this session.

  3. “Online content moderation: Let’s play ‘Social Media Against Humanity – Ads Edition’”

When: Wednesday, June 7 (9:00 – 10:00 am local time)
Where: Room Tucan
What: Social Media Against Humanity (SMAH) is a game where participants discuss content moderation on social networks and how its “rules” and parameters can be biased by each person’s beliefs. It was first presented at RightsCon 2021, hosted by Embajadores de Internet. The game is back this year, and, this time, it’s co-hosted by Ranking Digital Rights! There’s also a new twist: We’ll focus specifically on ads and their potential for promoting harmful or inappropriate content.

During the game, the facilitator will present 10 controversial sponsored social media posts (real or invented) and the audience will vote on, and discuss, what content moderation actions should be taken.

  4. “Targeting telcos: how carriers profit from personalized advertising, and what to do about it”

When: Wednesday, June 7 (11:30 am – 12:30 pm local time)
Where: Room Tucan
What: This workshop sheds light on some of the most powerful companies shaping our world from behind the scenes. Telcos — the companies that run your mobile plans and connect you to the internet — though less flashy than their Big Tech cousins, are also juggernauts of targeted advertising and surveillance infrastructure. Around the world, including in countries like India, Germany, South Africa, and the U.S., telcos are growing their ad businesses — and creating a full roster of human rights concerns.

This is an all-hands-on-deck workshop. An opening presentation will bring participants up to speed on the adtech activities of 12 of the world’s biggest multinational telcos, as well as their risks. Participants will identify knowledge gaps in this area, zero in on specific human rights threats, and chart pathways to more effective advocacy. Their ideas will be published in a blog post and incorporated into future Ranking Digital Rights and Check My Ads projects.

We invite anyone looking to hold Big Tech accountable to international human rights standards to join our sessions and connect with us in person. We’re particularly excited to connect with two key groups: 1) Majority World civil society experts and 2) investor activists interested in digital rights. Whether in Thailand or Costa Rica, we hope to see you soon!

The release of ChatGPT in November and the imminent reality of artificial intelligence permeating our lives like never before have redirected the public’s attention to the ever-growing power of new technologies and the companies that wield them. Today, as part of our mission to support civil society organizations holding tech power accountable around the world, we’re releasing two new resources: the RDR Knowledge Center and the RDR Scorecard Toolkit. Both are part of the RDR Research Lab, launched in October of last year: a hub for everything you need to learn how to use, adapt, and localize our approach to corporate accountability research.

In the Knowledge Center, researchers will find detailed information on how to apply each of our 58 indicators to assess company policies and their alignment with human rights. Anyone beginning a research project using this tool will be able to access detailed explanations for each indicator and its related elements, together with examples of how they have previously been translated into specific company scores. In addition, researchers can use this space to post comments and questions for the RDR team and other researchers.


Alongside the Knowledge Center, we’re also introducing our new Scorecard Toolkit, a web-based application where researchers can create, more quickly than ever before, the data management infrastructure needed to assess and score any company against our standards.

Previously, building data collection and management spreadsheets presented a high barrier for independent researchers who wanted to evaluate companies using RDR’s research methodology.

With this new tool, researchers will be able to set up the necessary materials to begin carrying out research in a short time, without needing specialized technical skills. They will be able to do so with the help of tutorials and explanations that will make following the RDR methodology only a tad more complicated than following your average cooking recipe.

More specifically, the Scorecard Toolkit will allow researchers to generate a data management infrastructure consisting of:

  • Data collection spreadsheets: These are generated for each company selected for the analysis and include the specific services chosen for evaluation. Each data sheet has one tab per indicator, where each individual element is assessed.
  • Scoring spreadsheets: These include two main components: a data table featuring all the scores calculated for every company and service selected, and individual company scoring tables. Using these scores, researchers will then be able to create bar charts and graphics to design their own scorecards showcasing companies’ performance. You can find tutorials and ideas in the “Analyzing your data” section of the Research Lab. (For a sense of how the data collection workbooks are structured, see the sketch below.)
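
To make that structure concrete, here is a minimal, hypothetical Python sketch (using the openpyxl library) of the kind of per-company data collection workbook described above: one tab per indicator, one row per element, and one column per selected service. The indicator names, element labels, services, and file name are illustrative placeholders rather than RDR’s actual identifiers, and this is a sketch of the concept, not the Scorecard Toolkit’s own implementation.

    # Hypothetical sketch: generate a per-company data collection workbook with
    # one tab per indicator, one row per element, and one column per selected service.
    # Indicator, element, and service names are placeholders, not RDR's real identifiers.
    from openpyxl import Workbook

    INDICATORS = {
        "G1 Policy commitment": ["Element 1", "Element 2"],
        "F1 Access to terms of service": ["Element 1", "Element 2", "Element 3"],
        "P1 Access to privacy policies": ["Element 1", "Element 2"],
    }

    def build_company_workbook(company: str, services: list[str]) -> None:
        wb = Workbook()
        wb.remove(wb.active)  # drop the default empty sheet
        for indicator, elements in INDICATORS.items():
            ws = wb.create_sheet(title=indicator[:31])  # Excel caps sheet names at 31 characters
            ws.append([f"{company}: Element"] + services)  # header row: one column per service
            for element in elements:
                ws.append([element] + [""] * len(services))  # cells filled in during research
        wb.save(f"{company}_data_collection.xlsx")

    build_company_workbook("ExampleCo", ["Service A", "Service B"])

A scoring spreadsheet could then be assembled along the same lines, by reading the completed element assessments back in and writing the calculated scores into a summary table.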

Until recently, the only option for many organizations who wanted to conduct RDR-like research was to go it alone, or almost alone, with ad hoc guidance from RDR. These new resources make it much easier to set up the research infrastructure and get expert guidance, which ultimately will help generate more corporate accountability research, faster. Already, these tools have allowed us to support recently published original research in Central and Southern Africa and South and Southeast Asia. Such studies set a baseline for tech accountability in markets where data about company practices is often nonexistent or hard to find.

Since I joined RDR as the Global Partnerships Manager in 2021, work has begun on 15 new corporate accountability studies, which in total evaluate more than 120 companies in 35 countries. By making our tools publicly accessible, we aim to encourage more civil society groups to carry out research that illuminates how companies behave in their particular environments, as well as how tech power is wielded across the Majority World.

As new and potentially perilous technologies permeate our lives, it has become more pressing than ever to scrutinize the companies behind the digital services we use. The launch of this triad of new research tools, including the Research Lab, is an important milestone not just for RDR but for the broader global movement to hold technology and telecom companies to account for respecting and promoting our human rights.

If you’re interested in carrying out your own research using our methods and standards, feel free to write us at partnerships@rankingdigitalrights.org.


Revelations about the harms of Big Tech, from misuse of customer data to social media-fueled violence, have produced a new era of scrutiny for the sector, including stronger regulation in the EU and fines for privacy violations by giants like Amazon and Meta. Many in the investment community have also begun recognizing the downsides of surveillance capitalism.

This shift has taken place against the background of explosive growth in ESG investing, which considers environmental, social, and corporate governance factors in investment decisions. Consulting giant PwC recently estimated the volume of ESG-labeled capital at $18 trillion worldwide, and there’s no sign of it stopping there. But, as we venture further into 2023, it is clear that ESG investing is facing increasing pressure from several directions.

In the U.S., conservative politicians have accused the ESG investing community of promoting “woke ideologies,” withdrawn investments from ESG-labeled funds, and banned state pension funds from using ESG criteria to guide their decisions. This effort to fire up conservative voters culminated on Monday, when Joe Biden issued the first veto of his presidency, blocking a Republican bill that would have barred retirement funds from considering ESG factors in their investment decisions.

House Speaker Kevin McCarthy accused the President of favoring “woke Wall Street over workers.” But once we cut through the noise of these bad-faith arguments, there are legitimate critiques that must be addressed. Paywalled scores, disparate standards, conflicts of interest, and ratings that don’t always end up reflecting corporate impacts all limit the value of ESG data for the protection of human rights.

That’s where RDR comes in. In a new piece by Jan Rydzak, RDR’s Company and Investor Engagement Manager, we lay out the value of non-profit human rights benchmarks for the ESG community and how they can help it hold companies to account. In contrast with many ESG data providers, RDR uses human rights frameworks and focuses on the rights that tech and telecom titans are most likely to enable or jeopardize: freedom of expression and privacy. These are rights that rating providers often overlook within the “social” category when scrutinizing the tech world.

Meanwhile, the meteoric rise of new AI tools has helped spark investors’ interest in their promises and perils. Growing awareness of these and other risks within the responsible investment community has helped shape and accelerate RDR’s work with investors. For years, investors have drawn on our data to inform a range of shareholder proposals that pressed companies to improve their practices. Over the last two years, we’ve taken on a more proactive role, working directly with shareholders, fellow civil society groups, and the Investor Alliance for Human Rights to launch proposals in key areas.

In 2022, a proposal at Meta we helped craft called for a human rights impact assessment of its targeted advertising business model. Nearly 80% of Meta’s shareholders (besides Mark Zuckerberg) voted in favor of this proposal at the company’s annual meeting. This was one of the strongest results in Meta’s history, edged out only by two perennial proposals to abolish Zuckerberg’s near-dictatorial power over the tech giant. In 2023, the investors who led it are planning to redouble their efforts. With RDR’s support, a second team of shareholders is driving a parallel effort on ad targeting’s human rights impacts at Google, which still controls more than a quarter of the Internet’s ad revenue.

Ahead of the tech industry’s 2023 annual shareholder meetings, RDR partnered with allies to develop a proposal at Amazon calling on the e-commerce giant to report on the censorship demands it receives from governments. The proposal has already helped bring about a watershed moment for investor advocacy: This year marks the first time that Amazon’s shareholders coalesced around a common digital rights theme in their proposals—a key signal of their fast-growing awareness of these issues and the risks they pose to people and communities.

We view this growing reckoning as a chance not for cynicism, but for renewal. We will remain at the forefront of the movement that’s reshaping ESG by integrating true human rights and transparency standards, ensuring companies are held to account. 

Read more about RDR’s unique value to ESG and human rights-focused investors →


Op-Eds From RDR Spotlight the Power of Telco Giants 

In December, we released our first-ever Telco Giants Scorecard, a ranking of the world’s most powerful telecommunications companies on their policies related to users’ fundamental rights, including freedom of expression and privacy. We argued that the effects of telcos on these rights, from mass surveillance to network shutdowns, had too often been neglected in recent years, particularly as we witnessed the rising power of, and public interest in, Big Tech.
Two op-eds we published at the end of December helped further spotlight the under-discussed power of telecom companies to trample over basic rights: 

  1. In the Thomson Reuters Foundation’s Context News, RDR’s Senior Editor Sophia Crabbe-Field published an op-ed based on our Telco Giants Scorecard findings explaining “Why telecom firms should care more about human rights.”
  2. For Slate, RDR’s Program Manager for the Corporate Accountability Index, Veszna Wessenauer, examined “The Tech Companies That Wield Even More Power Than Facebook or Google” by investigating, among other cases, the troubling sale of Vodafone in her home country, Hungary, to the government of authoritarian leader Viktor Orbán.

Sports and Surveillance After the Qatar World Cup

Late last year, RDR teamed up with pan-Arab digital rights organization Social Media Exchange (SMEX) to release “Red Card on Digital Rights,” a three-part series investigating the state of the internet and digital surveillance in Qatar as it hosted the 2022 FIFA World Cup.

During the games, we found opaque and worrisome policies for the country’s mandatory Hayya app that jeopardized the data of World Cup visitors, alongside massive surveillance, including 15,000 cameras deployed across eight stadiums.

This month, we released part three of the series, looking at the dangerous precedents set in Qatar, the massive surveillance plans already underway for the 2024 Paris Olympics, and how future host countries can balance safety considerations with a rights-respecting framework.


Looking to the Future, with Help from Proton

2022 was a big year for RDR: We released our first-ever Big Tech and Telco Giants Scorecards, and we were selected by the Proton Mail Community as one of 10 recipients of its 2022 Lifetime Account Fundraiser raffle, alongside fantastic allied orgs like Access Now and Fight for the Future.

We’re happy to say that Proton beat last year’s fundraiser, selling 68,000 raffle tickets and raising over $684,000, a portion of which will support RDR’s activities in 2023.

Thanks to this generous support, RDR is jumping into 2023 with the goal of ensuring our standards become increasingly accessible to anyone who might be able to use them to yield better human rights results in the tech sector, including policymakers, investors, companies, journalists, and civil society. But we’re focused on two groups in particular: 

  • Civil society organizations around the globe, particularly in the Majority World, and
  • Responsible investors, who use RDR’s standards as an important touchstone in their decision-making.

Working alongside both of these groups, we aim to ensure that 2023 is a memorable year for holding the tech industry accountable for protecting our rights online and off.

Read more about the Proton raffle and our goals for 2023 →


RDR Media Hits

Fast Company: Jan Rydzak discussed the drastic decline in transparency reporting from Twitter under Elon Musk’s leadership: “Musk strip-mined the core structures responsible for protecting human rights at Twitter and concentrated all decision-making power in himself, severing ties with the human rights community entirely.”

Read More at Fast Company 


American Geographical Society: The American Geographical Society interviewed RDR Director Jessica Dheere and Research Manager Zak Rogoff for a piece on their Ubique blog on how companies use personal information like location data.

Read More at Ubique


Upcoming Events

Mozilla Festival 2023 

MozFest 2023 will take place online from March 20-24. 

Register here

RightsCon 2023

RightsCon will take place online and in person in Costa Rica from June 5-8, 2023. Buy a ticket now with early bird deals or plan to attend online and hear from the RDR team and digital rights experts around the world! 

Sign up here

And, World Press Freedom Day is May 3, 2023!


Support Ranking Digital Rights!

If you’re reading this, you probably know all too well how tech companies wield unprecedented power in the digital age. RDR helps hold them accountable for their obligations to protect and respect their users’ rights.

As a nonprofit initiative that receives no corporate funding, we need your support. Do your part to help keep tech power in check and make a donation. Thank you!

Donate

 

Subscribe to get your own copy.

On January 20, Ranking Digital Rights (RDR) submitted a comment to the United Nations Educational, Scientific and Cultural Organization (UNESCO) expressing concerns about the organization’s proposed “model regulatory framework for the digital content platforms to secure information as a public good.” The draft will be further discussed this month at a UNESCO conference in Paris.

The proposed framework seeks to guide the development of national laws and regulations governing online speech on the largest platforms, including Meta and Twitter, while also proposing modes of self-regulation. Positively, it encourages regulation that requires content rules compatible with human rights, transparent processes for content moderation, and systematic risk assessments. It endorses the Santa Clara Principles on Transparency and Accountability in Content Moderation, which RDR helped develop. 

RDR is an independent research program and human rights organization based at the think tank New America. We evaluate the policies and practices of the world’s most powerful tech and telecom companies and study their effects on people’s fundamental human rights, primarily through our yearly Corporate Accountability Index. Using this research, we push platforms hard to increase transparency and improve their respect for human rights. We have also conducted in-depth research on the role of the targeted advertising business model, a key driver of today’s massive proliferation of harmful content. 

We therefore strongly share UNESCO’s concern about the prevalence of harmful content on digital platforms, including hate speech, harassment, doxxing, misinformation, and other types of content that damage freedom of expression and information, privacy, and other human rights. Much of this content disproportionately harms marginalized groups, creating additional barriers to their participation in civic discourse.

We also want to call attention to important problems with the proposed framework, and its development process, which will hamper its usefulness as a tool for addressing these shared concerns. These problems include:

 

  • Unclear mandate: It is not clear that development of this framework is within UNESCO’s mandate. Its development should therefore not proceed without a decision by the UNESCO General Conference, the organization’s chief decision-making body. UNESCO should also cooperate closely with the UN Office of the High Commissioner for Human Rights, which could ensure that the Framework does not inadvertently harm freedom of expression. 
  • Minimal consultation process: The draft was first published on December 19, 2022, with a deadline for public comment on January 20, 2023. This meant the public had only a month to provide their input, during a period when many people around the world are celebrating holidays. This truncated comment period disproportionately hampers organizations with limited resources, including those representing marginalized groups. Further, despite the fact that the framework is intended to be globally applicable, UNESCO does not seem to have proactively reached out to a diverse set of civil society stakeholders for input. 
  • Neglect of the role of targeted advertising: As RDR has documented, the incentives for the amplification of harmful content stem directly from the targeted advertising business model. Among other harms it facilitates, this model rewards content providers and advertisers for publishing content (paid and unpaid) designed to attract and keep users’ attention for as long as possible. This incentive perverts the value of the internet as a trusted information source by amplifying the most sensational and extremist content to generate page views and, thus, advertising revenue. The best way to address harms without encroaching on the right to freedom of expression is by protecting the data that is used to create such advertising, and by regulating how companies and advertisers are then able to transform this data in order to target messages and ads. At minimum, governments should require greater transparency and due diligence from tech companies about how ads are targeted and moderated. Unfortunately, the present draft framework calls only for encouraging—rather than enforcing—advertising transparency, and it does so only for political ads.
  • Lack of attention to inferred data about users: To expand the data available for targeted ads and user-generated content, platforms also often algorithmically infer information about their users. This inferred information has a high risk of being inaccurate and biased. Regulation should therefore require enhanced transparency around algorithmic inference, and the framework should incorporate guidance for how to mitigate the effects of inferred data on the amplification of hate speech, disinformation, and other information harms it seeks to address.

More areas of concern are discussed in our full comment.

We appreciate the intentions of the framework’s drafters. We agree, however, with comments made by our civil society partners, such as the Global Network Initiative and Article 19, who have argued that its development should not proceed unless these issues are fully addressed.