

In 2018, the Cambridge Analytica scandal helped propel the perils of surveillance capitalism into the mainstream. The following year, the release of Shoshana Zuboff’s pivotal book, The Age of Surveillance Capitalism, cemented the issues of data privacy and targeted advertising as top problems of our time, not just for a bevy of experts but for the public at large. It was in this context that Ranking Digital Rights released its first major report, It’s the Business Model. The report argued that the rampant misinformation and hate speech being perpetuated by social media companies were not the sole product of a lack of content moderation, and therefore could not be addressed through intermediary liability reform (in other words, by getting rid of Section 230).

Rather, the report argued that the pathologies of the online environment were the downstream result—a negative externality, in economic terms—of the incentives created by the industry’s targeted-advertising business model: collect and monetize all data, automate everything, scale up, and wait for the profits to roll in.

The report was influenced by recent changes that RDR had made to its methodology, as the consensus around these trends and their pervasiveness in the industry began to solidify. These changes included the addition of new indicators on algorithms and targeted advertising. As the report’s lead author, and Ranking Digital Rights’s former Policy Director, Nathalie Maréchal recalls, “the Big Tech business models had all kind of started to converge toward the collection and monetization of data, either for the purpose of advertising or for the purpose of AI development.” For these companies, the acquisition of data became both “a business imperative, and also an ideological imperative.” 

This was different from how things were back when the methodology for the first RDR Corporate Accountability Index was conceived in 2013. At the time, most of RDR’s indicators evaluated either “things that companies were doing at the behest of governments or things that basically amount to negligence [for example, poor data security].” But, since then, it had become clear, both to Maréchal and to RDR Founder Rebecca MacKinnon, that companies also made a lot of decisions based purely on their own self-interest. Meanwhile, Nathalie found herself fed up with the reigning policy discourse in D.C. and Brussels at the time, which gave the impression that “the only thing wrong with social media is that CEOs are insufficiently motivated to do content moderation correctly.”

Sara Collins, Senior Policy Counsel at Public Knowledge, agrees. For a while, most D.C. policy discussions about how platforms “may spread misinformation and threaten democracy” would, reflexively, also turn into discussions of “how to get rid of Section 230.” As she explains, the report helped “thread the needle about why [data collection] has residual content harms.” This is especially important for organizations like Public Knowledge, which places a strong emphasis on free speech online.

Nathalie recalls a metaphor MacKinnon shared with her at the time: Performing only content moderation is like trying to remove pollutants in a stream using only a pipette. It was clearly going to be insufficient to clean up the polluted lake that was the vast networks of disinformation across platforms. The It’s the Business Model report was conceived of as part of a necessary narrative shift and drew on RDR’s new indicators to strengthen the connection between the business model and the harms RDR was observing directly through its company evaluations and close relationships with global civil society organizations.

Changing the Conversation: The Report Comes Out

Unfortunately, the report’s release event was planned for March 17, 2020, just as the COVID-19 pandemic took hold, and the launch had to be canceled. Even so, the report had important implications across the policy sphere, with a number of allies reporting that it decisively shaped their thinking about the business model.

Jesse Lehrich, Co-Founder of Accountable Tech, explained that “the It’s the Business Model report was really critical and ahead of its time as far as moving the advocacy community and policymakers to think beyond content moderation and deplatforming.” He describes the report as “formative” in shaping a lot of his organization’s work and, in particular, their “Ban Surveillance Advertising” campaign, which brought together over 50 organizations around the globe. A focus on the surveillance advertising business model has also served as a way to “break down silos” between different parts of the advocacy community, Jesse points out. 

Privacy advocates, civil rights groups, and anti-monopoly activists are sometimes at odds; but the business model was something they could all coalesce around. And this remains true as the community begins to grapple with the potential impacts of AI. In fact, the AI Now Institute, in a recent report, referenced Accountable Tech’s campaign to ban surveillance advertising as an important model. Though the arguments in RDR’s report spoke most clearly, at the time, to surveillance advertising, Nathalie Maréchal agrees that “today we see the same cold logic applied in the world of artificial intelligence and automated decision-making.”

Meanwhile, Jesse believes the report also played an important role in galvanizing legislation and regulatory frameworks that have come about since. He points, for example, to the inclusion of bans on targeted advertising of children and on the use of sensitive data for targeting in the EU’s Digital Services Act (DSA) as the kind of regulatory response that was made possible thanks, in part, to RDR’s work. Sara Collins agrees, noting that, “I do really think that [the report] has shaped how people are talking about the content space. You still obviously get the Section 230 bills, but now that’s not the only solution put forward.”

At the time of the report’s release, Anna Lenhart was working on tech oversight for Representative David Cicilline on the House Judiciary Committee. One prime area of focus for her was ad targeting and ad libraries, and understanding what kind of information is useful for measuring discrimination in ad targeting. By 2021, Anna was advising on a number of potential bills, including Congresswoman Lori Trahan’s Social Media Data Act, that took aim at the surveillance advertising industry. In particular, this act mandates that companies keep thorough ad libraries to help bring about transparency in ad targeting. One thing Anna was looking for while conducting her research was reports that provided examples of potentially problematic advertising and ad campaigns. And this was something she found in the Business Model report. “It’s always really helpful [to have examples] when you’re trying to tell the story to constituents or briefing members of Congress,” she explains.

That year, Nathalie was called upon to testify on the Hill, and Anna requested her expertise during several meetings while the Congresswoman and her staff worked to craft the bill. Notably, Anna’s former boss, Congressman David Cicilline, himself made several references to the “business model” during an antitrust hearing, while grilling leaders of the major Big Tech companies. Finally, at the international level, UN Special Rapporteur Irene Khan referenced the “business model” in an important report for the UN Human Rights Council on “disinformation and freedom of opinion and expression.”

Though the report was released at a time when other events and thinkers were helping to shift the conversation, RDR’s report played an essential role, at a pivotal moment, in further popularizing the idea of the “business model” as the real root of the growing problems of mis- and disinformation. Its release came at just the right time to help galvanize policymakers and civil society alike and to create a lasting imprint on ongoing policy conversations, which have taken on new meaning and urgency with the growing AI arms race now upon us.


From its inception, Ranking Digital Rights’s standards and methodology were designed with investors in mind. Indeed, our Corporate Accountability Index was devised almost a decade ago alongside ESG ratings provider Sustainalytics. Since then, RDR has aimed to ensure our standards would be usable for responsible investors interested in tackling growing concerns around the regulatory and human rights risks linked to Big Tech. As RDR Founder Rebecca MacKinnon mentioned in our inaugural Investors Research Note in 2017, though “digital rights issues [had] been hiding in plain sight for more than a decade,” the complexity of the issues involved had made “it hard for many investors to recognize the potential significance of specific abuses or to track evolving performance standards.” These sentiments were recently echoed by former RDR Investment Engagement Manager Jan Rydzak, who explained that benchmarks like RDR continue, today, to “highlight companies’ impact on rights that have often been neglected by existing ESG frameworks.”

In the investor community, “there’s this traditional view of what human rights are and what impacts human rights, including supply chain issues and worker safety issues,” Lydia Kuykendal, Director of Shareholder Advocacy at Mercy Investment Services, explained. “A lot of investors, a lot of people that do our work, have those more traditional views and do not feel comfortable with any type of tech, let alone cutting edge tech,” she continued. Therefore, having the kind of support that RDR provides has been “more important than in almost any other space.” Working with organizations like RDR is also particularly useful for those in the investor community who are working across different ESG issues, as Michela Gregory, Director of ESG Services at NEI Investments, added. Much of this work has been facilitated through RDR’s close working relationship with the Investor Alliance for Human Rights (The Investor Alliance).

RDR and the Investor Alliance for Human Rights Join Forces

The Investor Alliance was formed in 2018 as an initiative of the Interfaith Center on Corporate Responsibility (ICCR) to augment the number and capacity of global investors engaged on business and human rights concerns. The Investor Alliance’s work is centered on the UN Guiding Principles on Business and Human Rights, the same set of international principles that guides RDR’s work. ICCR’s genesis in the early 1970s came in response to calls for religious and faith-based investors to divest from South Africa to pressure the government to abolish apartheid. Like her other colleagues in the responsible investment space, the Investor Alliance’s Director, Anita Dorett, found that, initially, most investors, and the businesses they engaged with, had a narrow view of human rights risk, generally focused on supply chain concerns. The Investor Alliance’s decision to focus on human rights risks in the tech sector represented an important shift.

Meanwhile, RDR’s focus on digital rights and its alignment with the same international guiding principles made the two natural allies, Anita explained. In addition, she said, “we want to ensure all of our engagements are research-based and data-driven; comparative data is really important. So RDR was kind of an obvious choice for us.” RDR’s value-added was clear as soon as Anita started engaging with Founder Rebecca MacKinnon, who, she said, “poured her attention and her focus on investors utilizing the RDR data and really rallied around collaborating with us, understanding that the critical users of this data will be investors.”

Though RDR continued to speak to investor needs over the years, including through successive investor updates, it was the release of the Investor Alliance’s “Investor Statement on Corporate Accountability for Digital Rights” in 2021 that truly cemented RDR’s key role in helping to galvanize shareholder proposals around human rights concerns in the digital sphere. The statement, signed by 176 investors representing over US$9.2 trillion in investments, outlined the need for companies to adapt to “investor expectations in line with evaluations and recommendations of the 2020 Ranking Digital Rights Corporate Accountability Index,” in particular around privacy and freedom of expression.

The Investor Alliance convenes and helps coordinate the collective work of a diverse group of investors. As its Director Anita explained, for this to be successful, “you need everybody on the same page sharing a common set of investor expectations.” Therefore, “the investor statement represents the articulation of investors’ expectations, based on the data RDR provides, and using RDR’s recommendations, with RDR’s expertise and analysis, to hold companies to account or to drive companies to fill in the gaps in their digital rights performance.” The decision by signatories to align their expectations around RDR’s work didn’t come as a surprise to Mercy Investment Services’ Kuykendal, who added that “familiarity and trust with RDR among the investor community made it easier for many to sign onto the statement.”

The statement also represented the culmination of growing investor interest in the potential digital risks presented by the tech sector. A first iteration of it, in 2019, garnered just under 50 signatories. But by 2021, interest in tech issues had increased significantly, Anita explained. During this time, shareholder proposals had been put forward for the first time at tech companies including Apple, Amazon, and Facebook (Meta), demanding everything from human rights policies to dedicated governance structures. And these helped to further grow awareness, even among investors who voted against them. Unsurprisingly, according to Anita, today “every time we speak to a new investor, they want to talk about tech.”

Lydia Kuykendal recalled that, before 2021, Mercy had done little in the tech space; most of its growing body of work in this space has indeed come through its affiliation with the Alliance. For NEI’s Michela Gregory, the statement has served as an important launchpad for the engagement and dialogues with companies that have come since. The Investor Alliance’s Digital Rights Engagement Initiative continues to coordinate outreach to RDR’s ranked companies, by the statement’s many signatories, which include NEI and Mercy Investments.

RDR Begins Supporting Key Proposals, Including at Meta

While RDR was first cited in a proxy resolution in 2020, we began directly supporting the crafting of such proposals in 2021. At Meta, for the second time running, shareholders recently voted on one of the most consequential RDR-supported proposals, calling for a human rights impact assessment of the company’s targeted-advertising policies and practices. It has been one of the most successful in the company’s history, earning a strong majority of support from independent shareholders (those who are not founders/controlling shareholders). As we noted ahead of the vote on the original proposal in 2022, human rights impact assessments are essential for any company that is part of the “targeting ecosystem.” This is especially true of a company like Meta, which then accounted for more than a quarter of all U.S. digital ad spending.

The Meta proposal, which RDR helped prepare, was filed by Mercy Investments and co-filed by NEI Investments. “I don’t think I could have done it without RDR,” Lydia, who was the lead filer, explained. For her, a lot of RDR’s value-added has come from “tracking the legislation in the U.S., the EU, in Japan. I don’t have the capacity to do that. I don’t know other organizations that are particularly good at that.” She uses RDR’s data to track regulatory risks to companies like Meta for exempt solicitations—where shareholders are able to make a longer case for their resolution, and respond to company opposition—as well as to present these regulatory risks to investment giants like BlackRock and Vanguard, in the hope of attracting their large trove of investor votes.

Lydia recognizes the impact of the Meta proposal, which received “the second highest support apart from dual-class share voting.” She has noted, however, that, as long as multi-class share structures remain, “we’re never going to go anywhere.” These share structures give founders of companies inflated voting power at annual general meetings, and play a big role in limiting the success of human rights-based proposals. At Meta, CEO Mark Zuckerberg holds 61% of voting power, meaning he could single-handedly vote down any proposal. For this reason, Lydia is in “favor of investors really examining strategies to focus on a single issue, which is eliminating the dual-class share structure.”

And this is why RDR, alongside its support for individual proposals, has also been at the forefront of efforts to break down dual-class share structures. In 2022, RDR sent a letter to the U.S. Securities and Exchange Commission (SEC), signed by 20 other human and civil rights groups, urging an end to such structures, while pushing lawmakers to take action. Moving forward, RDR will continue to support shareholders in crafting proposals that put human rights at the forefront of company policy and practice while also advocating for governance structures that ensure investors are finally given a fair voice at the table.


When RDR’s first Corporate Accountability Index was released in 2015, grading 16 tech and telecom companies on their respect for privacy and freedom of expression, it was the first of its kind to rank companies specifically on their impact on digital rights. Unsurprisingly, it took the policy world by storm, including at international forums like the UN. Several U.S. media outlets also took an interest in what the Index revealed about the activities of large global platforms and telcos like Google and AT&T. Meanwhile, in the Majority World, many civil society organizations were taking note, instead, of the potential for these standards to hold local and subsidiary companies accountable closer to home.

The first adaptations of RDR’s methodology to study these local contexts globally began in 2018, when Arab region digital rights organization SMEX used the RDR methodology in a report on the state of digital rights for mobile users within Arab states. That same year, an adaptation was conducted examining mobile operators in Russia, while Internet Without Borders produced a report looking at the performance of large international telecom subsidiaries across Africa. In many cases since, adaptations have been carried out in countries or localities with little existing corporate accountability, particularly for the tech sector. Oftentimes, they have been used in precarious socio-political contexts.

In response to the enormous potential of this work, RDR began providing direct support to organizations worldwide in 2021. As Leandro Ucciferri, RDR’s Global Partnerships Manager, put it, “these projects are putting new companies under the spotlight, which have not received enough scrutiny from the digital rights community.” Since this work began, we’ve provided support to adaptations covering 35 countries and 127 companies. Though adaptations have taken place under various grants, many recent projects have been conducted under the auspices of the Greater Internet Freedom (GIF) project. These include recent, successful projects across Africa and in Central Asia (another report was recently released covering Southeast Europe), both of which we’ll highlight below.

First Evaluations of the Central Asian and African ICT Sectors: Major Gaps in Human Rights

Internet shutdowns, executed through the telecom sector, are rampant in both of these regions. As we’ve recently detailed, politically motivated shutdowns have taken place frequently in Central Asia over recent years. In Kazakhstan, shutdowns were executed in 2022, after the outbreak of mass protests, following on the heels of previous shutdowns in both 2020 and 2021. In addition, in both Uzbekistan and Kyrgyzstan, laws have been introduced aimed at imposing online censorship. In Africa, shutdowns have been weaponized recently in Tanzania and Zimbabwe, among other examples.

It is within this context of poor digital rights protections that RDR recently partnered with GIF and local partner organizations, in both Central and Southern Africa as well as in Central Asia, to establish a baseline for corporate accountability. According to Mavzuna Abdurakhmanova, GIF’s Central Asia Digital Rights Consultant, her region’s report was notable for being the first of its kind. “From the civil society perspective, no one questioned the private sector on the protection of the digital rights of their users,” she explained. “Nobody was even thinking about asking questions about human rights to the business sector.” And yet company transparency and accountability are particularly important in such countries, where weak human rights protections alongside fragile democracies often lead to the participation of telcos and other ICT service suppliers in infringing on the basic rights of users.

In Central Asia, the report, entitled “Ranking Digital Rights in Central Asia,” was produced by Tajikistan-based Civil Internet Policy Initiative (CIPI) and looked at digital rights across three sectors: telecom, e-commerce, and fintech. The companies evaluated were located in four countries: Kazakhstan, Tajikistan, Uzbekistan, and Kyrgyzstan.

A growing number of GIF reports have been released recently covering Central, East, and Southern Africa. In 2022, RDR supported local partner Paradigm Initiative (PIN) in the creation of their report “Ranking Digital Rights in Angola, Democratic Republic of Congo and Central African Republic” as well as Internet Freedom Lesotho’s report on “Digital Rights in Lesotho.” Paradigm Initiative focused on three telecommunications companies, one for each country they covered. The Lesotho report, meanwhile, covered four companies: two telcos and two financial companies. Building on these successes, a new report covers countries in East and Southern Africa: Uganda, Tanzania, Zimbabwe, and Zambia, examining the policies of top telecom operators.

The findings in these reports have been stark: Researchers discovered that companies will routinely point to government requests to excuse the high number of internet shutdowns they comply with. When she began working on this project, Wakesho Kililo, who leads GIF’s Africa work, wondered whether companies would actually have any policies in place to handle such demands. Unsurprisingly, she found that such policies were frequently missing. Transparency about potential actions taken in responding to censorship demands was also limited. Mavzuna recalled that almost all companies evaluated by the Central Asia report were responding to government requests and providing personal information of users, but there was neither a publicly available policy about how they responded to these, nor general data provided on the number of requests received or responded to.

Across the board, the gaps in company policies were clear and gaping. According to Mavzuna, none of the companies covered by the research publish annual reports on their websites, and none publish information about their governance structures. Wakesho, meanwhile, noted that Terms of Use are also rarely comprehensive. In fact, her research across Africa showed that a majority of companies there were failing to translate Terms of Use into local languages. In addition, many of the companies Wakesho helped evaluate have privacy policies that are sorely lacking, and sometimes non-existent. This was true, for example, of NetOne Zimbabwe, which has no privacy policy for any of its services investigated for the report. Wakesho remembered thinking, as she completed her research, that “users’ digital rights are being abused. They’re not being protected, either at all or at the rate they should be.” She added, “When a telco doesn’t even have a privacy policy, what are they doing with user data? We don’t know.”

Another pattern of note has appeared across RDR-supported adaptations, and was also evident in both of these regions: Parent companies based in Western Europe often offer more robust human rights policies to their clients at home than they do to the users of their subsidiary companies abroad. GIF Central Asia’s Mavzuna explained that, when they conducted a comparative data analysis, many good policies and practices of parent companies were notably absent in those of their subsidiaries. “Why didn’t you take those good policies and good practices from European parent companies and adapt them to our local context?” Mavzuna wondered. (For more on this discrepancy, please check out our essay from the 2022 Telco Giants Scorecard, “The disconnect between HQ and local subsidiaries results in less transparency and protection.”)

A First Experience with Company Engagement: Is Change Really Possible?

Following the completion of their reports, Mavzuna and Wakesho were determined to engage with companies—as RDR does following the release of our Scorecards—with the hope of influencing new company policies. In all cases, many of the largest telecom companies simply ignored them. However, smaller companies (and a couple of larger telcos) expressed interest in improving their policies based on the findings of the reports.

For example, Mavzuna was able to attend a meeting with SMEs (small to medium-sized enterprises) in Tajikistan where the Central Asia report findings were presented for the first time. There, a representative for Alif Bank, a fintech company, expressed appreciation for the report and the important gaps it uncovered, which he promised to address. Though she didn’t engage directly in Uzbekistan, her local partner in that country reported a positive reception; many within that country’s private sector are aware of a positive correlation between improved human rights respect and investment from Western countries.

Meanwhile, in Lesotho, the report’s researcher, Nthabiseng Pule, was able to meet directly with representatives from Vodacom Lesotho and Vodafone. Not only that, but Vodacom Lesotho also heeded some of the report’s top recommendations. This included hiring a language expert to translate the company’s terms and conditions into Sesotho, Lesotho’s main language. In addition, the company agreed to create a privacy portal, where one can find all information relating to user privacy. This represented an important win and first step for the organization. Paradigm Initiative was also able to meet with France-based Orange, a telco with subsidiaries across the African continent. Finally, GIF, with support from the Global Network Initiative, was also able to meet with Vodacom Tanzania.

Engagement also extended beyond companies. In Lesotho, a policy brief was sent to the country’s regulator, the Lesotho Communications Authority, noting the report’s main findings. Meanwhile, the report also opened many eyes in civil society. One organization, Internet Society Lesotho, began a campaign around the right to privacy, based on the report. Wakesho also highlighted engagement at international forums, including the Forum on Internet Freedom (FIF) Africa, the Digital Rights and Inclusion Forum, and RightsCon 2023, where she got to share the reports’ findings with civil society as well as with company representatives, some of which even requested their companies be evaluated in the future.

According to Mavzuna, she and her colleagues are now feeling “more confident” after receiving positive feedback from the private sector. Mavzuna hopes to continue engaging with companies interested in implementing more robust human rights protection. Indeed, she believes this is just the beginning for this kind of engagement. Although there was a lack of willingness from larger companies to engage during this first round, she maintains hope based on the interest she’s witnessed mostly from smaller companies, including from the region’s nascent e-commerce sector, where companies are mostly younger and more dynamic.

Despite initial successes, Wakesho believes that “we need more of this work. There’s more to be done. We need the regulators to be aware… We need to call the companies to account.” Both Wakesho and Mavzuna believe this must involve eventually getting the larger telcos to the table and holding them accountable for the same level of human rights protections they offer users in their home country. And RDR hopes to continue supporting them in that task.

As RDR’s Leandro Ucciferri has explained, “By making it easier to use our methods and standards, we aim to grow the community of corporate accountability advocates that are bringing these conversations to new countries and regions outside the Silicon Valley and Brussels bubbles.” Indeed, we’ve already doubled down on our commitment to support this movement globally, including through last year’s launch of the RDR Research Lab, a hub for digital rights researchers and experts across the world. This represents, however, just one part of our commitment to ensuring the full democratization of the global tech accountability movement, as we continue to help global allies hold the ICT sector to account for the rights of all users, everywhere.

Image created with MidJourney

Soon Ranking Digital Rights will release the Generative AI Accountability Scorecard, evaluating major consumer-facing generative AI services’ respect for the human rights to privacy, non-discrimination, freedom of expression, and freedom of information. Today, we are sharing a consultation draft of the indicators we will use to evaluate companies for the GAIA Scorecard. But, to ensure they are credible and effective, we need your help! If you’re an expert, or if you have knowledge of these technologies and their related risks to human rights, we invite you to participate in our fast and flexible feedback process. All participants will be credited in the final report unless they prefer to remain anonymous.

Read the draft indicators and give feedback!

The draft indicators are based on preliminary standards we shared along with a detailed rationale and Q&A about the project, in June 2023. The project will use RDR’s ten years of experience ranking tech companies to help address the human rights risks of generative AI, including “turbocharged information manipulation,” bias, non-consensual pornography, fraud, and incentives for continued privacy violations.

Whether or not you wish to provide feedback on the standards, we invite experts and stakeholders to join RDR’s discussion mailing list about civil society and academic projects to evaluate the policies and transparency of generative AI services. To join, send a message request to methodology@rankingdigitalrights.org.


Photo via MidJourney

Despite Central Asia’s political turmoil and fragile democratic institutions, its digital services sector has grown steadily in recent years, a trend accelerated during the COVID-19 pandemic. With countries across the region continuing to introduce programs focused on digitization—from work-from-home schemes to e-shops and e-pharmacies—the time was ripe to establish the region’s first baseline for corporate accountability of the growing tech sector. Tracking the way new digital products and services are affecting existing power dynamics between individuals, companies, and government bodies in this region has become more important than ever.

As we’ve noted in the past, government-imposed shutdowns and censorship are some of the most nefarious ways that the ICT sector is deployed by governments to negate user rights. This has happened, in recent years, across this region. For example, in January 2022, protests broke out in Kazakhstan due to rising gas prices. Protestors, who the government accused of attempting a coup, were met with violent repression from law enforcement authorities and armed groups, resulting in over 200 deaths. Access to the internet was also quickly restricted, as the latest Freedom on the Net report recently detailed. This was not the first time that such measures had been taken by the Kazakh government, as mobile internet access had previously been shut down during smaller, local protests in 2020 and 2021.

A similar tendency toward restricting freedom of expression online has been well documented in Uzbekistan, where social media platforms and messaging apps were blocked in both July and November 2021 following the introduction of the amended Law on Personal Data, which imposed data localization requirements on online services. Platforms deemed non-compliant, including Twitter and TikTok, were blocked. Meanwhile, in Kyrgyzstan, the Law on Protection from False Information, passed in July 2021, granted a government body the power to demand that websites and social media platforms delete content within 24 hours.

These are just a handful of examples that illustrate how autocratic-leaning governments in the region are creating new challenges to freedom online that call for the explicit safeguarding of users’ rights.

Adapting the RDR Methodology to Central Asia

It is within this context that the Public Fund Civil Internet Policy Initiative (CIPI) chose to carry out research and publish their new report, “Ranking Digital Rights in Central Asia,” using the RDR Corporate Accountability methodology. The report looks at how and whether technology companies in Central Asia have committed to respecting human rights and protecting their users.

We partnered with CIPI to guide them through the process of adapting our methodology to carry out this important new research. Although previous adaptations have been carried out in neighboring countries, such as Russia, this project marked the first time that the methodology was used to evaluate companies in Kazakhstan, Tajikistan, Uzbekistan, and Kyrgyzstan.

CIPI focused on three key sectors: telecommunications, e-commerce, and financial technologies (fintechs). They evaluated a total of 16 companies. In each country, these included the two most popular telecom providers, as well as one e-commerce platform and one fintech. This list also includes a mix of fully local services and larger subsidiaries of international companies. This allowed CIPI to compare whether subsidiaries or local companies were more responsive to human rights concerns. For example, CIPI evaluated three subsidiaries of Beeline, owned by VEON, a Dutch connectivity operator. In the e-commerce sector, they included two subsidiaries of OLX Group, owned by Prosus, also based in the Netherlands.

CIPI used a selection of 22 indicators from our methodology, covering each of our areas of focus: governance, freedom of expression, and privacy. Their findings show clear gaps in companies’ disclosures across the board. However, we’ll spotlight two areas of particular concern. First, as network shutdowns continue to present an imminent threat to human rights across Central Asia, companies need to improve their transparency about how they respond to shutdown demands and how those disruptions affect their users. Second, the report highlights the disparity between the human rights commitments large parent companies provide to their users at home (oftentimes in Western Europe) and those provided by their subsidiaries in the region; this is a gap that needs to be bridged.

From Shutdowns to Subsidiaries: Tracking Telcos’ Biggest Human Rights Gaps

As detailed above, the region is no stranger to government-imposed network disruptions, and all the companies evaluated have a track record of filtering websites or even shutting down access entirely in response to government demands.

Despite this fact, CIPI found that none of the telecom companies they evaluated disclosed their processes for responding to government demands to restrict or shut down access to the internet. Users are seldom informed about these shutdowns before they take place. When they are notified, it is usually after the fact and indirectly, via social media posts and the company’s website.

For example, when the internet was disrupted during the previously mentioned protests in Kazakhstan, Beeline Kazakhstan posted a brief apology in the news section of its site, referring to circumstances beyond its control and offering financial compensation. In Tajikistan, following an internet shutdown that stretched from November 2021 to March 2022 amid political tensions, Tcell, a mobile company owned by the Aga Khan Fund for Economic Development, posted a brief note on its website attributing the suspension of services to weather conditions. According to CIPI’s report, telcos in Uzbekistan and Kyrgyzstan have offered similar explanations for disruptions.

Post-COVID uptake of digital services has been particularly notable for the region’s nascent e-commerce sector. E-commerce remains a young, albeit growing, market in Central Asia, and one that has garnered a great deal of interest from international development agencies and institutions. Uzbekistan and Tajikistan have both introduced specialized programs to foster the development of the sector.

CIPI evaluated two subsidiaries of OLX Group—an online marketplace where people can buy and sell cars and household goods and find housing and jobs—in Kazakhstan and Uzbekistan. At the parent level, though the company doesn’t directly mention freedom of expression, it has clear human rights policies and commitments in place, and even references the UN Guiding Principles. However, the local websites of OLX Kazakhstan and Uzbekistan fail to uphold these same commitments.

The two other local e-commerce platforms evaluated, Lalafo in Kyrgyzstan and Somon.tj in Tajikistan, on the other hand, appear to have no company-wide human rights policy whatsoever. The OLX subsidiaries also performed better in disclosing their data protection practices, including what data is collected, how it’s shared and with whom, and the purposes behind the data processing.

CIPI’s report urges all 16 companies to take concrete steps to publish clear and strong commitments to respect human rights and to enhance transparency relating to their oversight mechanisms and corporate structures. They could achieve this, among other ways, by publishing annual transparency reports. CIPI also calls on subsidiaries of large international companies to adopt their parent company’s best practices at the local level, including codes of corporate ethics.

Of the three industries CIPI focused on, e-commerce companies showed glaring gaps in human rights policies, yet this young and dynamic market also showed the greatest interest in open communication with CIPI about how to improve their policies and deepen their collaboration with civil society. The report’s authors hope that larger companies may eventually follow suit. Ideally, these comparisons can spark further competition toward stronger human rights safeguards, with companies that follow internationally recognized good practices eventually leading the industry.

We invite you to read CIPI’s report in full, where you’ll find detailed explanations of the findings for each indicator and company. If you’re interested in adapting the RDR Corporate Accountability methodology and designing your own research project, you will find all the necessary information in our Research Lab, our free and open hub for the guidance and tutorials you need to launch your own RDR-style research project. If you’re interested in collaborating with us directly, or have any further questions, you can reach us at partnerships@rankingdigitalrights.org.