

Photo created by Bricklay, via Noun Project.

On November 21, RDR submitted comments to the Federal Trade Commission (FTC) in response to its Announcement of Proposed Rulemaking (ANPR) on commercial surveillance and data security.

As we note in the submission, we commend the FTC for its thoughtful consideration of the problems associated with commercial surveillance, including surveillance advertising and data security, and welcome the opportunity to respond to the ANPR. The concerns raised by the FTC have important implications for privacy, freedom of expression, the right to non-discrimination, and the enjoyment of other fundamental rights. In the absence of robust private or public mechanisms for corporate accountability, the harms stemming from commercial surveillance practices are simultaneously less visible than they should be and increasingly dangerous and difficult to address. We conclude our comment with a set of recommendations for the Commission to consider in its future rulemaking proceedings, which you can find below.

In our comment, we highlight the myriad harms of commercial surveillance, while urging the Commission to use its authority to regulate it and ultimately abolish surveillance advertising. While doing so, the Commission must also recognize that the path ahead is fraught with political and legal uncertainty. Not least among these uncertainties is the future of the American Data and Privacy Protection Act (ADPPA). The Commission should not let the perfect be the enemy of the good, nor should it be unnecessarily timid in its ambition to protect consumers.

More specifically, our recommendations include:

  • That the FTC regulate commercial surveillance as an unfair trade practice, de facto banning surveillance advertising. 
  • Moving beyond “notice and consent” frameworks, which put the onus on consumers to read and understand complex privacy policies and legal notices. 
  • Establishing standards for data minimization and purpose limitation through a Section 5 Unfairness Rulemaking. 
  • Specifying permissible purposes for data collection, use, and sharing. Our recommendation on this item is largely congruent with what’s proposed in the ADPPA, while also urging the FTC to prohibit all surveillance-based targeted advertising. 
  • Finally, RDR recommends that the Commission obligate companies to disclose their data practices to the FTC and to the public, as well as to submit to regular audits. 

Read RDR’s full submission to the FTC on Commercial Surveillance and Data Security.


It’s been less than one month since Twitter officially became the plaything of billionaire Elon Musk. In that time, the company that topped our 2022 Scorecard has gone a long way toward unraveling much of the work that recently earned it our top spot.

In other words, Musk has already confirmed our worst fears about his management of the platform. A decade ago, many in the top echelons of Twitter saw the platform as the “free speech wing of the free speech party.” Since then, however, the company made a concerted effort to improve both its enforcement of rules and its transparency. Much of that now seems out the window. For one, Musk has gotten rid of the entire human rights team and other experts on international human rights standards. Twitter, like other platforms, had a practice of pushing back against government actors’ calls for violence and their attempts to block content. Its ability to do both is now severely curtailed. Musk also introduced an $8 fee for “verification” that many worried could imperil the work of journalists and activists. The new feature was paused on Friday after an immediate proliferation of impersonations of high-profile accounts.

On the platform in question, RDR’s Jan Rydzak has a must-read thread on these and other ways Musk is already turning Twitter into a paragon of “pay-for-play anarchy.”

Is a boycott from advertisers, Twitter’s real customers, now the only way to hold the “Chief Twit” accountable for respecting democracy and human rights? This is the question RDR’s Policy Director Nathalie Maréchal asks in a piece for our home institution, New America.

Unfortunately, that may just be the case. That’s why RDR has joined more than 60 other civil society groups in calling on Twitter advertisers to demand that Musk #StopToxicTwitter.

But if advertisers have that kind of power, it’s only because the system is rotten. At the heart of the problem, Nathalie points out, is that a profit-seeking entity dependent on advertising revenues can never be a truly democratic digital public square. This is true whether Musk decides to run Twitter with profit in mind or according to his own murky concept of “freedom,” in other words his own personal whims. (We’ve already seen that parodies of Musk seemingly aren’t protected by his version of the First Amendment.)

Social media companies are thus “trying to square an impossible circle”: provide what billions of people have come to see as an essential public service while delivering returns to shareholders. And taking Twitter private won’t free the company of this problem: Repaying the banks that underwrote the sale could cost up to $1 billion a year, The New York Times has reported.

As Nathalie explains, and as RDR has pointed out time and time again, any business that relies on pervasive and sustained human rights violations will only foster more abuse. And this is very much the case with surveillance advertising. Algorithms optimized to keep users clicking as much as possible, and thus seeing as many ads as possible, necessarily lead to a major decline in the quality of our information. Yet moderating content at scale in a responsible way is extraordinarily expensive, and the money has to come from somewhere.

For this reason, RDR will be closely watching the seeming mass migration from Twitter to Mastodon. Might Musk’s disastrous first days at the helm of Twitter be the spark that pushes us to seek out new, more democratized and decentralized communications systems? And if so, what new business models will emerge to fund them? Only time will tell.

For now, read more from Nathalie about “The Dangers of Elon Musk’s Twitter Takeover and a For-Profit Digital Public Square.” →


Red Card on Digital Rights at the 2022 World Cup

As the 2022 FIFA World Cup approaches, RDR has teamed up with Arab digital rights NGO Social Media Exchange (SMEX) to launch a three-part series, “Red Card on Digital Rights.” The series investigates the state of the internet and digital surveillance in host country Qatar, amid a slew of criticism over the country’s human rights record.

Read Part One of “Red Card on Digital Rights.” →

See also how SMEX adapted RDR’s methodology to analyze the policies of the Hayya app, which is required to attend the games, and the risks it poses to users’ privacy.

Read more from SMEX on how the Hayya app falls short in protecting user privacy.


The RDR Research Lab Is Finally Here! How We’re Helping Grow the Global Tech Accountability Movement

For years, civil society organizations across the world have been adapting RDR’s standards to highlight how a lack of platform accountability affects digital and human rights in their home countries.

In October, RDR launched a new online learning hub for digital rights researchers and advocates who wish to join them and launch their own project to keep digital platforms, telecommunication companies, and other digital service companies accountable to users and to the public, anywhere in the world.

The site guides researchers through the process of designing, executing, and promoting research on platform accountability using RDR’s methodology and standards. This guide is based on our experience producing the RDR Corporate Accountability Index and Big Tech and Telco Giants Scorecards, as well as on feedback from civil society partners who have published their own RDR-style reports.

Read more from RDR’s Global Partnerships Manager Leandro Ucciferri about the Lab.


How RDR’s Standards Are Being Used in Some of the Most Precarious Spots for Digital Rights


Photo by Dying Regime via CC 2.0

This year, in South and Southeast Asia, EngageMedia and other local digital rights organizations worked with RDR to investigate the policies of local telcos, as well as the subsidiaries of telco giants like Orange, within the context of the digital security issues faced by human rights defenders in the region.

The dramatic growth in the use and availability of mobile broadband across Asia has meant unprecedented access to new tools like email, messaging apps, and social media for the region’s human rights activists. But this has also resulted in the growth of online attacks to intimidate those fighting injustice, including female journalists, indigenous youth, and LGBTQ activists.

With governments around the world failing in their duty of protection, corporate accountability has become an increasingly important tool for civil society actors looking to enhance digital rights.

Read more about how RDR’s standards are helping defend human rights activists in South and Southeast Asia →

Also, find out how researchers and advocates in Lesotho, the Democratic Republic of Congo, Angola, and the Central African Republic are using our standards to hold telcos accountable for privacy and other essential human rights. →


Digital Rights Dialogues: Hear Directly From People Holding Platforms Accountable for Human Rights

Our research lab features interviews with the advocates and researchers already using RDR’s standards to keep platforms accountable, including in some of the most critical spots for digital rights.

In September, protests erupted across Iran following the death of 22-year-old Mahsa Amini, who died in police custody after being arrested by the country’s “morality police” for “improperly” wearing her hijab. The mass mobilization was sparked over social media, where news of her death spread rapidly. Yet, in response to ongoing protests, online organizing has been met with internet shutdowns and app outages.

We recently published an interview with Roya Pakzad and Melody Kazemi, of Taraaz and Filterwatch, whose 2020 report evaluated the policies of four popular local and two international (WhatsApp and Telegram) messaging apps in Iran.

In this conversation, they discuss the longstanding use of government shutdowns, the Iranian government’s efforts to push citizens onto government-controlled apps, the proposed “User Protection Bill” that threatens to further block access to social media and the web, and the importance of the country’s corporate accountability movement.

Read more from Roya and Melody about the state of digital rights in Iran. →

And, check out our interview with Jenni Olson from GLAAD about how the LGBTQ rights org used RDR’s standards to keep Big Tech accountable for online hate. →


RDR Media Hits

The Washington Post: The Technology 202 covered Fight for the Future’s “Make DMs Safe” campaign calling on tech platforms to implement end-to-end encryption by default, which RDR joined. The article quotes RDR Policy Director Nathalie Maréchal: “I think most people don’t understand that if they communicate through, say, Facebook Messenger … that it’s not actually private.”

Read More at The Washington Post 


Consumer Reports: RDR Policy Director Nathalie Maréchal was quoted in a new report from Consumer Reports on how Facebook ads target vulnerable users with harmful supplements: “Facebook should police ads far more strictly to keep potentially harmful information or products off its platform.”

Read More at Consumer Reports

 


Recent Events

Global Voices | Can citizens of democracies still trust the law?

RDR Program Manager Vezsna Wessenauer joined a Global Voices panel to discuss how governments are increasingly using the law to infringe on citizens’ digital rights.

Watch the panel

 


Save the Dates!

December 5: Save the Date for the First Edition of RDR’s Telco Giants Scorecard!

December 7: Save the Date for our TGS Launch Event!


Support Ranking Digital Rights!

If you’re reading this, you probably know all too well how tech companies wield unprecedented power in the digital age. RDR helps hold them accountable for their obligations to protect and respect their users’ rights.

As a nonprofit initiative that receives no corporate funding, we need your support. Do your part to help keep tech power in check and make a donation. Thank you!

Donate

Subscribe to get your own copy.

**FOR IMMEDIATE RELEASE**

October 11, 2022

Contact: comms@rankingdigitalrights.org

Ranking Digital Rights Launches New Research Lab to Help Global Civil Society Groups Hold Big Tech Accountable 

Washington, D.C. – Today, Ranking Digital Rights (RDR) is launching the RDR Research Lab, which will serve as a learning hub for researchers and advocates across the globe. It provides the tools necessary to implement and adapt RDR’s methodology and human rights standards in various local contexts to help keep digital platforms, telecommunication companies, and other digital service companies accountable for safeguarding the human and civil rights of users.

Since 2013, the RDR methodology has served as the gold standard for keeping Big Tech power in check. The RDR Corporate Accountability Index, including the Big Tech and Telco Giants scorecards, is the only open dataset on companies’ commitments and policies affecting users’ rights. Since this work began, many companies we rank have made significant improvements in their adherence to human rights principles and transparency. But, with growing threats to these rights stemming from disinformation, surveillance advertising, network shutdowns, and internet censorship, among others, much work remains. These threats are amplified in the majority world where, for too long, tech companies have taken advantage of information asymmetries to further neglect user rights, resulting in well-known instances of violent unrest and human rights violations.

Civil society organizations around the world have been inspired by RDR and have used our standards to push technology companies toward greater respect and protection of people’s rights to both privacy and freedom of expression. Between 2016 and 2021, nine research reports were published that adapted RDR’s methodology in Pakistan, India, Kenya, Senegal, Russia, the Arab region, New York City, Iran, and Ukraine. 

In 2021, RDR began providing direct guidance and technical assistance to civil society organizations. Since then, new research has been published, with RDR’s direct guidance, in Lesotho, Angola, the Democratic Republic of Congo, Central African Republic, Cambodia, Indonesia, Maldives, Nepal, Philippines, and Sri Lanka. With the support of the Research Lab, RDR will expand the network of researchers and advocates using RDR’s methodology to hold tech power to account.

 

RDR Global Partnerships Manager Leandro Ucciferri says: 

New apps and digital services are entering the market at unprecedented rates, thanks both to local tech companies and global monopolies like Amazon and Meta. These services have permeated many facets of our lives, but as they become increasingly intertwined with the way we interact with society around us, a lack of accountability means that they also pose serious threats to our rights. The collection of troves of our personal data by companies with opaque, if any, policies for safeguarding it should worry us all. This trend jeopardizes everything from our right to reproductive health care to the integrity of our electoral systems. And it is that much worse outside of the regulatory environments of Europe and the United States, where even greater negligence of human rights standards has resulted in gross violations.

It is therefore imperative that we bring increased scrutiny to as many corners of the tech industry as possible. RDR has provided, and will continue to provide, the standards needed to measure whether tech company policies respect the human rights of their users. In the years to come, the Research Lab will therefore have a key role to play in helping to grow a successful global tech accountability movement.  

 

Media contact: comms@rankingdigitalrights.org

Ranking Digital Rights is an independent tech research and policy program at New America in Washington, D.C. RDR ranks leading tech and telecom companies on their publicly disclosed policies and practices affecting users’ freedom of expression and privacy. 

Learn more about Ranking Digital Rights:

Our website: http://rankingdigitalrights.org 

Twitter: https://twitter.com/rankingrights/ 


Today we formally launch the RDR Research Lab, a new resource that guides digital rights researchers through the process of designing, executing, and promoting research on platform accountability anywhere in the world, using RDR’s methodology and standards. This guide is based on our experience producing the RDR Corporate Accountability Index and Big Tech and Telco Giants Scorecards, as well as on feedback from civil society partners who have published their own RDR-style reports.

Civil society organizations around the world have been inspired by RDR’s mission, using our open methodology and standards to push technology companies toward greater respect for, and protection of, people’s rights to privacy and freedom of expression. Between 2016 and 2021, nine research reports were published around the globe that adapted our methodology to evaluate the policies of tech companies and their potential impact on human rights in local environments. These included studies on Pakistan, India, Kenya, Senegal, Russia, the Arab region, New York City, Iran, and Ukraine. 

Most of these efforts were undertaken independently, with little to no direct support from RDR. But in 2021 we were awarded two grants that enabled us to provide direct guidance and technical assistance to civil society organizations around the world, and particularly in the global south. Our goals are both to expand the network of researchers and advocates using RDR’s methodology to hold tech power to account and, through their research, to help equalize an information asymmetry that has allowed tech companies to pay less attention to their platforms’ human rights risks in the majority world than at home. 

Still, even with added resources, we can’t be everywhere at once, so we created the Research Lab to explain our research process and approach and make it easier for new researchers to get involved. The Lab consists of four sections—Prepare, Collect, Analyze, and Apply—each of which offers guidance for a specific part of the research process. We describe the Lab in more detail below.

To develop the Lab, we tried to better understand how people interact with our methodology, looked at potential roadblocks in evaluating a range of tech companies and digital services across the globe, and considered how researchers could make the most of our standards. So far, we have helped guide several local digital rights organizations to publish new research in Lesotho, Angola, the Democratic Republic of Congo, and the Central African Republic, as well as in Cambodia, Indonesia, the Maldives, Nepal, the Philippines, and Sri Lanka. Coming up are adaptations from across Eastern Europe, Southern Africa, and South America.


Diving Into the Research Lab

When you access the RDR Research Lab landing page, you will see buttons for four sections, dedicated to the distinct stages of a project:

  1. Prepare (your research)
  2. Collect (your data)
  3. Analyze (your data)
  4. Apply (your research)

In the Prepare section, researchers can read about the fundamentals of writing a project brief and learn how to structure a research project based on our standards. The section also provides templates for carrying out a risk assessment and a jurisdictional analysis of the targeted region or country, along with other tools for complementing the research and policy analysis, illustrated with examples drawn from RDR’s indicators.

In the Collect section, you’ll find the research guidance needed to directly evaluate companies, based on our 58 indicators. This guidance is available in English, French, Spanish, Arabic, Russian, and Portuguese. We’ve also included recommendations for software tools that researchers can use to improve their workflow. In both the Prepare and Collect sections, we have included useful checklists to help you keep track of what tasks you need to complete at each stage of the process.

In the Analyze section, you’ll find suggested approaches for studying the data collected and creating narratives that highlight your research findings. You will also find recommendations for data visualization tools, along with a tutorial for learning how to create your first charts.

Finally, in the Apply section, we provide tips and best practices that will help you strategically engage with tech companies, as well as ideas for potential advocacy actions targeting other stakeholders, including policymakers and regulators.

Through our recent partnerships, RDR has helped organizations establish baselines for tech accountability in countries where the industry had, in many cases, so far dodged real scrutiny. Whether through the examination of new types of companies or by evaluating the local subsidiaries of corporations already ranked in our Big Tech and Telco Giants scorecards, we hope to continue seeing greater scrutiny of the tech industry globally. And we believe the RDR Research Lab will play a key role in facilitating that work. 

We want to hear from you! If you have ideas or feedback about the materials in the Research Lab, or if you’re interested in carrying out your own research using our methods and standards, write to us at partnerships@rankingdigitalrights.org.


A woman looks at an inscription on the wall indicating an uncensored Internet spot, 2016. 


In this 2021 conversation, RDR’s Global Partnerships Manager Leandro Ucciferri and RDR’s former Communications Officer Aliya Bhatia spoke with Roya Pakzad and Melody Kazemi, authors of the joint Filterwatch and Taraaz report “Digital Rights & Tech Sector Accountability in Iran,” which used RDR’s standards to evaluate six domestic and international messaging apps popular in Iran between January and September 2020. These included four domestic services (Soroush, Gap, Bale, and Bisphone) and two foreign competitors (WhatsApp and Telegram). 

In September 2022, protests erupted across the country following the death of 22-year-old Mahsa Amini, who died in police custody after being arrested by the authorities for “improperly” wearing her hijab. The mass mobilization was sparked over social media, where news of her death spread rapidly. Yet online organizing has been met with “nightly internet and app outages,” according to the New York Times, along with impediments to messaging encryption and restricted Google searches. With fears that an impending internet bill could block what remains of Iranians’ access to social media, we believe this interview, and the work that preceded it, are more important than ever.

The report found that though all companies disclosed something about their privacy policies and terms of service, none of the Iranian companies evaluated disclosed how, or how much, they enforce them. Meanwhile, foreign companies often failed to translate this kind of information into Persian and lacked transparency around language- and country-specific procedures for enforcement. Alongside the report, the authors created a workbook to help Iranian tech companies that wish to respect human rights assess their own policies and begin internal discussions.

Companies generally received the lowest scores for policies relating to the handling of government or third-party requests for users’ information, the censoring of content, and the restriction of accounts. This lack of transparency occurs alongside state-controlled internet localization: The Iranian government has heavily involved itself in the growth of the domestic technology sector. E-government services have recently been added onto messaging apps, raising concerns about data sharing with the government. 

This report makes clear that the threats faced by Iranian social messaging users stem not only from major tech companies, such as Meta-owned WhatsApp, but also from much smaller, localized companies, highlighting the need for accountability across all forms of tech. The research also shows that RDR’s standards have an important role to play in evaluating companies’ respect for users’ freedom of expression and privacy in countries facing extreme government repression. 

The following conversation touches upon the country’s government-led startup boom, the localization efforts that threaten users’ rights, and the unique role that the diaspora plays in advocating for digital rights within Iran.

 

Leandro Ucciferri: Roya, Melody, welcome! Please introduce yourselves.

Roya Pakzad: I’m the founder and director of Taraaz, a research and advocacy nonprofit based in Santa Cruz, California. I’m originally from Iran but I’ve been living in the U.S. for the past 11 years. My background is in electrical engineering and human rights studies. Since 2015, I have been focused on human rights issues in the context of digital technology.

Melody Kazemi: I’m a researcher at the Filterwatch project, which focuses on monitoring and providing analysis on developments around internet policy and digital rights in Iran. We provide monthly outputs on network connectivity, shutdowns, and the overall state of the internet in Iran. This report was part of our collaboration with Roya at Taraaz to look further into the landscape of the private sector and their impact on human rights in Iran.

LU: I think from the outside, for people who are not familiar with the Iranian context, it may be surprising to learn how active the startup and tech community really is and the number of players in the ecosystem. What’s the relationship between users and companies, including startups, like? Is there a lot of trust in startups, in services, in apps? Or, given tight government control, are people really cautious about using them?

RP: There is not much trust because Iranian users know their government, they know that the government would like to have control.

Very early on, after the Green Movement [2009], when the government started blocking Twitter and Facebook, the Iranian diaspora community started working on providing circumvention tools to local users and just generally providing cybersecurity, as well as raising awareness about digital rights issues. There was this kind of capacity in the Iranian community to know that, “Okay, we cannot trust a service that is offered by the government or is controlled heavily by the government.”

Iranians have always known that the government is going to, and wishes to, control the internet and the information available to them.

MK: Yeah, Iranians have always had to live under a very restricted internet. And then, as platforms became more ubiquitous and people became more familiar with them and the government understood the potential uses for mobilizations and for sharing information and their association with social movements, the government began filtering them. So Iranians have always known that the government is going to, and wishes to, control the internet and the information available to them. 

Telegram is the most popular messaging app because it lets you create channels with huge numbers of people (according to Telegram, groups can include up to 200,000 people) that can be used for information sharing or organizing movements. It makes sense that Iran would try to make its own alternatives [to Telegram] and make it a part of its mission to control and localize the internet, not just to filter and block international platforms. Because the amount of information and the sensitivity of the information shared on those platforms is so important to them.

A lot of Iranian messaging apps that have government connections have made profiles without users’ consent by copying profiles from Instagram, for example. And the government has created more and more incentives to push users onto these accounts, for example by offering vaccination appointments through these apps. In one instance, we saw that certain university students needed to have one of the messaging apps in order to be able to get a verification code to access the university’s online platform. So, for these government-affiliated or government-owned apps, it’s not necessarily about how well they’re competing with their international counterparts, it’s just that they’re making sure they become a part of people’s daily lives.

And they’re not going to go away. With the increasing use of internet shutdowns in Iran to disrupt movements, we’ve seen domestic platforms stay online, thanks to our domestic infrastructure, while the global internet is cut off. Some users, just out of sheer necessity, in order to be able to communicate with each other, will have to use these domestic alternatives.

You can therefore see how important messaging apps have become in Iran’s policy of controlling the internet. And it’s especially important in a context where users don’t have any legal protections or regulations that might hold platforms to account—there is no independent judiciary and people are constantly prosecuted for their online behaviors.

With the increasing use of internet shutdowns in Iran to disrupt political movements, we’ve seen domestic platforms stay online while the global internet is cut off.

LU: Is that why you chose to focus on messaging apps for your research?

RP: Since the inception of the Islamic Republic of Iran in 1979, the government has had an information monopoly. Many people began to fear that messaging apps threatened to jeopardize this information monopoly. We heard representatives from [state-run companies] comparing themselves positively to their foreign competitors, like Telegram and WhatsApp, but there wasn’t any substance behind their claims. So, with the RDR Index in hand, we had an opportunity to tell them, “Here’s an actual ranking that you can use to compare yourselves based on international standards.”

We tried to look at state-run and other telecommunications companies—Hamrahe Aval (MCI) and MTN Irancell, for example—but they don’t disclose enough information about their privacy policies and terms of service. We wanted to use the RDR methodology as an introduction to the concept of business and human rights and the role of the private sector in upholding human rights. We wanted to showcase services that have at least a minimum in terms of privacy policies.

We wanted to use the RDR methodology as an introduction to the concept of business and human rights and the role of the private sector in upholding human rights.

Aliya Bhatia: With this interest in building local competitors as a means to localize the internet, can you tell me more about how the government views foreign companies? Has there been antagonism against local companies that don’t comply? There are incentives to drive user engagement, but is there any proposed regulation to restrict the use of these local or even foreign apps?

MK: There are problems with intermediary liability issues. We’re seeing a trend where the owners and founders and creators of these platforms are being held liable for user-generated content or user behavior on these platforms. The law isn’t clear and the Iranian government immediately goes after them, arrests them and fines them, or hands out prison sentences. We’ve seen this quite a few times in the past year at least, if not longer.

The second big thing that every Iranian has been talking about is this draft legislation that we call the “User Protection Bill.” It has changed names a number of times, so if you hear about an internet bill in Iran, it’s probably this one. The government has also been looking into creating state-sanctioned VPNs that the government can control. Users would have to qualify through a government scheme to use them. Based on who you are, based on identity or profession, they would grant you a different level of access to the internet, something our colleague Kaveh Azarhoosh called “layered filtering,” which is sort of the grander vision for Iran’s internet.

There’s been a huge amount of public backlash about the bill from ordinary users, but also a lot of vocal backlash and criticism from Iran’s tech sector about the dangers that filtering international services will pose to them. There’s been a huge petition, signed by over a million people in Iran, to stop the progress of the bill. And as we’re talking, it’s still going through the parliamentary process where it’s being reviewed. In April [2022], Iranian deputies voted to dissolve the committee which had been charged with determining the future of the bill, which means that this is now in the hands of the entire Iranian parliament.

AB: Correct me if I’m wrong, but it seems that filtering makes it inevitable that most Iranians would have multiple messaging apps on their device. Do locals prefer to use international apps or is there a similar movement amongst citizens to use domestic apps instead of WhatsApp and Telegram?

RP: Before Telegram was being filtered, or blocked, there was, as mentioned, a lot of interest in Telegram because Telegram is super user-friendly from a technical standpoint. Once it became filtered, users moved to WhatsApp. People want to be in touch with, or access, certain channels and information, so they choose a messaging app based on that. They can use VPNs to use Telegram and then if they cannot use VPNs, they use WhatsApp. The real issue is with this e-government shift, where you have to use certain apps to register for your entrance exams for university, or you need your vaccination card, or you need to use the e-Health app, the list goes on. So yes, it will become inevitable to use multiple apps.

MK: Definitely. There’s an article from Insider that compares the local alternatives in Iran to their international competitors. Iranians still prefer international apps, Telegram and WhatsApp, to domestic equivalents. Iranians have gotten quite tech savvy with VPNs and trying to go around filtering. So natural, organic uptake of these local apps is quite low comparatively.

LU: How did you approach adapting the methodology to the Iranian context? From previous conversations with Roya, I know that it was a huge decision to deploy the whole methodology, which is a big undertaking. Walk us through a little bit of that process.

RP: For a long time now, the Iranian digital rights ecosystem has been Iranian people resisting government censorship and the Iranian government trying to censor the internet. If you read literature from 2008 until 2016, you see that civil society wasn’t really focusing on the role of companies in their digital rights advocacy. The focus was mainly on government censorship. So we wanted to say, “Oh no, there are so many actors in the middle, and we have to focus on them because they have a responsibility too.” Part of that was just introducing the idea of corporate social responsibilities.

We wanted to introduce GNI (Global Network Initiative) into the conversation. GNI is a non-governmental organization that assists companies in respecting freedom of expression and privacy rights when they face government pressure to hand over user data or to remove or restrict content. We wanted to introduce multi-stakeholder engagement. We wanted to introduce human rights impact assessments. We wanted to introduce Ranking Digital Rights’s great index and show that you can use it to evaluate yourself as a company, or journalists can use it to evaluate you. That’s why we didn’t just pick certain indicators; we used all of them. The main purpose was educational: introducing the idea of business and human rights, along with human rights due diligence and human rights impact assessment policies.

We did have to adapt for the context of Iran and its current lack of discussion about business and human rights. The other thing we noted is e-government services being an add-on to other services. The government also incentivizes the use of Iranian messaging apps by making the data you use for them cheaper than the data you use to access foreign apps like Telegram or WhatsApp. If you don’t have enough money to pay for VPNs, you can only use Iranian apps, which penalizes people because of their socio-economic status, as the government changes the tariff for data, for example. In the context of Iran, we had to pay attention to the narrative that we use and explain why we are using privacy and freedom of expression indicators and mixing them with a discussion of the socio-economic context.

LU: What are some of the takeaways and impact achieved since the report was first released?

RP: Kaveh and I also recently worked with the Iran Academia’s MOOCs program to record a lecture based on our RDR report. We have seen a lot of attention directed at the role of technology companies and technologists in digital rights in Iran. The gap that we saw back in 2017, with regards to the lack of attention to the private sector, has been shrinking dramatically in just a year. We have seen so much mobilizing, dialogue, and resistance from the tech ecosystem in Iran against government policy, like tech companies putting up banners on their websites publicly announcing their objection to the bill. There have also been cases of naming and shaming public-private partnerships and contracts.

Companies have told us, informally and through back channels, that they are interested in using the workbook to revise their policies and update them. A non-ranked company even asked me to give a talk in their forums and for their employees (which I decided not to do, because I was worried about getting them in trouble). We have seen ICT journalists inside the country using approaches from the RDR Index to compare company policies.

LU: Roya, you mentioned to me that you had a lot of engagement with some of the local messaging app companies. What were those interactions like?

RP: Company engagement is something that we learned a lot from. The companies that we evaluated completely ignored us, to be honest with you. Sometimes we saw that some people from the evaluated company added us on LinkedIn. So we knew that they read the report, but they didn’t engage, even though we contacted them over email, we sent Twitter messages, we sent LinkedIn messages.

But non-evaluated companies, such as marketplace apps, said, “Oh we want to update our policies and we will use the workbook.” Because they were not evaluated they were like, “Okay, we are safe.” They interacted with us and with journalists and students in tech policy; they were interested. I think Melody and I, and the general diaspora community, always have these concerns about how to engage and how to balance safety with engagement.

So those are some things that might be helpful for researchers who, like us, cannot go back to their country, yet always have to worry about the safety of the people inside the country. We always talk about this: What is going to happen to someone if we mention their name, their company, or their policy? That will be an important consideration for researchers from diaspora communities like ours who are going to adapt Ranking Digital Rights’s methodology.

Interested in more? You can follow Taraaz at @TaraazResearch; Roya is also on Twitter at @RoyaPak.

The Role of Domestic Messaging Apps in Iran’s Information Controls by Melody Kazemi can be found on the Filterwatch site.

The full report is also available online: Digital Rights and Technology Sector Accountability in Iran.

If you’re a researcher or advocate interested in learning more about our methodology, our team would love to talk to you! Write to us at partnerships@rankingdigitalrights.org.