From LGBTQ Rights Online to User Rights in Iran: Hear Directly from RDR’s Partners

Since 2018, organizations across various socio-political contexts have adapted RDR’s methodology to hold platforms and other ICT services accountable. What do they all have in common? A commitment to protecting the rights of some of the world’s most marginalized and discriminated-against populations. Two of the most successful and notable adaptations: measuring how well social media companies protect LGBTQ rights online, and evaluating messaging apps under conditions of state repression in Iran.

The former was conducted by American LGBTQ media advocacy organization GLAAD to create its now-annual Social Media Safety Index (SMSI), which includes a Platform Scorecard evaluating five top platforms (Facebook, Instagram, Twitter, YouTube, and TikTok) on how well they protect LGBTQ users. Among the main issues identified by Jenni Olson, Senior Director of the Social Media Safety Program, as harming LGBTQ users are inadequate content moderation and enforcement, harmful algorithms, and a lack of transparency. The SMSI sheds light on how online hate speech, misinformation, and conspiracy theories targeting LGBTQ people are able to spread unchecked. Each year, the SMSI has made headlines. But the 2023 SMSI stirred the most controversy for the large drop in score experienced by Twitter. The company, now under the helm of Elon Musk, fell 12 points (all other platforms saw their scores increase), making it the most dangerous platform for LGBTQ people this year.

The second adaptation was used to create a report called “Digital Rights & Technology Sector Accountability in Iran,” a joint collaboration between Filterwatch and Taraaz examining both local and international messaging apps. When Roya Pakzad, Taraaz’s Founder and Director, last spoke to us about her work on the report, protests were ongoing over the death of 22-year-old Mahsa Amini, who died in police custody after being arrested over the “improper” wearing of her hijab. The mass mobilization that followed was sparked over social media and was met with internet shutdowns and outages. Meanwhile, the government was being accused of pushing through the “draconian” Internet User Protection Bill, which would vastly curtail what Iranians can access on the web.

These two conversations with Jenni and Roya were first published in 2022, as part of our interview series, Digital Rights Dialogues. Among other things, we spoke to them about why they chose to use the RDR methodology, how they adapted it, and how it is helping them achieve their goals:

Jenni Olson: We had been thinking about doing a scorecard and trying to decide how to go about that. We knew that we wanted to lean on someone with greater expertise. We looked to Ranking Digital Rights as an organization that is so well respected in the field. We wanted to do things in a rigorous way. We connected with RDR and you guys were so generous and amenable about partnering. RDR then connected us with Goodwin Simon Strategic Research, with Andrea Hackl [a former research analyst with RDR] as the lead research analyst for the project. That was such an amazing process and, yes, a lot of work. With Andrea, we went about developing the 12 unique LGBT-specific indicators and then Andrea attended some meetings with leaders at the intersection of LGBT, tech, and platform accountability and honed those indicators a little more and then dug into the research. For our purposes, the scorecard seemed like a really powerful way to illustrate the issues with the platforms and have them measured in a quantifiable way.

Though it’s called the “Social Media Safety Index,” we’re looking not only at safety, but also at privacy and freedom of expression. We developed our indicators by looking at a couple of buckets. The first being hate and harassment policies: Are LGBTQ users protected from hate and harassment? The second area was around privacy, including data privacy. What user controls around data privacy are in place? How are we being targeted with advertising or algorithms? Then the last bucket would be self-expression in terms of how we are, at times, disproportionately censored. Finally, there is also an indicator around user pronouns: Is there a unique pronoun field? Due to lack of transparency, we can’t objectively measure enforcement.

What we end up hearing the most about is hate speech, but it’s important to note that LGBTQ people are also disproportionately impacted by censorship. We’re not telling the platforms to take everything down. We’re simply asking them to enforce the rules they already have in place to protect LGBTQ people from hate. 

I’m not naïve enough to believe that the companies are just going to read our recommendations and say “Oh wow, thank you, we had no idea, we’ll get right on that, problem solved, we’re all going home.” This kind of work is what GLAAD has done since 1985: create public awareness and public pressure and maintain this public awareness and call attention to how these companies need to do better.

There are times when it feels so bad and feels so despairing like, “Oh, we had this little tiny victory but everything else feels like such a disaster.” But then I remind myself: This is why this work is so important. We do have small achievements and we have to imagine what it would be like, how much worse things would be, if we weren’t doing the work. I’m not naïve that this is going to create solutions in simple ways. It is a multifaceted strategy and, as I mentioned a minute ago, it is also really important that we’re working in coalition with so many other civil society groups, including with Ranking Digital Rights. It’s about creating visibility, creating accountability, and creating tools and data out of this that other organizations and entities can use. A lot of people have said, “We’re using your report, it’s valuable to our work.”

Roya Pakzad: For a long time now, the Iranian digital rights ecosystem has been Iranian people resisting government censorship and the Iranian government trying to censor the internet. If you read literature from 2008 until 2016, you see that civil society wasn’t really focusing on the role of companies in their digital rights advocacy. The focus [of that advocacy] was mainly on government censorship. So we wanted to say, “Oh no, there are so many actors in the middle, and we have to focus on them because they have a responsibility too.” Part of that was just introducing the idea of corporate social responsibilities.

We wanted to introduce GNI (Global Network Initiative), a non-governmental organization that assists companies in respecting freedom of expression and privacy rights when faced with pressure from governments to hand over user data or remove or restrict content, into the conversation. We wanted to introduce multi-stakeholder engagement. We wanted to introduce human rights impact assessments. We wanted to introduce Ranking Digital Rights’s great index and show that you can use that for evaluating yourself as a company, or journalists can use it to evaluate you. That’s why we didn’t just pick certain indicators [from Ranking Digital Rights’s methodology]; we used all of them, because the main purpose was an educational approach with the idea of business and human rights, introducing all of the ideas of human rights due diligence and human rights impact assessment policies.

We did have to adapt for the context of Iran and its current lack of discussion about business and human rights. The other thing we noted is that e-government services are bundled as add-ons to other services. The government also incentivizes Iranian messaging apps by making them cheaper to use than Telegram or WhatsApp: the data you pay for costs less than data to access foreign apps. If you don’t have enough money to pay for VPNs, you can only use Iranian apps, which penalizes people because of their socio-economic status, as the government changes the tariff for data, for example. In the context of Iran, we had to pay attention to the narrative we used and explain why we were using privacy and freedom of expression indicators and mixing them with a discussion of the socio-economic context.

[A colleague] and I also recently worked with the Iran Academia’s MOOCs program to record a lecture based on our RDR report. We have seen a lot of attention directed at the role of technology companies and technologists in digital rights in Iran. The gap that we saw back in 2017, with regards to the lack of attention to the private sector, has been shrinking dramatically in just a year. We have seen so much mobilizing, dialogue, and resistance from the tech ecosystem in Iran against government policy, like tech companies putting up banners on their websites publicly announcing their objection to the [Internet User Protection bill]. There have also been cases of naming and shaming public-private partnerships and contracts.

Companies have told us, informally and through back channels, that they are interested in using the workbook [we produced with the report] to revise their policies and update them. A non-ranked company even asked me to give a talk in their forums and for their employees (which I decided not to do, because I was worried about getting them in trouble). We have seen ICT journalists inside the country using approaches from the RDR Index to compare company policies.

Company engagement is something that we learned a lot from. The companies that we evaluated completely ignored us, to be honest with you. Sometimes we saw that some people from the evaluated company added us on LinkedIn. So we knew that they read the report, but they didn’t engage, even though we contacted them over email, we sent Twitter messages, we sent LinkedIn messages.

But non-evaluated companies, such as marketplace apps, said, “Oh, we want to update our policies and we will use the workbook.” Because they were not evaluated, they were like, “Okay, we are safe.” They interacted with us and with journalists and students in tech policy; they were interested.

If you are interested in adapting the RDR methodology to achieve your corporate accountability goals in the tech sector, please get in touch at partnerships@rankingdigitalrights.org or visit our Research Lab for more information.
