Misinformation By Design: RDR Helps Lay Bare the Surveillance Advertising Business Model

In 2018, the Cambridge Analytica scandal helped propel the perils of surveillance capitalism into the mainstream. The following year, the release of Shoshana Zuboff’s pivotal book, The Age of Surveillance Capitalism, cemented data privacy and targeted advertising as defining problems of our time, not just for a bevy of experts but for the public at large. It was in this context that Ranking Digital Rights released its first major report, It’s the Business Model. The report argued that the rampant misinformation and hate speech spreading across social media platforms was not solely the product of lax content moderation, and therefore could not be addressed through intermediary liability reform alone (in other words, by getting rid of Section 230).

Rather, the report argued that the pathologies of the online environment were the downstream result—a negative externality, in economics terms—of the incentives created by the industry’s targeted-advertising business model: collect and monetize all data, automate everything, scale up, and wait for the profits to roll in.

The report was influenced by recent changes that RDR had made to its methodology, as the consensus around these trends and their pervasiveness in the industry began to solidify. These changes included the addition of new indicators on algorithms and targeted advertising. As the report’s lead author and RDR’s former Policy Director, Nathalie Maréchal, recalls, “the Big Tech business models had all kind of started to converge toward the collection and monetization of data, either for the purpose of advertising or for the purpose of AI development.” For these companies, the acquisition of data became both “a business imperative, and also an ideological imperative.”

This was different from how things were back when the methodology for the first RDR Corporate Accountability Index was conceived in 2013. At the time, most of RDR’s indicators evaluated either “things that companies were doing at the behest of governments or things that basically amount to negligence [for example, poor data security].” But, since then, it had become clear, both to Maréchal and to RDR Founder Rebecca MacKinnon, that companies also made a lot of decisions based purely on their own self-interest. Meanwhile, Nathalie found herself fed up with the reigning policy discourse in D.C. and Brussels, which gave the impression that “the only thing wrong with social media is that CEOs are insufficiently motivated to do content moderation correctly.”

Sara Collins, Senior Policy Counsel at Public Knowledge, agrees. For a while, most D.C. policy discussions about how platforms “may spread misinformation and threaten democracy” would, reflexively, also become discussions about “how to get rid of Section 230.” As she explains, the report helped “thread the needle about why [data collection] has residual content harms.” This is especially important for organizations like Public Knowledge, which places a strong emphasis on free speech online.

Nathalie recalls a metaphor MacKinnon shared with her at the time: relying on content moderation alone is like trying to remove pollutants from a stream with a pipette. It was clearly never going to be enough to clean up the polluted lake that the vast networks of disinformation across platforms had become. The It’s the Business Model report was conceived as part of a necessary narrative shift, drawing on RDR’s new indicators to strengthen the connection between the business model and the harms RDR was observing directly through its company evaluations and its close relationships with global civil society organizations.

Changing the Conversation: The Report Comes Out

Unfortunately, the report’s release event was planned for March 17, 2020, just as the COVID-19 pandemic was bringing the world to a halt, and the launch event had to be canceled. Even so, the report had important implications across the policy sphere, with a number of allies reporting that it had a decisive effect on their thinking about the business model.

Jesse Lehrich, Co-Founder of Accountable Tech, explained that “the It’s the Business Model report was really critical and ahead of its time as far as moving the advocacy community and policymakers to think beyond content moderation and deplatforming.” He describes the report as “formative” in shaping a lot of his organization’s work and, in particular, their “Ban Surveillance Advertising” campaign, which brought together over 50 organizations around the globe. A focus on the surveillance advertising business model has also served as a way to “break down silos” between different parts of the advocacy community, Jesse points out. 

Privacy advocates, civil rights groups, and anti-monopoly activists are sometimes at odds; but the business model was something they could all coalesce around. And this remains true as the community begins to grapple with the potential impacts of AI. In fact, the AI Now Institute, in a recent report, referenced Accountable Tech’s campaign to ban surveillance advertising as an important model. Though the arguments in RDR’s report spoke most clearly, at the time, to surveillance advertising, Nathalie Maréchal agrees that “today we see the same cold logic applied in the world of artificial intelligence and automated decision-making.”

Meanwhile, Jesse believes the report also played an important role in galvanizing legislation and regulatory frameworks that have come about since. He points, for example, to the inclusion of bans on targeted advertising to children and on the use of sensitive data for targeting in the EU’s Digital Services Act (DSA) as the kind of regulatory response that was made possible thanks, in part, to RDR’s work. Sara Collins agrees, noting that, “I do really think that [the report] has shaped how people are talking about the content space. You still obviously get the Section 230 bills, but now that’s not the only solution put forward.”

At the time of the report’s release, Anna Lenhart was working on tech oversight for Representative David Cicilline on the House Judiciary Committee. One of her prime areas of focus was ad targeting and ad libraries, and understanding what kind of information is needed to measure discrimination in ad targeting. By 2021, Anna was advising on a number of potential bills taking aim at the surveillance advertising industry, including Congresswoman Lori Trahan’s Social Media Data Act, which would require companies to maintain thorough ad libraries to bring transparency to ad targeting. One thing Anna looked for while conducting her research was reports that provided examples of potentially problematic advertising and ad campaigns, and that is exactly what she found in the It’s the Business Model report. “It’s always really helpful [to have examples] when you’re trying to tell the story to constituents or briefing members of Congress,” she explains.

That year, Nathalie was called upon to testify on the Hill, and Anna requested her expertise during several meetings while the Congresswoman and her staff worked to craft the bill. Notably, Anna’s former boss, Congressman David Cicilline, also made several references to the “business model” during an antitrust hearing while grilling the leaders of the major Big Tech companies. Finally, at the international level, UN Special Rapporteur Irene Khan referenced the “business model” in an important report to the UN Human Rights Council on “disinformation and freedom of opinion and expression.”

Though the report was released at a time when other events and thinkers were also helping to shift the conversation, RDR’s report played an essential role, at a pivotal moment, in further popularizing the idea of the “business model” as the real root of the growing problems of mis- and disinformation. Its release came at just the right time to galvanize policymakers and civil society alike and to leave a lasting imprint on ongoing policy conversations, conversations that have taken on new meaning and urgency as the AI arms race accelerates.
