Following the launch of her pivotal book Consent of the Networked, which called on civil society to hold Big Tech accountable for human rights, Rebecca MacKinnon founds RDR.
RDR collaborates with Consumer Reports to develop a digital standard to measure the privacy and security of products, apps, and services.
The UN Human Rights Council’s report on platform content regulations cites RDR’s findings and recommendations.
Algorithmic and targeted advertising indicators are added to RDR’s methodology and standards, reflecting a growing consensus about how the surveillance advertising business model works.
RDR’s seminal “It’s the Business Model” report series is published. The series argues that the rampant misinformation and hate speech spreading on social media platforms does not stem solely from a lack of content moderation, but is instead a product of the business model itself.
RDR begins direct investor engagement aimed at helping investors craft proposals, global engagement to support civil society groups adapting the RDR methodology, and policy engagement on issues like surveillance advertising.
RDR produces its first mini report, evaluating TikTok against its peers.
RDR joins other civil society organizations for the launch of the Santa Clara Principles 2.0, an initiative to provide clear transparency guidelines for digital platforms.
GLAAD releases its second Social Media Safety Index, evaluating the safety of LGBTQ users on five major platforms, with a Platform Scorecard based on the RDR methodology. A third edition follows in June 2023.
RDR celebrates 10 years of tech accountability in action and begins reevaluating its methods and standards to ensure they best serve its stakeholders, including civil society and investors.
RDR begins work on a new Generative AI Accountability Index.