Elements of Successful Ranking Systems

by Jon Diamond

Preliminary Recommendations

1.     Develop mission statement
2.     Define primary target audiences for the rankings
3.     Allow primary target audience concerns to help drive indicator formulation
4.     Cite sources for all input data
5.     Prioritize indicators of performance—even if data is not initially available
6.     Publicize results of company engagement to the extent possible
7.     Normalize performance indicators according to context (e.g. sector, legal jurisdiction)
8.     Include forward-looking indicators of both performance and disclosure
9.     Subscribe to recognized research quality standard(s), e.g. ARISTA 3.0, GISR, etc.
10.   Ensure full transparency at every step of the process

Defining Materiality

The best rankings provide value both to the companies they rank and to their stakeholders by acting as a middleman of sorts, relaying valuable information from one party to the other. Many of the most effective rankings target their research and analysis toward a specific subset of stakeholders, e.g. consumers or investors, rather than targeting all stakeholders simultaneously. Those targeting more than one stakeholder should make clear which elements are meant for whom. For example, the Enough Project’s conflict minerals index [pdf] targets consumers specifically, arguing that “consumer activism will…serve as a motivator for corporate action.” Oxfam’s Behind the Brands similarly focuses on “the people who buy and enjoy [the ranked companies’] products.”

Transparency International UK’s Defence Companies Anti-Corruption Index, on the other hand, serves as an example of ratings which target multiple stakeholders simultaneously. It specifically cites defense company leaders, institutional investors, defense ministers and procurement chiefs, and civil society as target audiences and offers suggestions on how each constituency can make use of the ratings.

Benefits

  • Opportunity for stakeholder engagement. What is most material to the rankings’ primary audience?
  • Transparent approach to defining indicators.
  • Useful way of keeping indicators to a manageable number.
  • Ensures relevance to companies and stakeholders.

Challenges

  • If materiality is defined along the lines of stakeholders’ concerns, will need also to define the universe of stakeholders to be consulted.
  • May need to reconcile gaps between stakeholder concerns and underlying criteria based on international human rights norms.

Company Engagement

Rating the Raters—a project of independent think tank and strategic advisory firm SustainAbility assessing the quality of corporate sustainability ratings—notes that most rating systems engage with companies only at the beginning and end of the rating process. Out of a universe of 23 ratings selected for in-depth evaluation, Rating the Raters found that the most successful were those that “[take] a systematic approach to engaging companies in the ratings process,” e.g. through “webinars and in-person workshops.”

The CPA-Zicklin Corporate Accountability Index [pdf], which measures corporate disclosure and accountability with respect to political spending, consults with companies during the scoring process. The Index’s 2012 report noted that 45% of companies surveyed responded to correspondence with questions and comments and that “many companies committed to or implemented increased disclosure and oversight of political spending” as a result. Transparency International UK’s Defence Companies Anti-Corruption Index not only offers to discuss initial desk research with the companies themselves, but documents which companies respond and provides separate scorecards for public and internal information.

Benefits

  • Build trust with companies and stakeholders.
  • Add value for the rankings’ target audience by bringing new information to the table.
  • Provide better insights on how policies are translated into practice.
  • Directly encourage improvements in company policies/practices.

Challenges

  • Though a certain degree of company input may build trust and improve the methodology, too much input risks turning rankings into a PR tool for the most active companies.
  • Companies may not be willing/able to publicly release details provided to raters.

Performance and Disclosure

Because many rankings rely predominantly on publicly available information, there is a strong tendency to measure only the extent to which companies disclose details on their policies and practices (i.e. their transparency). Talk is, however, arguably cheap: rankings should aim to evaluate how companies implement their policies, given the opportunity to do so. EFF’s Who Has Your Back? ratings, for instance, cover both transparency (what service providers declare in their terms of service, acceptable use policies, and privacy policies) and performance (legal defense of their users’ rights).

In a similar manner, Oxfam’s Behind the Brands [pdf] explicitly covers four categories of indicators spanning transparency and performance (awareness, knowledge, commitments, and supply chain management), which intersect with the ratings’ seven themes, or issue areas. The Access to Medicine Index [pdf] also analyzes seven technical areas of focus along the four “strategic pillars” of commitment, transparency, performance, and innovation. Of these, performance bears the greatest weight at 40%, “reflecting the widely held view that monitoring and evaluating performance drives results.”
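A weighted pillar scheme like the one described above can be sketched in a few lines. Note that only the 40% performance weight is stated in the Access to Medicine Index methodology; the other weights below are hypothetical placeholders chosen so that all four sum to 1.0.

```python
# Sketch of a weighted "pillar" aggregation. Only the 40% performance
# weight is taken from the source; the remaining weights are invented
# for illustration.
PILLAR_WEIGHTS = {
    "commitment": 0.20,    # hypothetical
    "transparency": 0.20,  # hypothetical
    "performance": 0.40,   # stated in the Index methodology
    "innovation": 0.20,    # hypothetical
}

def overall_score(pillar_scores: dict) -> float:
    """Combine per-pillar scores (each on a 0-100 scale) into one weighted total."""
    assert abs(sum(PILLAR_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(PILLAR_WEIGHTS[p] * pillar_scores[p] for p in PILLAR_WEIGHTS)

company = {"commitment": 70, "transparency": 55, "performance": 80, "innovation": 60}
print(overall_score(company))  # → 69.0; performance moves the total most
```

Because performance carries double the weight of any other pillar, a ten-point gain there moves the overall score twice as far as the same gain elsewhere, which is exactly the incentive the Index describes.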

Benefits

  • Capture added value from company engagement. Companies may be able to direct attention to performance indicators that more closely reflect how they handle their own peculiar challenges.
  • Might lower scores across the board but would hopefully encourage higher performance (and not simply transparency) over time.

Challenges

  • Companies may not have had distinct opportunities to “perform,” e.g. because they are small or because they operate in exceptionally liberal (or illiberal) political climates. Solution: Measure performance relative to the challenges companies face.
  • By definition, more difficult to gather data on performance than on transparency.
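The suggestion above, measuring performance relative to the challenges companies face, echoes the earlier recommendation to normalize indicators by context (e.g. sector). One minimal way to do this is to rescale each company's raw score against the range observed among its sector peers rather than the whole universe. The sector labels and numbers below are purely illustrative.

```python
# Sketch of within-sector min-max normalization: each company is scored
# 0-1 against its own sector's range, so peers face comparable baselines.
# Company names, sectors, and raw scores are invented for illustration.
from collections import defaultdict

def normalize_within_sector(raw):
    """raw maps company -> (sector, raw_score); returns 0-1 scores per sector."""
    by_sector = defaultdict(list)
    for company, (sector, score) in raw.items():
        by_sector[sector].append(score)
    normalized = {}
    for company, (sector, score) in raw.items():
        lo, hi = min(by_sector[sector]), max(by_sector[sector])
        # If a sector has one score (or all identical), park everyone mid-range.
        normalized[company] = 0.5 if hi == lo else (score - lo) / (hi - lo)
    return normalized

scores = {"A": ("telecom", 30), "B": ("telecom", 60),
          "C": ("mining", 10), "D": ("mining", 40)}
print(normalize_within_sector(scores))  # C tops mining's range despite a low raw score... no: C is mining's floor, D its ceiling
```

Under this scheme a mining company with a raw score of 40 can rank as high within its sector as a telecom company with 60, which is the point: the comparison is against peers facing similar constraints.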

Looking Forward

Rankings should strive for a balance between what GISR’s 2013 report [pdf] describes as “lagging vs. leading,” or backward- vs. forward-looking measures of performance. Rating the Raters also notes that the most successful rankings include forward-looking criteria. The Dow Jones Sustainability Index and Maplecroft Climate Innovation Index both engage with companies on questions of innovation, for example.

Benefits

  • Another opportunity to capture added value from company engagement. Most backward-looking indicators can be gleaned from the public domain, whereas forward-looking indicators are more likely to come from close collaboration with companies themselves.
  • Accelerate the “race to the top” by encouraging companies to pursue improvements in both the near- and long-term.

Challenges

  • Time-consuming to engage with each company individually.
  • Difficult to define forward-looking indicators from anything other than a policy perspective—unless the company has made concrete (e.g. financial, contractual) commitments to certain initiatives.

Transparency

Rankings should aim for maximum transparency with respect to their methodologies as well as their relationships with companies and stakeholders. To the extent possible, rankings should make public their ranking criteria and scoring schemes, including details on indicator weightings, how ambiguities in the data are resolved, and quality control mechanisms. The Access to Medicine Index provides a detailed description of all facets of its methodology, from the scope of companies ranked and geographical areas covered to the specific criteria and weighting schemes used. The Enough Project as well as EFF’s Who Has Your Back? provide comprehensive descriptions of their scoring schemes, both of which involve a simple point system. The Dow Jones Sustainability Index similarly details the ways in which it measures a wide range of variables including “intangibles”—though its system is more complex.
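The "simple point system" mentioned above is the easiest scheme to publish in full: each criterion is worth one point, and a company's score is the count of criteria it satisfies. The criteria names in this sketch are invented placeholders, not the actual items used by EFF or the Enough Project.

```python
# Minimal sketch of a fully transparent point system: the criteria list
# is the entire published methodology. Criteria names are hypothetical.
CRITERIA = [
    "publishes_transparency_report",
    "requires_legal_process_for_user_data",
    "notifies_users_of_government_requests",
]

def score(company_answers):
    """One point per satisfied criterion; unanswered criteria earn nothing."""
    return sum(1 for c in CRITERIA if company_answers.get(c, False))

print(score({"publishes_transparency_report": True,
             "notifies_users_of_government_requests": True}))  # → 2 of 3
```

Because the entire scheme is the published criteria list, any reader can recompute a company's score from the documented evidence, which is what makes such systems auditable.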

Rating the Raters additionally suggests providing details on the ranking team, e.g. their experience and level of involvement, as well as specific sources for all input. EFF’s Who Has Your Back? documents the sources for each of its claims.

Benefits

  • Improves credibility and builds trust with companies and stakeholders.
  • Provides an avenue for constructive criticism and engagement with companies, stakeholders, and peer organizations.

Challenges

  • Insofar as companies provide information unavailable in the public domain, it may be difficult to cite all sources consulted during research.
