F3(c). Advertising targeting rules and enforcement

The company should clearly disclose its policies governing what types of advertising targeting are prohibited.

Elements:

  1. Does the company clearly disclose whether it enables third parties to target its users with advertising content?
  2. Does the company clearly disclose what types of targeting parameters are not permitted?
  3. Does the company clearly disclose that it does not permit advertisers to target specific individuals?
  4. Does the company clearly disclose that algorithmically generated advertising audience categories are evaluated by human reviewers before they can be used?
  5. Does the company clearly disclose information about the processes and technologies it uses to identify advertising content or accounts that violate the company’s rules?

Definitions:

Account / user account — A collection of data associated with a particular user of a given computer system, service, or platform. At a minimum, the user account comprises a username and password, which are used to authenticate the user’s access to his/her data.

Advertiser — A person or entity that has created and/or paid for advertising content. The advertiser typically determines the targeting parameters for each advertisement.

Advertising content policies — Documents that outline a company’s rules governing what advertising content is permitted on the platform.

Advertising targeting policies — Documents that outline a company’s rules governing what advertising targeting parameters are permitted on the platform.

Advertising audience categories — Groups of users, identified for the purpose of delivering targeted advertising, who share certain characteristics and/or interests, as determined on the basis of user information that a company has either collected or inferred.
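
As a purely illustrative sketch (not any company’s actual practice), the short Python example below shows one way an audience category of this kind could be represented: a named group of users who share an inferred interest, assembled from collected and inferred profile information. All field names, values, and the grouping rule are assumptions made for illustration.

```python
# Hypothetical illustration of an advertising audience category:
# a named group of users who share a characteristic or interest,
# derived from information the platform has collected or inferred.
# Field names and values are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    collected_location: str                                # information collected from the user
    inferred_interests: set = field(default_factory=set)   # information inferred about the user

def build_audience(profiles, shared_interest):
    """Group users into an audience category based on a shared inferred interest."""
    return {p.user_id for p in profiles if shared_interest in p.inferred_interests}

profiles = [
    UserProfile("u1", "Berlin", {"running", "travel"}),
    UserProfile("u2", "Lagos", {"travel"}),
    UserProfile("u3", "Lima", {"cooking"}),
]

# A "frequent travelers" audience category: users whose inferred interests include "travel".
print(build_audience(profiles, "travel"))  # -> {'u1', 'u2'} (set order may vary)
```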

Algorithms — An algorithm is a set of instructions used to process information and deliver an output based on the instructions’ stipulations. Algorithms can be simple pieces of code, but they can also be incredibly complex, “encoding for thousands of variables across millions of data points.” In the context of internet, mobile, and telecommunications companies, some algorithms—because of their complexity, the amounts and types of user information fed into them, and the decision-making function they serve—have significant implications for users’ human rights, including freedom of expression and privacy. See more at: “Algorithmic Accountability: A Primer,” Data & Society: https://datasociety.net/wp-content/uploads/2018/04/Data_Society_Algorithmic_Accountability_Primer_FINAL-4.pdf
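
To make this definition concrete, here is a minimal, hypothetical sketch of an algorithm in the sense described above: a fixed set of instructions that takes user information as input and produces an output, in this case an inferred advertising interest. The keywords, threshold, and category name are invented for illustration; real ad-delivery algorithms are vastly more complex.

```python
# Hypothetical sketch of an "algorithm" as defined above: a fixed set of
# instructions that processes input (a user's page-view history) and delivers
# an output (an inferred advertising interest) based on those instructions.
# Keywords, threshold, and category name are illustrative assumptions.

from __future__ import annotations

def infer_interest(pages_viewed: list[str]) -> str | None:
    """Return an inferred interest category if enough matching pages were viewed."""
    sports_keywords = {"football", "basketball", "tennis"}
    matches = sum(
        1 for page in pages_viewed
        if any(keyword in page.lower() for keyword in sports_keywords)
    )
    # The "instructions' stipulations" here are a simple threshold rule.
    return "sports_enthusiast" if matches >= 3 else None

history = ["news/football-scores", "weather/today", "blog/tennis-tips", "shop/basketball-shoes"]
print(infer_interest(history))  # -> sports_enthusiast
```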

Clearly disclose(s) — The company presents or explains its policies or practices in its public-facing materials in a way that is easy for users to find and understand.

Targeting parameters — The conditions, typically set by the advertiser, that determine which users will be shown the advertising content in question. This can include users’ demographics, location, behavior, interests, connections, and other user information.
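
The hypothetical sketch below illustrates how targeting parameters, as defined here, function as advertiser-set conditions that determine which users are shown an advertisement. The parameter names, values, and matching logic are assumptions chosen only to illustrate the concept.

```python
# Hypothetical sketch of targeting parameters: conditions set by the advertiser
# that determine which users are shown a given advertisement.
# Parameter names, values, and matching rules are illustrative assumptions.

def matches_targeting(user: dict, params: dict) -> bool:
    """Return True if the user satisfies every targeting parameter the advertiser set."""
    if "locations" in params and user.get("location") not in params["locations"]:
        return False
    if "min_age" in params and user.get("age", 0) < params["min_age"]:
        return False
    if "interests" in params and not (params["interests"] & user.get("interests", set())):
        return False
    return True

# Targeting parameters chosen by a hypothetical advertiser.
targeting_params = {
    "locations": {"Accra", "Nairobi"},
    "min_age": 18,
    "interests": {"fitness"},
}

user = {"location": "Accra", "age": 29, "interests": {"fitness", "music"}}
print(matches_targeting(user, targeting_params))  # -> True
```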

Third party — A “party” or entity that is anything other than the user or the company. For the purposes of this methodology, third parties can include government organizations, courts, or other private parties (e.g., a company, an NGO, an individual person).

Users — Individuals who use a product or service. This includes people who post or transmit the content online as well as those who try to access or receive the content. For indicators in the freedom of expression category, this includes third-party developers who create apps that are housed or distributed through a company’s product or service.

Indicator guidance: The ability of advertisers or other third parties to target users with tailored content—based on their browsing behaviors, location information, and other data and characteristics that have been inferred about them—can significantly shape a user’s online ecosystem. Targeting, which can include both paid and unpaid content, can amplify offline social inequities and can be overtly discriminatory. It can also create so-called “filter bubbles” and spread problematic content, including content intended to mislead or to spread falsehoods (see the draft indicators publication from 2019).

Therefore, companies that enable advertisers and other third parties to target their users with tailored ads or content should have clear policies describing their ad targeting rules. Companies should clearly disclose whether they enable third parties to target their users with tailored ads or other types of (non-paid) content (Element 1), and they should clearly disclose which targeting parameters—such as certain types of audience categories based on age, location, or other user characteristics—are not permitted (Element 2). Companies should also disclose their processes for identifying breaches of their targeting rules (Element 5).

Potential sources:

  • Company advertiser portal, ad policy, political ad policy
  • Company acceptable use policy
  • Company support, help center, or advertiser FAQ