The term “dark pattern,” coined in 2010 to describe deceptive user interfaces, has been gaining traction over the past several years. The term suggests an effort to trick, deceive, mislead and otherwise hoodwink the average consumer. These are not the kinds of activities in which most advertisers think they are engaging when creating a campaign. Instead, brands’ goal is to create a meaningful and valuable connection with consumers. Regulators, however, seem to think differently.
What are regulators so worried about? Advertisers designing ad campaigns that involve collecting consumer information should take heed. There is an increasing fear that marketers, in their zeal to cut through the noise and reach consumers, are defaulting consumers into connections they might not otherwise have agreed to.
In particular, regulators are worried that privacy choices presented to consumers are unduly confusing. As a result, they worry, consumers will part with more information than they otherwise would. Or that advertisers will “trick” consumers into agreeing to uses of their information that they would not have accepted had they truly understood how it would be used. In other words, engaging in a dark pattern or creating “noise” that impairs or confuses judgment, as described by Cass Sunstein, Daniel Kahneman, and Olivier Sibony in “Noise: A Flaw in Human Judgment.”
Regulators focused on dark patterns include the Federal Trade Commission (FTC) and the European Data Protection Board (EDPB). By way of example, the federal Consumer Financial Protection Bureau recently sued Active Network, the provider of an online payment portal, for enrolling consumers into annual subscriptions through “dark pattern” techniques. Of particular concern for companies is that some of the activities that have been labeled dark patterns are not that unusual.
How can you avoid having your next interactive advertising campaign labeled a dark pattern? The following recommendations come from guidance released by the FTC and the EDPB, individual states’ negative option laws, the forthcoming California Age-Appropriate Design Code Act (and the similar UK law), California’s CCPA, the EU’s GDPR and the forthcoming EU Digital Services Act. Common themes run through these laws and guidelines and are worth keeping in mind when collecting information from consumers:
Don’t distort a consumer’s ability to make informed choices
Examples include a “non-neutral” choice mechanism, such as making it easier for an individual to give consent than to withhold it (clicking “agree” just to make a pop-up disappear). Another mechanism of concern is spreading choices over multiple pages or screens, where the consumer gets lost.
The FTC and California are also concerned about ambiguous toggle buttons. The European Union, for its part, has flagged a variety of activities it sees as dark patterns, including “overloading” (overwhelming users with options), “skipping” (designing the experience so users forget about privacy), “hindering” (obstructing users from getting the full information) and “fickle” (an unclear, inconsistent interface).
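As a rough illustration only, and not drawn from any regulator’s sample code, the sketch below shows one way a “neutral” consent prompt might look in plain TypeScript and DOM calls: both choices are built identically, carry the same styling, and take the same single click. The function name, class name and label text are hypothetical.

```typescript
// A minimal, hypothetical sketch of a "neutral" consent prompt:
// both choices are rendered identically and neither is pre-selected.
type ConsentChoice = "accepted" | "declined";

function renderConsentPrompt(
  container: HTMLElement,
  onChoice: (choice: ConsentChoice) => void
): void {
  const prompt = document.createElement("div");

  const message = document.createElement("p");
  message.textContent =
    "May we use your email address to send you marketing offers? " +
    "You can change this choice at any time in your account settings.";
  prompt.appendChild(message);

  // Both buttons are created the same way: same element, same class,
  // same prominence. Declining takes exactly as many clicks as accepting.
  const makeButton = (label: string, choice: ConsentChoice): HTMLButtonElement => {
    const button = document.createElement("button");
    button.textContent = label;
    button.className = "consent-button"; // identical styling for both paths
    button.addEventListener("click", () => onChoice(choice));
    return button;
  };

  prompt.appendChild(makeButton("Yes, send me offers", "accepted"));
  prompt.appendChild(makeButton("No, do not send offers", "declined"));

  container.appendChild(prompt);
}
```

The point of the sketch is the symmetry: if declining required extra screens or a hunt for a buried link, that asymmetry is what regulators describe as a non-neutral choice mechanism.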
Examine an interface from a consumer’s perspective
This recommendation comes from the FTC and suggests the agency expects companies to research the impact an interface has on users. In other words, going beyond “the people on our team think this makes sense” to “we have done testing and have research to substantiate that this interface makes sense.” This work will be familiar to advertisers who spend time on claims substantiation.
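Purely as an illustration of how that substantiation could be made concrete, the hedged sketch below treats one property of an interface, how many steps it takes to accept versus decline, as something an internal review could measure and flag. The flow model, names and numbers are invented for the example.

```typescript
// Hypothetical model of a consent flow, used to flag interfaces where
// opting out is materially harder than opting in.
interface ConsentFlow {
  name: string;
  stepsToAccept: number;  // clicks or screens needed to say yes
  stepsToDecline: number; // clicks or screens needed to say no
}

function checkFlowSymmetry(flow: ConsentFlow): string[] {
  const findings: string[] = [];
  if (flow.stepsToDecline > flow.stepsToAccept) {
    findings.push(
      `${flow.name}: declining takes ${flow.stepsToDecline} steps, ` +
        `accepting only ${flow.stepsToAccept}`
    );
  }
  return findings;
}

// Example: a flow where "no" is buried two screens deeper than "yes".
const issues = checkFlowSymmetry({
  name: "newsletter-signup",
  stepsToAccept: 1,
  stepsToDecline: 3,
});
console.log(issues); // surfaces the asymmetry for the design team to review
```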
Get express consent for information uses
This consent should be informed and should not rely on a pre-checked box. It is a particular concern for regulators when the request is to use information in a way that is not clearly related to the purpose for which it is being collected, or that might come as a surprise to the consumer.
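A minimal sketch, assuming a simple sign-up form built with TypeScript and the DOM, of what “express, not pre-checked” might look like in practice: the secondary use gets its own checkbox, the checkbox starts unchecked, and consent is recorded only if the user checks it. The field names and wording are hypothetical.

```typescript
// Hypothetical sketch: a secondary, less expected use of data (sharing
// with marketing partners) gets its own checkbox, unchecked by default,
// and consent is recorded only if the user actively checks it.
interface SignupSubmission {
  email: string;
  partnerSharingConsent: boolean; // must reflect an explicit user action
}

function buildSignupForm(
  container: HTMLElement,
  onSubmit: (submission: SignupSubmission) => void
): void {
  const email = document.createElement("input");
  email.type = "email";
  email.placeholder = "Email address";

  const consentBox = document.createElement("input");
  consentBox.type = "checkbox";
  consentBox.checked = false; // never pre-checked

  const consentLabel = document.createElement("label");
  consentLabel.append(
    consentBox,
    " Share my email with marketing partners so they can contact me."
  );

  const submit = document.createElement("button");
  submit.textContent = "Sign up";
  submit.addEventListener("click", () => {
    onSubmit({
      email: email.value,
      // Recorded as true only when the user checked the box themselves.
      partnerSharingConsent: consentBox.checked,
    });
  });

  container.append(email, consentLabel, submit);
}
```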
Don’t default to using information in unusual ways or collecting more than is reasonable
This is a concern for the FTC. It is also prohibited under California’s upcoming Age-Appropriate Design Code Act with respect to children, a prohibition mirrored at the federal level in the Children’s Online Privacy Protection Act.
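As one hedged way to picture “privacy by default” in code (the settings object and field names here are invented, not taken from any statute or SDK), a campaign page might keep optional collection and secondary uses switched off unless the user explicitly turns them on:

```typescript
// Hypothetical default settings for a campaign landing page: optional
// collection and secondary uses are OFF unless the user turns them on.
interface DataCollectionSettings {
  collectEmail: boolean;           // needed to deliver the offer itself
  collectPreciseLocation: boolean; // not needed for the core experience
  useDataForAdTargeting: boolean;  // a secondary, less expected use
}

const DEFAULT_SETTINGS: DataCollectionSettings = {
  collectEmail: true,            // collected because the offer requires it
  collectPreciseLocation: false, // off by default: more than is reasonable here
  useDataForAdTargeting: false,  // off by default: needs an explicit opt-in
};

// Any departure from the defaults should come from an explicit user choice,
// not from the page quietly flipping the values.
function applyUserChoices(
  overrides: Partial<DataCollectionSettings>
): DataCollectionSettings {
  return { ...DEFAULT_SETTINGS, ...overrides };
}
```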
Making direct and meaningful connections is the holy grail for marketers, but those connections shouldn’t be built by encouraging flawed or impaired judgments. In fact, regulators want companies to have supporting evidence, akin to claims substantiation, for the fairness and transparency of their user interfaces.
"interface" - Google News
March 28, 2023 at 04:30PM
https://ift.tt/jWzikJ8
Dark patterns—how brands can avoid deceptive user interfaces in ad campaigns - AdAge.com
"interface" - Google News
https://ift.tt/0fgS56q
https://ift.tt/CV6ivAw
Bagikan Berita Ini
0 Response to "Dark patterns—how brands can avoid deceptive user interfaces in ad campaigns - AdAge.com"
Post a Comment