Navigating the murky waters of dark patterns

It's important for testers not only to recognize dark patterns, but also to feel empowered to raise concerns about the risks these patterns can create.

You may have seen my post on LinkedIn or another social networking site asking for help. For those who didn’t see it: I was trying to get a fake Instagram account that was impersonating me taken down. The fake account was stealing my photos and stories, and reaching out to people demanding payment for services. Eventually, after I and others reported the account, it was deleted. Interestingly, even though the account was taken down, Instagram responded to every report with a message saying the account did not violate its terms of service and was allowed to remain.

This experience has me thinking about dark patterns, sometimes called anti-patterns. Dark patterns are UI or interaction designs that intentionally deceive or manipulate users. While often profitable and successful in the short term, dark patterns have the potential to jeopardize the safety of the business and users over time. 

As testers, we have a responsibility to our users and organizations to call out these risks when we encounter them.

What are dark patterns?

From July to December 2020, Instagram removed 65,277,800 posts from the platform. Each of these removed posts represents a report made by a user, and not all user reports are valid, which means the number of reports in that six-month period is likely in the hundreds of millions. Needless to say, not every report can be reviewed by a human content reviewer. Some posts will be acted on by automated content review, some will be reviewed by one of Instagram's thousands of human reviewers, and others will go unacknowledged by the platform and eventually close due to the age of the report or because the account was shut down for other reasons. So what do you do if you’re a social media application getting more reports than you can handle? One tactic might be to convince users that it’s not worth their time or effort to report content, thereby lowering the total number of reports that come in.

From my perspective, a system that manipulates users into believing their actions will not result in the desired outcome is a dark pattern, and it isn’t the only one. There are at least 12 recognized dark patterns that intentionally manipulate, trick, or confuse users.

Dark patterns to be aware of

There are several dark patterns, and it’s helpful to become familiar with them. Which patterns you’ll encounter most often depends on your industry and the context of your software. Below are some of the more common ones.

Trick question: The wording of a question is intentionally misleading or ambiguous. It may use double negatives or words with multiple or similar meanings to confuse users. For example, a dialog shown when canceling a subscription service might ask, “Would you like to continue or cancel?”

Privacy Zuckering: An application uses deceptive tactics to lure users into sharing more information than they are comfortable sharing. This information may be used by the application itself or shared with data brokers. This is a more recent addition to the list of dark patterns and was central to the FTC’s findings in its claims against the mental health service BetterHelp.

Forced continuity: An application requires users to provide a credit card number upfront to start a free trial, then automatically renews and charges them unless they cancel before the renewal date.

Forced enrollment: A common ecommerce tactic that requires users to register in order to view the content of a site.
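Some of these patterns can even be caught in automated checks. As a rough sketch (the function and the list of ambiguous label pairs below are hypothetical, not taken from any real tool), a test could flag dialog copy where the button labels don’t map clearly to outcomes, like the “continue or cancel” trick question above:

```python
# Hypothetical lint for ambiguous dialog copy in cancellation flows.
# Flags button pairs whose labels don't clearly map to outcomes,
# and questions that pile up negations ("Don't you not want...?").

AMBIGUOUS_PAIRS = {
    frozenset({"continue", "cancel"}),  # which one cancels the subscription?
    frozenset({"yes", "no"}),           # meaningless when the question is a double negative
}

def is_ambiguous(question: str, buttons: list[str]) -> bool:
    labels = frozenset(label.strip().lower() for label in buttons)
    if labels in AMBIGUOUS_PAIRS:
        return True
    # Double negatives also obscure which button does what.
    negations = sum(question.lower().count(word) for word in ("not ", "n't "))
    return negations >= 2

# The trick-question dialog from the example above gets flagged...
assert is_ambiguous("Would you like to continue or cancel?", ["Continue", "Cancel"])
# ...while explicit labels that name the outcome pass.
assert not is_ambiguous("Cancel your subscription?", ["Keep subscription", "Cancel subscription"])
```

A real check would need a richer vocabulary and context about the flow being tested, but even a crude rule like this gives testers a concrete artifact to point at when raising the concern.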

Why testers should care about dark patterns

UX and interaction design can often feel like decisions made in isolation and removed from our purview as testers. This may make dark patterns feel like something that we as testers don’t need to concern ourselves with, but this couldn’t be further from the truth. There are two primary reasons testers should care about dark patterns and the experience of using our software.

It’s the right thing to do

Doing the right thing should be paramount in our testing decision-making, and testers need to advocate for doing the right thing for both the user and the business. Utilizing dark patterns can produce a short-term profit gain, but in the long run it will likely cost more than was gained, and over time users will grow to distrust the business and its software. In the months following the Facebook–Cambridge Analytica scandal, Facebook saw a 20% drop in actions taken by users on the site. Regardless of the revenue generated by Privacy Zuckering, the reputation loss, government investigations, and increased distrust among its user base have cost Facebook in significant ways.

Dark patterns are a risk 

Testers are responsible for identifying and helping the team mitigate risk, and dark patterns represent significant risks to both the business and users. Users can be exposed to exploitative uses of their personal information or become trapped in paid subscriptions that are nearly impossible to cancel. Personally, I’ve found myself on the receiving end of more than a few subscriptions that auto-renew and require complicated processes to cancel. Or worse, subscriptions I didn’t know existed that billed me for years.

The business can be even more at risk than users when dark patterns are used. Since 1999, at least 87 unique regulations have been created across the globe to safeguard users from deceptive and harmful practices. These range from targeted rules like the CAN-SPAM Act in the US, which bans deceptive email practices, to the broad, sweeping rules contained in the GDPR. Violating these regulations can result in costly penalties and open organizations up to government intervention.

What testers can do 

The most important thing testers can do about dark patterns is speak up when they feel something is hostile or harmful to users. While there are defined patterns listed in various taxonomies, anything that’s hostile to a user can be considered a dark pattern. So if you see something, say something!

Additional Resources 

If you’d like to learn more about UX dark patterns, here are some resources to get you started.
