
User Experience Center


Dark Patterns: Identifying Dark Patterns (Part 2)

by Shaniah Tullis

Where do “design insights” end and “dark patterns” begin? This is part 2 of a two-part series on dark patterns. In part 1, Soy Mamo introduced dark patterns. In part 2, I will cover the different types of dark patterns and how they relate to design insights.

Design insights refer to breakthroughs in UX research that reveal behavioral patterns and drive bold design decisions. Dark patterns, by contrast, refer to methods used to persuade users into doing something they may not want to do, for the benefit of a company’s sales and views. For example, a company observing that the majority of its customers opt for $19.99 express shipping over $4.99 standard shipping near the Christmas season is a behavioral insight. That insight becomes a dark pattern when the company begins to automatically charge customers for express shipping without their consent.

It is our duty as UX researchers and designers to keep the well-being of our global community in mind while designing all forms of products and services. There are cases of unethical design in practically every product domain, and as UX designers and researchers we are responsible for how we influence user behavior. Design ethics is the practice of making conscious design decisions that cause no harm to anyone, intentionally or unintentionally. Below are several forms of dark patterns with real-world examples.


Types of Dark Patterns 

Sneak into the basket  

‘Sneak into the basket’ is when a website adds items to users’ carts without their knowledge or consent. This is typically done through an opt-out radio button or checkbox on an earlier page of the user’s purchasing journey. The following example is based on a real website.

The website offers customers a special “buy 3 and save 70%” bundle priced at $20. If the user selects the deal, the next page notifies the user of an automatic additional charge of $7.99 for a 1-year privacy protection plan. Unknown to the user, the privacy plan actually costs quadruple that amount ($7.99 x 4), because each item in the bundle requires a separate privacy plan. In addition, the website switches the user to a 2-year privacy protection plan, even though the prior page stated that the plan lasted only one year. As a result, the user ends up paying $51.96 for a bundle the website presented as a $20 deal. This type of dark pattern is extremely misleading and can cause users to purchase products and services they do not want, at a price they did not intend to pay.
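The arithmetic behind this example can be sketched in a few lines. The prices and the x4 plan multiplier below are taken directly from the scenario described above; integer cents are used to avoid floating-point rounding issues when handling money.

```python
# Sketch of the 'sneak into the basket' arithmetic from the example above.
# All figures come from the scenario as described in the text.

BUNDLE_CENTS = 2000   # advertised "$20" bundle price
PLAN_CENTS = 799      # privacy plan charge disclosed to the user ($7.99)
PLAN_MULTIPLIER = 4   # the plan is actually charged four times ($7.99 x 4)

def total_charged_cents(bundle=BUNDLE_CENTS, plan=PLAN_CENTS, times=PLAN_MULTIPLIER):
    """Total the user actually pays, versus the advertised bundle price."""
    return bundle + plan * times

advertised = BUNDLE_CENTS / 100
actual = total_charged_cents() / 100
print(f"Advertised: ${advertised:.2f}, actually charged: ${actual:.2f}")
# Advertised: $20.00, actually charged: $51.96
```

The gap between the advertised $20.00 and the charged $51.96 is precisely what makes the pattern deceptive: each individual charge is technically disclosed somewhere, but the cumulative total never is.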


Roach Motel 

The roach motel refers to subscriptions or situations that are easy to get into but hard to get out of or cancel. Gym memberships are a classic example: at some gyms and fitness centers, such as Planet Fitness, purchasing a membership is very easy and can be done over the phone, online, or in the mobile app. To cancel, however, customers must go in person to their home club location. This is a common business tactic, since businesses know that the likelihood of customers coming in person to cancel is low. Most customers would rather continue to pay their monthly fee of $15 than add a trip to the gym to their busy schedules.


Privacy Zuckerberg  

Privacy Zuckerberg, named after Facebook CEO Mark Zuckerberg, refers to users being tricked into sharing more of their personal data than they intended to. In the past, Facebook made it practically impossible for users to control the amount or type of their data that Facebook shared, by not providing links to any data privacy settings. This resulted in the emergence of more targeted advertisements on and off the platform. In response to negative feedback from privacy groups and consumers, Facebook’s privacy settings are now more user-friendly and easier to manage.

Now more than ever, Privacy Zuckerberg happens behind the scenes through the data brokerage industry. Buried in the terms and conditions of purchases and services is language by which users agree to their data being sold. Once that data is on the market, data brokers buy it and combine it with other information they have gathered on the user to create a profile. The profile can include sensitive information such as the user’s mental and physical health status as well as their sexual preferences. It is predicted that in the future such profiles may affect users’ ability to take out loans or apply for insurance. Unfortunately, this industry is not heavily regulated, and it is almost impossible for users to opt out of having their data brokered. However, many platforms now provide users with a simple, intuitive pop-up notification to opt out of tracking when they install an app.




Confirmshaming

Confirmshaming is a pattern used to shame customers out of opting out of something. It is commonly seen where companies want users to join their mailing lists or participate in promotions. In the screenshot below, a fashion brand confirmshames users to discourage them from ignoring its sale promotion in an attempt to generate more sales. Notice how the ‘Get the Offer’ button is large and outlined in red, while the negative call-to-action is a borderless link beneath it stating ‘I don’t want my mystery offer’. Since most users don’t want to admit to not wanting a potential coupon, many will select ‘Get the Offer’ regardless of their intentions before visiting the website. It is widely known that coupons with expiration dates increase customers’ willingness to buy more products, faster. Making customers aware that a coupon has automatically been added to their cart also eases the flow of entering the promo code at checkout, increasing the chances they purchase something from the website.



How to Combat Dark Patterns  

The best tool to combat dark patterns is education. According to Brignull, “If you know what cognitive biases are and the kind of tricks that can be used to change your mind to persuade you to do things, then you're less likely to have them trick you.” The more familiar designers and users are with the various forms of dark patterns, the better they will be able to identify the tricks embedded in an interface or product and avoid them. As designers in the UX community, it is our responsibility to be aware of dark patterns and to push back on requests that require us to use them. That awareness also empowers us to promptly call out companies that take advantage of their users. Companies do not want bad attention in the media, so calling out their actions publicly is often the most effective way to promote change in their interfaces and products.




Shaniah Tullis
Research Associate

Shaniah Tullis is a Research Associate at the User Experience Center. Prior to joining the UXC, she interned as a Research and Data Analysis Fellow for The Center for Black Innovation, where she helped execute a landscape analysis project examining Black support organizations. Previously, Shaniah assisted in research and development projects centered on rehabilitation sensors.

Shaniah holds a Bachelor of Science in Mathematics and a Bachelor of Science in Computer Science with a minor in Visual Arts from Lincoln University of Pennsylvania. She is currently pursuing a Master of Science in Human Factors in Information Design from Bentley University.