The design community has long described "dark patterns" as interfaces designed with the intent to manipulate users. These are choice architectures, interface designs, and default settings that nudge users in a preferred direction. For example: How hidden is the unsubscribe function for a pricey app? Or picture countdown timers that create a sense of urgency to pressure shoppers to check out. Princeton researchers systematically scanned 11,000 shopping websites and found instances of at least 15 different types of dark patterns on 11% of the scanned sites. The same study identified at least 22 third-party providers offering turnkey dark-pattern plugins that e-commerce clients can add to their websites.
Policymakers are now picking up the "dark pattern" language. As Wired points out, the California Privacy Rights Act (CPRA), which passed in November, includes language stating that consumer "agreement obtained through use of dark patterns does not constitute consent." The language first made its way into a proposed congressional bill in 2019—the Deceptive Experiences To Online Users Reduction (DETOUR) Act. Now other states' proposed privacy legislation is borrowing the language as well.
There's work left to be done to define what constitutes a dark pattern before industry practices will change. California's new Privacy Protection Agency is ironing out the details, fleshing out the CPRA's definition of dark patterns as designs that have the effect of "subverting or impairing user autonomy, decision-making, or choice." Clarity on what counts as a dark pattern will be essential for tech firms seeking to comply and future-proof their practices. Legislation like the CPRA could have a major impact on the now-ubiquitous "confirmshaming" pop-ups and obstructive, hard-to-cancel flows that tech's designers and engineers have leaned on for so long. It will certainly help shape how meaningful consent is secured from users going forward.