
Column: ‘Dark patterns’ are steering many internet users into making bad decisions

A European study last year found that Facebook and Google are masters of steering people into making choices that aren’t in their best interest.

Even if you’ve never heard the phrase “dark patterns,” you’re almost certainly familiar with them. They’re the sneaky ways online companies trick you into agreeing to stuff you’d normally never assent to.

Classic example: You encounter a prompt asking if you want to sign up for some program or service, and the box is already checked. If you don’t uncheck it — that is, if you do nothing — you’re enrolled.
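
To make the mechanic concrete, here is a minimal sketch in TypeScript of a pre-checked opt-in box like the one described above. The form copy, element names and enrollment program are hypothetical, invented for illustration rather than taken from any particular site.

```typescript
// Hypothetical sign-up prompt: the checkbox arrives already checked,
// so a user who does nothing is enrolled by default (opt-out, not opt-in).
const optIn = document.createElement("input");
optIn.type = "checkbox";
optIn.id = "promo-enroll";
optIn.checked = true; // the dark pattern: consent is pre-supplied

const label = document.createElement("label");
label.htmlFor = optIn.id;
label.textContent = "Yes, enroll me in partner offers and promotional email";

document.body.append(optIn, label);
```

A more respectful version of the same form would simply start with the box unchecked, leaving the affirmative choice to the user.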

A bipartisan bill has been introduced in Congress that would prohibit websites and online platforms (hi, Facebook!) from employing such deliberately deceptive tactics, and would empower the Federal Trade Commission to crack down on sites that keep trying to fool people.


The Deceptive Experiences to Online Users Reduction Act (a.k.a. the DETOUR Act) is the brainchild of Sens. Mark Warner (D-Va.) and Deb Fischer (R-Neb.). They’re hoping the legislation will be included as part of sweeping privacy regulations now under consideration in the Senate Commerce Committee.

Warner and Fischer will be hosting tech and privacy experts on Tuesday for a Capitol Hill seminar on the various ways consumers can be hoodwinked online.

“For years, social media platforms have been relying on all sorts of tricks and tools to convince users to hand over their personal data without really understanding what exactly it is that they’re handing over,” Warner told me.


He says website developers aren’t stupid. They closely study behavioral psychology to understand how internet users can be most easily misled.

“Our bill is pretty simple,” Warner said. “We just want consumers to be able to make more informed choices about how and when to share their personal information.”

Fischer told me separately that there needed to be far greater transparency surrounding the click of an “OK” button.


“These manipulative user interfaces intentionally limit understanding and undermine consumer choice,” she said. “Any privacy policy involving consent is weakened by the presence of dark patterns.”

This sort of deception is one of those things most internet users probably are aware of but likely don’t give much thought to. Many of us just take for granted that websites are trying to separate us from our personal info.

But this perpetual siege on our privacy doesn’t have to be the default setting. There’s no reason for consumers to simply accept that just because online businesses are desperate for our data, there’s nothing we can do about it.

The fact that there’s a term of art for these practices — “dark patterns” — tells us that increasingly sophisticated methods are being employed.

A European study last year found that Facebook and Google in particular had become masters of steering people into making choices that weren’t in their best interests.

“The combination of privacy-intrusive defaults and the use of dark patterns nudge users of Facebook and Google, and to a lesser degree Windows 10, toward the least privacy-friendly options to a degree that we consider unethical,” the study’s authors said.


They found that the companies used misleading wording, take-it-or-leave-it choices and hidden privacy options to compel users to reveal as much about themselves as possible.

“When digital services employ dark patterns to nudge users toward sharing more personal data, the financial incentive has taken precedence over respecting users’ right to choose,” the researchers concluded.

In response to the study, eight U.S. consumer advocacy groups, including Santa Monica’s Consumer Watchdog, called on the FTC to investigate use of dark patterns by internet companies.

“The entire online ecosystem, at least the commercial ecosystem, manipulates users into doing what companies want them to,” said Carmen Balber, executive director of Consumer Watchdog.

“Every internet site on some level drives people where companies want them to go,” she said. “But it’s one thing to try to drive clicks to your shoe ad — people expect that. They don’t expect a site to intentionally misdirect them.”

She observed that federal authorities decades ago clamped down on subliminal advertising on TV — the planting of messages that could make an impression on consumers even though they appeared too quickly to be consciously noted. Authorities, however, have yet to acknowledge the similar effect of dark patterns.


“It’s time online users were protected from internet companies’ intentionally deceptive designs,” Balber said. “Rules of the road for privacy design are overdue.”

The DETOUR Act would provide such rules.

It would make it illegal “to design, modify or manipulate a user interface with the purpose or substantial effect of obscuring, subverting or impairing user autonomy, decision-making or choice to obtain consent or user data.”

The bill would introduce more transparency to the online experience by requiring sites to disclose “any form of behavioral or psychological research” and “any experiments” they employ to manipulate user behavior.

It also would create an independent review board to oversee “any behavioral or psychological research, of any purpose, conducted on users.”

Language like that would seem far-fetched if this wasn’t really happening, as the European study illustrated. It said many of the techniques now being employed online are based on “the fields of behavioral economy and psychology.”

In just one example, the study showed how Facebook steered users into accepting the company’s desired privacy settings by making the “accept” button an appealing bright blue, while the option for changing those settings was a dull gray.


“The option that the service provider wants users to choose was deliberately made more eye-catching,” researchers said. Moreover, “users that were in a rush to use Facebook were inclined to simply click the blue button and be done with the process, which results in the maximum amount of data collection and use.”
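
As an illustration of the kind of visual nudge the researchers describe, here is a small hypothetical sketch in TypeScript of a consent dialog in which the data-collecting choice is styled to dominate the screen. The colors, labels and layout are invented for illustration, not taken from Facebook’s actual interface.

```typescript
// Hypothetical consent dialog: the privacy-intrusive choice is a large,
// bright blue button, while the privacy-protective one is a muted gray link.
const accept = document.createElement("button");
accept.textContent = "Accept and continue";
accept.style.cssText =
  "background:#1a73e8; color:#fff; font-size:18px; padding:12px 28px; border:0; border-radius:6px;";

const manage = document.createElement("button");
manage.textContent = "Manage data settings";
manage.style.cssText =
  "background:none; color:#8a8d91; font-size:13px; border:0; text-decoration:underline;";

document.body.append(accept, manage);
```

Nothing in that sketch is functionally deceptive; the nudge lives entirely in the styling — the kind of design the DETOUR Act’s “purpose or substantial effect” language is aimed at.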

Dark patterns also can take the form of ads disguised as navigation links, hidden costs that don’t appear until the very last step of a transaction, or free trials that turn into recurring payments that are very difficult to cancel.

Some might say these tactics are too simplistic to pose a threat to tech-savvy consumers. But the same could be said of hiding sexual images in magazine ads and movie posters.

Companies do it because they know, from years of quiet research, that it works.

Dark patterns are real. And they’ll keep being used to influence our online behavior.

Unless we do something about it.

David Lazarus’ column runs Tuesdays and Fridays. He also can be seen daily on KTLA-TV Channel 5 and followed on Twitter @Davidlaz. Send your tips or feedback to [email protected].
