WASHINGTON — As Congress wrestles with how to address deceptive Internet practices, one Nebraska law professor is urging it to tread carefully.
At issue are the many ways in which people can be tricked online today, whether that involves videos altered realistically to smear public figures or manipulative user agreements designed so that nobody would ever read them.
Such practices are often lumped together under the broad term “dark patterns.”
But the phrase itself is somewhat manipulative, and lawmakers should be wary of stifling legitimate, productive uses of new technology, according to Gus Hurwitz, an associate professor at the University of Nebraska College of Law and director of a new center that will study the issue. Hurwitz testified Wednesday during a House Energy and Commerce subcommittee hearing.
“Dark patterns are something that this committee absolutely should be concerned about, but this committee should also approach the topic with great caution,” Hurwitz testified. “Design is powerful, but it is incredibly difficult to do well. Efforts to regulate bad uses of design could easily harm efforts to do, and use design for, good.”
Many of the worst examples of dark patterns fall into categories where the Federal Trade Commission can already act, he said.
Hurwitz suggested that the agency could be more aggressive in exploring its existing authorities, then report to Congress on the results. Still, there’s clearly an appetite for more direct legislative action among those on both sides of the aisle.
For example, Sen. Deb Fischer, R-Neb., joined with Sen. Mark Warner, D-Va., to introduce the Deceptive Experiences To Online Users Reduction Act.
That legislation is aimed at cracking down on dark patterns and would prohibit platforms with more than 100 million active monthly users from relying on interfaces that intentionally impair a user’s decision-making.
The bill gained a couple of additional bipartisan backers this week, which Fischer has pointed to as a sign the bill is picking up steam.
“Nearly every time Americans use a new app on our smartphones or browse social media from our laptops, we run into dark patterns,” Fischer said in a statement. “These unethical tricks online platforms use as they battle to capture attention and manipulate users must be stopped.”
In an interview, Hurwitz described Fischer’s bill as an important first step that could encourage the industry to police itself. But he also said it illustrates the challenge of determining just what counts as a “dark pattern” and how to shut such practices down.
“It’s hard to do that without banning the light patterns,” he said.
Wednesday’s hearing returned repeatedly to an analogy between common online marketing tactics and those used in the real world, as when brick-and-mortar stores arrange their aisles to keep customers in the building longer or place certain impulse items at the checkout registers.
Those pushing for more regulation say that comparison understates the extent of the problem, given that the online world is becoming the very infrastructure through which many people view reality.
Tristan Harris, executive director of the Center for Humane Technology, testified alongside Hurwitz and pointed to the video “autoplay” function of sites such as YouTube. A platform’s algorithms can continue playing video after video, each tending toward more extreme viewpoints, ultimately spreading misinformation and conspiracy theories.
Hurwitz responded that such a framing assumes the tool in question will always be used nefariously. What happens if the same function were used to push the user toward more enlightening videos?
“If we say autoplay is bad, then we’re taking both of those options off the table,” Hurwitz said. “This can be used for good.”