
FTC's case against Vizio illuminates terrible tech industry habit

Vizio paid a $2.2 million settlement. Ethan Miller/Getty Images

There is such a thing as a user interface so bad the government needs to put a stop to it. And Vizio has a $2.2 million settlement with the Federal Trade Commission and New Jersey to prove it.

The electronics giant paid up to settle allegations that it installed software on its smart TVs to track what people watched without getting their consent and with only vague disclosure. The FTC alleged that was an “unfair and deceptive” practice, one of the offenses the agency is charged with stopping.

In doing that, the commission has provided an excellent lesson in how one of the tech industry’s worst habits — what’s called a “dark pattern” interface — can go wrong.

Tricks of the trade

London-based user-experience designer Harry Brignull coined the term “dark pattern” in 2010 for designs “crafted with great attention to detail, and a solid understanding of human psychology,” that steer people away from a sensible choice.

That captures the experience described in the joint complaint by the FTC, New Jersey Attorney General Christopher S. Porrino and Steven C. Lee, director of the New Jersey Division of Consumer Affairs.

After Vizio remotely installed its “automated content recognition” software on consumers’ TVs, their screens showed this notice: “The VIZIO Privacy Policy has changed. Smart Interactivity has been enabled on your TV, but you may disable it in the settings menu. See www.vizio.com/privacy for more details. This message will time out in 1 minute.”

After that, Vizio’s software would transmit a sample of what was on the screen every second to the company’s servers, according to the complaint. The content was then matched with a database of TV content and indexed against viewers’ IP addresses to let other companies see, for example, who visited their website after seeing the company’s ad on TV. Companies could also mash up those IP addresses with third-party demographic data on household age, sex, education and income, among other details.
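The pipeline the complaint describes — fingerprint a screen sample, match it against a content database, then key the match to the viewer’s IP address — can be sketched in a few lines. Everything below (the hash-based fingerprint, the in-memory database, the field names) is a hypothetical illustration of the general technique, not Vizio’s actual implementation; real ACR systems use perceptual fingerprints that survive compression and scaling, which a plain hash only stands in for.

```python
import hashlib

def fingerprint(frame_bytes: bytes) -> str:
    """Stand-in fingerprint: hash the raw bytes of a frame sample."""
    return hashlib.sha256(frame_bytes).hexdigest()

# Hypothetical content database mapping fingerprints to known programming.
CONTENT_DB = {
    fingerprint(b"frame-from-episode-42"): "Cooking Show S01E42",
}

def match_viewing(frame_bytes: bytes, viewer_ip: str):
    """Match a screen sample to known content and index it by IP address."""
    title = CONTENT_DB.get(fingerprint(frame_bytes))
    if title is None:
        return None
    # The IP address is the join key that lets third parties link viewing
    # records to demographic data -- no name required.
    return {"ip": viewer_ip, "content": title}

record = match_viewing(b"frame-from-episode-42", "203.0.113.7")
```

The point the sketch makes is the one the FTC made: once each second-by-second sample is keyed to an IP address, joining it with outside demographic data is trivial.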

One of the few data points left out of this exercise: viewers’ names. But when you know this much about somebody, the only moniker necessary is “target.”

Undocumented data migration

People who bothered to check a Vizio TV’s settings menu, the complaint says, saw only this bland description of Smart Interactivity: “Enables program offers and suggestions.”

The complaint alleges that last March, after the investigation had begun, Vizio sent a second onscreen notification disclosing its vacuuming of viewership data — which vanished after 30 seconds absent interaction with it.

Vizio manuals were equally uninformative, and as of Thursday morning the company’s Smart Interactivity support webpage barely got beyond this airbrushed description of potential benefits: “bonus features related to the content you are viewing, the ability to vote in polls, or advertisements that match your interests.”

Vizio representatives did not say if viewers had received any of those rewards.

By Thursday afternoon, that page had been rewritten to lead off with a lengthy recap of the data Vizio reels in. It also explains how to disable the feature: Press the remote’s “Menu” button, select the “System” category, choose “Reset & Admin,” select “Smart Interactivity,” and press the right-arrow button to disable that.

Beyond the monetary payment, Vizio’s settlement with the FTC and the Jersey lawmen commits the company, which agreed last summer to be bought by the Chinese firm LeEco for $2 billion, to destroy all of the viewing data it harvested before March 1, 2016 and to get customer permission before collecting any more.

In a press release, Vizio general counsel Jerry Huang said the settlement “set a new standard” by making the collection of viewing data an opt-in proposition.

“Today, the FTC has made clear that all smart TV makers should get people’s consent before collecting and sharing television viewing information and VIZIO now is leading the way,” Huang said.

You’ve seen this movie before

“This clearly is a dark pattern,” Brignull explained, adding that Vizio users probably wouldn’t have opted into the data harvest had they been given the choice.

“Somewhere along the line someone at Vizio probably realized that changing it to an opt-out model would deliver the results they wanted,” he said. “The end result was something totally manipulative.”

There are, unfortunately, numerous precedents of companies constructing interfaces to herd you into acting against your interests, many of which Brignull outlines on his site Dark Patterns.

Some of the more egregious dark patterns come from the tech industry. Brignull noted how Microsoft (MSFT) resorted to a deceptive dialog box to push its free Windows 10 upgrade. The confusing array of fees Comcast (CMCSA) applies when you plug in a second TV, which also erode the value of its new, “free” Roku app, should also qualify as a dark pattern.

But once you start looking for these shady interfaces, they keep showing up — especially in financial matters. I saw a particularly egregious example last November, when an ATM in Lisbon badgered me to have a cash withdrawal priced in dollars instead of euros. That “dynamic currency conversion” would have cost an extra 10% fee.
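The arithmetic behind that ATM’s pitch is worth spelling out: dynamic currency conversion prices the withdrawal in your home currency at a marked-up exchange rate. The rates and amounts below are illustrative assumptions, not the actual figures from that Lisbon transaction, using the roughly 10% markup described above.

```python
def dcc_cost(amount_eur: float, bank_rate: float, markup: float = 0.10):
    """Return (USD cost at the bank's rate, USD cost with the DCC markup)."""
    fair = amount_eur * bank_rate          # what your own bank would charge
    dcc = fair * (1 + markup)              # the ATM's "convenient" dollar price
    return fair, dcc

# Withdraw 100 EUR at an assumed bank rate of 1.07 USD/EUR.
fair, dcc = dcc_cost(100.0, 1.07)
# The DCC price comes out about 10% higher -- roughly $10.70 extra here.
```

Declining the conversion and letting your own bank settle the charge in euros avoids the markup entirely.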

Companies playing these games with duplicitous defaults should remember that customers have memories too.

“Thankfully, the vast majority of the opt-ins these days are in consumers’ best interests,” said Dan Ariely, a professor at Duke University’s Fuqua School of Business and an expert in consumer psychology.

But the payback for abusing consumers’ trust with fishy “opt out” policies should be obvious: “People will start suspecting all the defaults.”

Email Rob at rob@robpegoraro.com; follow him on Twitter at @robpegoraro.