
In the 8 years that have elapsed since this pitch was made at Google, one conclusion is absolutely crystal clear: the industry has as little self-regulatory control over its use of such dark patterns as the public has over avoiding them.

Regulation, taxation, and where necessary, criminalisation, are obvious and necessary.

I'll note that for once, the patterns are not entirely synonymous with massive tech monopolies, though it is the monopolists who have the greatest resources and capability to tune their attention-diverting practices to the absolute maximum degree.

The naivete Tristan reveals here is rather charming, however.



Criminalization of dark patterns?

It’s not impossible to imagine — intentionally tricking a user is sometimes called fraud — but I’m not sure we’ve thought through the implications of criminalizing website design.

The biggest companies are the ones least affected by regulation, because they can invest resources into complying. Small startups sometimes can’t. A teenager in their bedroom usually can’t. And that’s no small loss.


The scale argument against regulation is a frequent one. I'd like to see regulation carry specific mandates for increased responsibility with size. There are examples of this elsewhere, e.g., exemptions from food inspection and licensing regulations for home cooks, even for sales at relatively informal markets such as farmers' markets.

http://www.fma.alabama.gov/HomeProc.aspx

The case of the Internet is an interesting one, as there are numerous instances where a single individual does in fact provide services used widely (Alexandra Elbakyan of Sci-Hub, and Chet Ramey of Bash, come to mind). Defining "scale" and commensurate "responsibility" does require some nuance.

The overregulation trope is a staple of the Libertarian movement, and appears frequently at (and on HN, from) Reason.com, among other sources. One suspects that Koch Industries and ilk would like to be similarly exempted from such onerous requirements as the individual hairdresser they so often performatively defend (often with a gross ignorance of the issues involved in hair care, beauty treatments, and hygiene).


> (often with a gross ignorance of the issues involved in hair care, beauty treatments, and hygiene).

wait.. are you seriously going to defend hairdresser regulations? Care to enlighten me on why they're a net good for society? I always thought they were a punching bag for libertarians because they're such an obvious net negative for society...


Public-facing intimate-touch services carry numerous risks: hygiene, chemical exposure, improper technique, failure to identify potentially hazardous conditions in the client, and more.

I'm not going to argue that all regulations and licensing within the industry are sensible; that's simply never the case. And there is self-serving regulation for all the usual reasons (guilds, education requirements, limiting competition and entry -- all of which you can find in, oh, to pick a random example, the tech sector).

But the case is also not nearly as black and white as Reason would like to have you believe.

More generally, in my view, in any technology, the greatest elevation of knowledge and expertise comes not in applying the basics and in positive effects of the methods and systems involved, but in what I've called hygiene factors: the unintended consequences, side effects, complex interactions, and risks.

The notion of manifest and latent functions was developed by sociologist Robert K. Merton.[1] In discussing these, Merton makes the perceptive observation that because latent functions are not immediately apparent, obvious, or significant, they represent a greater increment of knowledge and understanding than manifest functions, which are obvious, evident, easily understood and communicated, etc.

TL;DR: there are nonevident hygiene and side-effect factors.

________________________________

Notes:

1. https://en.wikipedia.org/wiki/Manifest_and_latent_functions_...


The standard solution is to exempt small organizations.

It's why, for example, you can start a company with minimal accounting. When you hire your first employee, your requirements increase. Then at 50 full-time employees the IRS considers you an Applicable Large Employer, etc.

It would be reasonable to apply basic consumer protection laws to any service with more than, say, a million users. "Any subscription entered into online must be cancellable online," that kind of thing. Laws against dark patterns. Dorm room startups remain totally unencumbered.

The fact that it's 2021 and a lot of very low-hanging-fruit protections like this don't exist is evidence that our legislative apparatus is degraded. Maybe in a future decade we'll have better-functioning institutions--it has happened before--but not today. Lina Khan running the FCC is a hopeful sign.


This is a great point; I forgot entirely about exemptions.

Do you happen to know of any existing regulation-with-exceptions that I can cite, to persuade others (and incidentally myself) that regulation isn't such a bad idea? Ideally something related to FAANG, but any software regulation with reasonable exemptions would be a fine example.

To be honest, I'm worried that politicians would muck things up during the legislative process, and that the right exemptions won't get applied. Or that we'll run into yet another law where you're required to pop up a banner saying "This site uses cookies, just like everyone else, lol." But if it's been done successfully, I'd like to know about it.


> Or that we'll run into yet another law where you're required to pop up a banner saying "This site uses cookies, just like everyone else, lol."

Is there any such law? I always understood that to be a widespread misunderstanding. Just like the way every website pops up a dialog box asking you to approve tracking or click through some complex flow to disapprove. They aren't required to do that. But business people are trying to make regulation seem like a burden on users to discourage legislators from creating new regulations.


Mm, I think as long as you’re not using the cookies for advertising purposes, you don’t have to pop up a banner. But if you’re using any kind of cookie for ads, you do.

GitHub recently published a post called “no cookie for you” or some such, which made that distinction clear. They said they were able to get rid of the popup by not doing any ads.

But still, it feels… wildly unhelpful to inform users that cookies are being used for ads. I’d love to understand the other viewpoint, though.


> Lina Khan running the FCC

I think you mean the FTC.

https://www.ftc.gov/about-ftc/biographies/lina-khan


> Criminalization of dark patterns?

Criminalisation is actually better than regulation. Regulation is about what you do. Criminalisation takes into account what you did and what you intended to do.

One year ago, you followed every website design convention of the time: you made the "Please respect my privacy" button look like the fine print in some terms and conditions, but made the "Please sell all my data in a way that will increase my costs and make you many many dollars" button look like a get-rid-of-this-inconvenience button.

Since then, regulations came into force that banned this behavior. You didn't hear about it, and the regulatory agency fined you. Bummer.

Since then, criminal law changed to ban this behavior. You didn't hear about it, but there's plenty of evidence that you were acting in good faith -- including the fact that you never actually sold the data, because you were too busy attracting paying customers. You're alerted to the issue, immediately ask in good faith what they recommend, and implement it. Your behavior before, during, and after the event all showed a complete lack of intent. So why would you be criminally prosecuted?

It's the businesses with the resources to A/B test barely-known features, yet who mysteriously have no data about this interface that every user encounters at least once, who need to be worried about this. There would be good reason to suspect that they tried to cover up their misdeeds. Even in the absence of positive evidence, it would be evidence of bad faith for a business that makes more money off user data than entire countries make off everything to not consider the laws around user data.

The bigger issue is that it would be a pretty major change of direction.


I don't believe the distinction you make between criminal and regulatory actually exists.


I believe they are confusing intent to perform an act with intent to break a law. In most cases, ignorance of the law is not a criminal defense.


A ton of laws are written to accommodate the resources of the companies/individuals subject to them. You're not subject to the same paperwork if you have 2 employees or 2000.

Now to be honest, I am not sure why startups and teenagers shouldn't be punished if there is proof of willfully crossing the line where users get harmed. Feels to me like arguing that food poisoning shouldn't be criminalized because lemonade stands won't have the means to comply.


Depends on the regulation. Some things should be obvious: don't collect people's contacts and then send them spam without warning. Don't collect a credit card for a free trial and then start charging it without explicit user action once the trial is over.

But you might argue that doesn't cover all or even most dark patterns, and you'd probably be right.


We've made laws against plenty of things in the advertising world.

- In Australia advertising cigarettes is illegal.

- In New Zealand billboards are illegal.

- In Australia there are very strict "truth in advertising laws" and breaking them has big penalties.

If we really care enough about this issue, we could absolutely make laws preventing certain dark patterns on the web. Given it seems likely that all commerce will move online in the next decade, it starts to be more important than those old modes of advertising I mentioned above.


Do you have a cite for the NZ outdoor advertising ban?

Best I can find (see down-thread) is one that applied in Auckland, though not the rest of the country. It's peculiarly little-referenced online, and I suspect has lapsed.


> In New Zealand billboards are illegal.

Yeah right.

Billboards are everywhere, not sure where you got that idea.


There is or was a ban in Auckland, specifically. Few mentions online so I suspect it may have expired or been repealed, but it's mentioned in this 2012 Guardian article:

https://www.theguardian.com/commentisfree/2012/apr/20/ban-ou...

(Generally: respond to the points or assertions, if you have evidence of falsity, provide it. If you find a claim unsupported, request references.)


> New Zealand billboards are illegal.

- yeah right.


Possibly unpopular opinion, but I don't see why it's so important to protect "small startups" from regulation in any situation.

If you serve a million users then those million users will be affected by your mistakes and your unethical business decisions, no matter how many people are on your payroll.

If you don't have the resources to comply, why are you serving a million users in the first place?


The answer is just below the surface: most people prefer laws apply only to people not "like them". It's the old "other people are immoral, but my friends and I are a victim of circumstances" pattern.


Just apply harsher standards the bigger the scale. Billion dollar corporations should be subjected to a lot more scrutiny than teenagers in their bedrooms. These two are not equal and must not be treated equally in matters of law.


Most regulators do not charge teenagers in their bedrooms except for the most egregious criminal acts. If you are a teenager operating a website from your bedroom, you functionally do not need to worry about ADA, GDPR, etc.


That's only true until a prosecutor needs to make a political campaign.

ADA is enforced (abusively) by private claims, not regulators, at least in CA.


I get what you're saying but that's only because teenagers in their bedroom are unlikely to get enough attention, not because they don't have enough influence.

It's totally possible for a bedroom-made startup to grow to the point where the lack of GDPR enforcement could be directly harmful to customers, for example. True, it's unlikely they'd get noticed, but I don't think that's a strong argument here.

I want reasonable widespread application of GDPR, ADA, etc with explicitly carved out safeguards for small operations like this teenager.

They shouldn't have to rely on getting lucky.


Maybe another wave of dark-pattern consent prompts pushing users to click "agree"?


Does the same standard apply to, say, robbery?

Small time crooks who steal $100 go to jail, but corporations who steal $millions go unpunished.


In addition to fraudulent pretenses for data collection, there are practices which are simply predatory or harassment.

Numerous sites present various dialogues on every visit (more so if, as a rule, you use incognito mode, as I do), or treat Tor access very differently from direct access (denying access entirely, raising GDPR blocks, putting various nags in the way, or popping up often-impenetrable CAPTCHA dialogues).

There's the case of sites which mandate javascript use to present any content, even if no interactive elements are used.

There are the repeat nags to use apps and such. Twitter's dialogue offers the options "Switch to the app" and that won't-take-no-for-an-answer-creepy-stalker "Not now". (Disabling JS breaks the site fully, reminding me that I actually want to use whatever Nitter site(s) are not yet rate-limited: https://github.com/zedeus/nitter/wiki/Instances)
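(As an aside, the Twitter-to-Nitter rewrite itself is a one-liner. A minimal bash sketch -- the default instance host here is my own assumption; pick a live one from the list above:)

```shell
# Sketch: rewrite a twitter.com URL to point at a Nitter instance.
# The default instance is an assumption; substitute any live instance
# from the wiki list. Uses bash's ${var/pattern/replacement} expansion.
nitterize() {
  local url=$1 instance=${2:-nitter.net}
  echo "${url/twitter.com/$instance}"
}
```

Usage: `nitterize https://twitter.com/jack/status/20` emits the same path on the chosen instance.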

There are buttons and other tools which follow users across the Internet and devices.

There are tools that track and correlate multiple signals, including audio tracking (embedded signals in commercial music and advertising broadcasts), MAC address, device ID, Bluetooth ID, SSID locations, etc.

Google have an "opt-out" standard for SSID identifiers, which really ought to be criminal: https://support.google.com/maps/answer/1725632#how_opt_out&z... (HN discussions: https://news.ycombinator.com/item?id=27518415 https://news.ycombinator.com/item?id=7347397 https://news.ycombinator.com/item?id=22466447)

DuckDuckGo provides counterexamples to several of these, handily illustrating several practices I'd like to see far more widespread:

- There are no-JS sites which work fine in console-mode browsers: https://lite.duckduckgo.com/lite and https://duckduckgo.com/html (standard formatting, but HTML-only functionality). I use these for bash-function CLI search-engine tools (several, actually, leveraging DDG's bang searches).

- There's a "no-nag" URL so you can pop up the search page without any prompt to set DDG as your primary browser: https://start.duckduckgo.com/ (This is a particularly elegant solution IMO to the company's need to convert users to defaulting the search engine, and not annoy those who've long since done so.)

- The site Just Works on Tor. There's an Onion address as well: http://3g2upl4pq6kufc4m.onion/ I've certainly never encountered a CAPTCHA.

(DDG isn't perfect, the recent Tank Man image censorship resulting from the site's major reliance on Bing for Web and Image search being a notable recent issue.)
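For illustration, here's a minimal sketch of the kind of bash-function CLI search tool mentioned above. The function name and the decision to emit a DDG Lite URL (via its `q=` query parameter) are my own choices; pipe the result into whatever console browser you prefer. Bang searches like `!w foo` pass through, needing only percent-encoding.

```shell
# Sketch of a bash-function CLI search helper targeting DuckDuckGo Lite.
# ddg_url percent-encodes its arguments per RFC 3986 (unreserved
# characters kept literal, everything else as %XX) and prints the URL.
ddg_url() {
  local q="$*" out="" c i
  for (( i = 0; i < ${#q}; i++ )); do
    c="${q:i:1}"
    case "$c" in
      [a-zA-Z0-9.~_-]) out+="$c" ;;                 # unreserved: keep as-is
      *) printf -v c '%%%02X' "'$c"; out+="$c" ;;   # everything else: %XX
    esac
  done
  echo "https://lite.duckduckgo.com/lite/?q=${out}"
}
```

A wrapper like `ddg() { w3m "$(ddg_url "$@")"; }` (assuming w3m, or any console browser) then gives a one-command search from the shell.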

A major problem with normalising dark patterns is that they become acceptable: people cannot even conceive that they have an alternative, and as technologies evolve from early-adopter toy (or secret advantage) to mass-public necessity for commerce, business, government, culture, and social essentials, there is simply no alternative but to be subjected to these patterns. Criminalising these would help, especially after an initial warning or finding. (For various reasons, regulators often seek relatively modest initial remedies; there are critics of this practice, and many, myself included, would strongly prefer that second and subsequent offences carry much deeper bite.)

The trade and exchange in third-party data and information should be fully outlawed where not specifically and directly authorised and necessary for transactional fulfillment.


> This is a particularly elegant solution IMO to the company's need to convert users to defaulting the search engine, and not annoy those who've long since done so.

There is no need for a prompt; a text box below the search box should be enough. But given the way the web uses dark patterns, it's now considered OK to have such annoyances everywhere.


+1 Upvoted and favorited; thank you for the DDG links and for articulating my precise sentiments so clearly.


I’d say that Apple has done an amazing job combating this and Google has made some strides.

This last WWDC shocked me with how tasteful it was in making devices more personal and respectful.



