> The law itself is perfectly sane. The problem is that everybody tries to apply it in the worst possible way.
You mean trusting that a website behaves by implementing its own popup system, versus enforcing it on the browser side with a single implementation? Doesn't sound sane to me.
Why don't we implement a law where visitors cannot enter your house when you are not at home, unless you consent? That way we can get rid of locks.
Konsoolo is right, this is the most stupid solution they could possibly have come up with. Every time I enter a website, I see the bloody useless cookie banner. Those who designed this law have no idea how people behave on the internet. Nobody is going to read a cookie policy on every single website they enter; people want to get to the content they are looking for as quickly as possible. Privacy controls should be available at the browser level, so that 1) you don't force me to accept/refuse each time, disrupting the user experience, 2) I don't lose all my settings if someone from customer support suggests deleting cookies, and 3) I only set my preferences once, instead of having to decide a million times. The outcome of this stupid regulation is that website owners can still find a million ways to trick users with all sorts of dark patterns and subtle manipulation of language, and users have no way to defend their privacy unless they are willing to spend time understanding the workings of this on each website they visit.
The solution to avoiding tracking can't be on the client side, because it's not the client side doing the tracking. So it should be obvious that the law can't target the browser; it must target the server.
This is not only sane, it is very obviously the only way it could be done. Remember, the law isn't about cookies or headers or anything specific: it is a law about user tracking. You're delivering JS that paints a font in a hidden area of the screen? It's then measuring the results and reporting data back to you to track this particular user? Then you need to ask for consent. The browser can't possibly know the intent of the code it is running, so the browser can't be made responsible for protecting user privacy.
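For the curious, a minimal sketch of what that kind of font probe can look like, assuming a made-up /collect endpoint (real trackers combine many more signals than this):

```typescript
// Sketch of canvas-based font probing: measure how wide a reference string
// renders in a handful of candidate fonts and report the results.
// From the browser's point of view this is ordinary DOM/canvas work;
// nothing here is distinguishable from legitimate layout code.
const PROBE_FONTS = ["Arial", "Calibri", "Ubuntu", "Noto Sans", "Comic Sans MS"];

function measureFonts(): Record<string, number> {
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d")!;
  const widths: Record<string, number> = {};
  for (const font of PROBE_FONTS) {
    // Fall back to monospace: if the probe font isn't installed, the width
    // matches the fallback font, which is itself a signal.
    ctx.font = `16px "${font}", monospace`;
    widths[font] = ctx.measureText("mmmmmmmmmmlli").width;
  }
  return widths;
}

// Report the measurements to the (hypothetical) collection endpoint; combined
// with other signals this narrows down the device without any cookie at all.
fetch("/collect", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(measureFonts()),
});
```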
Ok. So we drop the cookies and invent/use something else that works like the cookies (e.g. an iframe that pings Google's server). What's that good for? Are you considering including CORS, iframes, and whatever other feature may leak information about the visitor in the law as well?
Browser fingerprinting is a thing. In fact I suspect most of the supposedly GDPR-compliant sites (so no cookies or local storage) still use fingerprinting in the background, because you can't prove it's happening from the client (and the law is not being enforced anyway).
The cookie law is the ePrivacy Directive 2002,[1] not GDPR. And as a user, I would much rather control my privacy preferences regarding cookies from my own browser, instead of within hundreds of different implementations across websites.
We already have P3P to allow websites to declare how they want to use your information. European legislation should have focused on leveraging these existing tools and protocols to give control to the user, instead of annoying them with endless pop-ups.
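For context, P3P worked by having the server declare its data practices in a machine-readable header. A rough sketch with a toy Node server; the CP tokens are illustrative, not a real policy for any site:

```typescript
// Minimal Node server sending a P3P compact-policy header.
import { createServer } from "node:http";

createServer((req, res) => {
  // Illustrative compact-policy tokens: "NOI" = no identified data collected,
  // "DSP" = disputes are covered by the full policy, "COR" = errors corrected.
  res.setHeader("P3P", 'CP="NOI DSP COR"');
  res.setHeader("Content-Type", "text/html");
  res.end("<p>Hello</p>");
}).listen(8080);
```

A P3P-aware browser could compare those tokens against the user's stored preferences and accept or reject the site's cookies automatically, which is the kind of one-time, browser-side control being argued for here.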
GDPR is all about user data AFAIK. If I understand it correctly, it avoided the trap of singling out specific implementations.
Also, it seems either I or someone else misread the context: I'm in the broader GDPR context, while someone else seems to be in the older cookie-law context.
Very sane.