growthwtf 10 hours ago [-]
What a weird piece of writing. Is this like just chicken scratch? Or is this seriously some kind of part of the W3C working process?

Section 2: Third party cookies have gotten bad. Ok.

Section 3: There are legitimate use cases that third party cookies currently cover. Also ok. Then they throw in, "Be aware that a set of new technologies which carry minimal risk individually, could be used in combination for tracking or profiling of web users." Yes? That's a huge scope increase for the document, though; all of a sudden we're talking about tons of tracking technologies in aggregate. The authors move on without further comment.

Section 4: I think the first half is essentially saying that new technology coming online in the web platform will make the third party cookie problem worse, so we should fix it soon. OK, I'm back with you. Then the document suddenly pivots to proposing general standards for web privacy again, saying that the burden of proof is on the people originating a proposal, before concluding by saying (apparently without irony?) that justifying the business impact of removing third-party cookies is outside the scope of the document.

I'm missing a ton of cultural context here about how W3C works, so I'm guessing this probably amounts to rough notes that somebody intends to clean up later, that I'm being overly critical of, and that they didn't expect to get any traction on Hacker News.

bilekas 9 hours ago [-]
It's W3C... They've never been the most coherent with standards, ironically.
IshKebab 59 minutes ago [-]
Isn't W3C fairly irrelevant these days?
motorest 8 hours ago [-]
...or it's a design by committee thing, and some people in the room are doing their best to preserve current and future tracking technology.
bilekas 7 hours ago [-]
It's exactly this: there is a group who come together and rarely agree on rules, and when they do, they never enforce them. It's, I believe, the definition of a paper tiger, sadly. A great idea executed horribly.
__alexs 5 hours ago [-]
Standards bodies rarely enforce rules themselves.
squigz 4 hours ago [-]
Is it really on the W3C to enforce standards? How would that even work?
lukan 4 hours ago [-]
By shipping their own reference browser ..
squigz 4 hours ago [-]
In what way would that enforce standards?
lukan 2 hours ago [-]
Well, the same way google can enforce their standards via chrome.

(I did not say it is a realistic goal for a theoretical committee)

echoangle 2 hours ago [-]
So not at all? Shipping something in Chrome isn't enforcing a standard in my opinion. Enforcing a standard would be a regulatory thing, like having to use USB-C in certain situations.
lukan 2 hours ago [-]
Chrome is in a monopoly position. If they decide to ship a new feature, then all the other browsers need to implement it as well, or their users assume their browser is broken.
motorest 6 hours ago [-]
> A great idea executed horribly.

No. It's sabotage.

milesrout 5 hours ago [-]
Never attribute to malice etc.
consp 4 hours ago [-]
Design by committee is more likely malice than accident or stupidity. Some actors work towards goals which are good for them but malicious for the majority.
dbushell 9 hours ago [-]
The "replacement" is already being penned: https://www.w3.org/TR/privacy-preserving-attribution/

Which is just going to be in addition to 3rd-party cookies. Google's own study concluded removing 3rd-party cookies loses revenue and "privacy-preserving" tracking increases revenue: https://support.google.com/admanager/answer/15189422 So they'll just do both: https://privacysandbox.com/news/privacy-sandbox-next-steps/

surajrmal 8 hours ago [-]
There are regulatory agencies which have specifically told Google it is not allowed to remove 3rd-party cookies without a replacement, because while Google would be able to continue to function fine, their competitors would take a major loss.
JoshTriplett 3 hours ago [-]
Sounds like a great argument for running a different browser not developed by an advertising company, and thus not constrained by that.
chrisweekly 16 minutes ago [-]
Agreed. Curious what HNers feel is the most viable replacement. I'm experimenting w Arc this week...
pas 4 hours ago [-]
Do you have links for this? I'm curious about which bodies and what was their argument.
diogocp 4 hours ago [-]
dbushell 4 hours ago [-]
Seems like the CMA is concerned for other advertisers who profit from 3rd-party cookies, with no concern for users' privacy. That poor billion-dollar industry, how will it cope?
josefx 2 hours ago [-]
Another "trusted" third-party-based tracking system. That's all I need to know to avoid it, even when it is printed on toilet paper.
dbushell 42 minutes ago [-]
Yep, definitely "trusted third party". For example:

https://blog.mozilla.org/en/mozilla/mozilla-anonym-raising-t...

Owned by Mozilla, run by ex-Facebook employees. I'm sure it's entirely coincidental that this W3C draft was written by Mozilla and Facebook employees.

red_admiral 5 hours ago [-]
I just want someone to explain how I can edit my own privacy preserving attribution database. Is it a local SQLite database or something?

I feel like storing my "preferences" locally without letting me edit them is a stupid move.

jeroenhd 5 hours ago [-]
Google's design stores the tracking data locally. Chrome already has a UI to manage topics of interest (chrome://settings/adPrivacy).
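For illustration, a rough sketch of reading that locally stored interest data via Chrome's Topics API (Chrome-only, secure contexts only, and not yet in the standard TypeScript DOM typings, hence the cast):

    // Hedged sketch: query Chrome's Topics API, which surfaces the locally
    // computed interest topics a user can manage at chrome://settings/adPrivacy.
    const doc = document as any; // browsingTopics() is not in standard DOM typings yet
    if (typeof doc.browsingTopics === "function") {
      doc.browsingTopics().then((topics: unknown[]) => {
        // Each entry includes fields such as topic (a taxonomy ID) and taxonomyVersion.
        console.log(topics);
      });
    }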
sedatk 9 hours ago [-]
If third-party cookies are removed, the tracking parties will just ask web sites to include the script on their web server, so their cookies become "first party" again. I don't understand how this helps the web unless protections against tracking itself, not the methods used, are established.
Dwedit 9 hours ago [-]
It's about trust, the third-party ad companies don't trust that the first party will be honest with them, not generating fake impressions or clicks.
thayne 7 hours ago [-]
There are also trust issues the other way. I've seen a lot of contention between developers and security teams and marketing about putting third party code or proxying third party domains on the first party site for analytics, tracking, ad attribution, etc.
sedatk 9 hours ago [-]
I doubt that. Their script could just as well be "fetch that script from that URL and run it". They would have fraud detection already in place on their side regardless of which script runs on the client.
chii 6 hours ago [-]
> "fetch that script from that URL and run it"

but if you cannot have a third-party cookie, the tracker's remote server cannot be sure that the script was actually downloaded, let alone executed.

littlecranky67 5 hours ago [-]
Sure you can, if their script makes a 3rd-party XHR request to that tracker.
chii 5 hours ago [-]
But that request could be faked if the first party wanted to fake the traffic (for example, to inflate ad revenue). The third-party cookie is what prevents this faking at the moment.
blacksmith_tb 9 hours ago [-]
That's old hat; the future is server-to-server calls from sites to vendors: profile the client, but don't try to run any tracking JS on it.
kstrauser 8 hours ago [-]
That's vastly more expensive, though. Now you have to run extra servers to make outbound connections to the ad tracker's API server instead of turfing off all the work to visitors. It would be enough to significantly affect the ad market.
Griffinsauce 6 hours ago [-]
Oh no!
ars 7 hours ago [-]
I don't think it's that expensive to do. All it takes is one well-written package that is easy to install and this will become standard.

I could even see a data broker centralizing this and distributing tracking to all of their clients. The client would just need to communicate with the central broker, which is not hard at all.

sedatk 9 hours ago [-]
That's also quite the possibility, and supports my point.
coffeefirst 3 hours ago [-]
This doesn’t actually help. If you consider Prebid, Criteo already has js running on the site serving the ads, but that js has no mechanism to figure out whether the user has something in their cart and is eligible for retargeting.

The workaround is looking more and more like IP, fingerprinting, and AI. I’d argue this is worse than 3p cookies, which were at least dumb and easy to clear.

fiddlerwoaroof 7 hours ago [-]
I think many adtech companies (at least in affiliate marketing) use redirects because third party cookies are unreliable and redirects make all the cookies first party. As mentioned elsewhere, they’ve also been switching to proxies and other such techniques to make it even harder to block their tracking endpoints.
parrit 8 hours ago [-]
Proxies for analytics are already a thing, e.g. Plausible shows you how to set one up. A 3rd-party cookie, however, can be the same value sent again and again from the same browser, from different sites, to the central server tracking you across the web. The global "who you are" is in the cookie.
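For example, a rough nginx sketch of that first-party proxy pattern; the paths and upstream host mirror what Plausible documents, but treat them as illustrative rather than exact:

    # Hedged sketch: serve the analytics script and event endpoint from your own
    # domain and reverse-proxy them upstream, so the browser never talks to a third party.
    location = /js/script.js {
        proxy_pass https://plausible.io/js/script.js;
        proxy_ssl_server_name on;
        proxy_set_header Host plausible.io;
    }

    location = /api/event {
        proxy_pass https://plausible.io/api/event;
        proxy_ssl_server_name on;
        proxy_set_header Host plausible.io;
    }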
timewizard 8 hours ago [-]
> include the script on their web server, so their cookies become "first party" again.

That script would execute with the origin of the server. Its access to resources and /shared state/ would be hampered by this. So as a cross-site tracking strategy I don't think this works.

> I don't understand how this helps the web unless protections against tracking itself, not the methods used, are established.

Which is why I think state partitioning[0] and CHIPS[1] are good technologies. They allow previously existing standards, like cookies, to continue to exist and function mostly as expected, but provide the user a good amount of default security against cross-site trackers and other malware (a CHIPS example follows the links below).

[0]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...

[1]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...
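As a concrete example of CHIPS, an embedded third party can opt into a cookie that is partitioned by the top-level site it was set under; a sketch of the Set-Cookie header (attribute names as in the CHIPS proposal):

    Set-Cookie: __Host-session=abc123; Path=/; Secure; SameSite=None; Partitioned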

littlecranky67 5 hours ago [-]
Your point is pretty useless, as you assume the web server admins want to be more secure. The opposite is the case: usually they deliberately open up their security model to accommodate 3rd-party tracking scripts. For example, Content-Security-Policy headers can effectively prevent all sorts of XSS attacks, but they will also prevent 3rd-party tracking scripts etc.
timewizard 4 hours ago [-]
You've misunderstood my point. It's not what the server admins want it's what the security policy will allow. If two sites, on two different domains, both use the same script, served directly from their domains, it creates absolutely no workaround for third party cookies. This is because the two sites have different origins. CSP does not create a bypass in this case.
freeamz 10 hours ago [-]
Feels like this whole cookies thing is just whitewash, when if you enable JS they can track you whether you have cookies or not!

Nothing is private: https://nothingprivate.gkr.pw

More effort ought to be put into making the web spec unable to track users even if JS is turned on.

Browser vendors Brave and Firefox, supposedly privacy browsers, are NOT doing anything about it.

At this point, do we need to use a JS-disabled browser to really get privacy on the web?

littlecranky67 6 hours ago [-]
Any other tracking methods are way more obvious and way harder for the advertising industry to implement. We shouldn't think in black/white here - the more difficult it is to track a user, the less likely it is to be implemented. It is okay if 30% of tracking sites disappear because the cost/value ratio doesn't work for them. We don't have to sit in silence and do nothing just because we can't have 100% privacy.
matthewdgreen 1 hours ago [-]
I do think there is a point here: any technical means to block tracking is going to be overrun by technical means to overcome the anti-tracking tech. There are simply too many dollars at stake for anything else to happen. If anti-tracking stops some players, that just means the industry will consolidate into a few large and well-resourced players.

While I am all in favor of continuing the technical battle against tracking, it’s time to recognize that the war will only be won with legislation.

idle_zealot 10 hours ago [-]
> At this point, do we need to use a JS-disabled browser to really get privacy on the web?

My thoughts are that we need a distinction between web pages (no JS), which are minimally interactive documents that are safe to view, and web apps (sites as they exist now), which require considerable trust to allow on your device. Of course, looking at the average person's installed app list indicates that we have a long way to go culturally with regards to establishing a good sense of digital hygiene, even for native software.

wtallis 7 hours ago [-]
It doesn't help that web browsers aren't even trying to help users make the distinction. They have an ever-growing list of features and permissions that sites can take advantage of, with no attempt to coalesce anything into a manageable user interface. Instead, it takes a hundred clicks to fully trust or distrust a site/app.
freeamz 3 hours ago [-]
More UI/UX distinction is needed, like the green lock for security! The browser should indicate the level of privacy of the page. If the page uses no JS or anything GPU-compromising (CSS, I'm looking at you), then it gets a green icon. For every privacy/security-compromising feature you add, it turns yellow. Once it starts to ask for WebUSB or MIDI, then it should be in some kind of Native Mode. It's really a UI/UX issue for the major browser makers!
GCUMstlyHarmls 9 hours ago [-]
https://nothingprivate.gkr.pw seems to (not) work fine in Firefox... I am running ublock-origin though, no other special things.
Diti 5 hours ago [-]
Same here, it’s not just you. Judging by the other comments, it only seems to “work” on Blink-based browsers.
Kovah 5 hours ago [-]
Also not working on Brave, without uBlock or similar extensions. Brave says it blocked one request, probably the one for fingerprinting.
karl-j 5 hours ago [-]
The site also fails to track on mobile Safari with ”Prevent Cross-Site Tracking” turned on.
gkbrk 4 hours ago [-]
Doesn't work on Brave. It says to check it on private mode, but when I switch to private mode it just asks for my name again.
FridgeSeal 4 hours ago [-]
Also doesn’t work on iOS (for me).
brookst 10 hours ago [-]
It’s an interesting question: is it possible for JavaScript to be turing complete, able to read/write the DOM, and somehow prevent fingerprinting / tracking?

My gut says no, not possible.

Maybe we need a much lighter way to express logic for UI interactions. Declarative is nice, so maybe CSS grows?

But I don’t see how executing server-controlled JS could ever protect privacy.

Enginerrrd 10 hours ago [-]
I've always thought there should be a way to use the browser like a condom. It should obfuscate all the things that make a user uniquely identifiable. Mouse movement/clicks/typing cadence should be randomized and sanitized a bit. And no website should have any authority whatsoever to identify your extensions or other tabs, or even whether or not your tab is open. And it certainly shouldn't allow a website to overrule your right click functionality, or zoom, or other accessibility features.
JSteph22 9 hours ago [-]
The obfuscation makes you more easily identifiable.
teo_zero 7 hours ago [-]
How so?
codyvoda 7 hours ago [-]
Eldo Kim

you stand out when you obviously hide

chii 6 hours ago [-]
only if you are the only one doing the obfuscation.

It's why the Tor Browser is set to a specific window dimension (in terms of pixel size), has the same set of available fonts, etc.

klabb3 3 hours ago [-]
And yet you still stand out if you use tor.
chii 3 hours ago [-]
Yes, and it's because not enough people use Tor Browser (I mean the browser, not the network).

But if privacy is truly the desired goal, the regular browser ought to behave just like Tor Browser.

febusravenga 4 hours ago [-]
Yes, it is.

Just create a _strict_ content security profile, which doesn't allow any external requests (fetch) and only allows loading resources (CSS, images, whatever) from a predefined manifest.

App cannot exfiltrate any data in that case.

You may add permissions mechanisms of course (local disk, some cloud user controls, etc).

That's a big challenge in standards, and I'm not sure if anyone is working on such a strongly restricted profile for web/JS.
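A rough sketch of what such a profile could look like with today's Content-Security-Policy, locking a page down to same-origin resources and no outbound connections (only an approximation; it doesn't close every exfiltration channel, e.g. top-level navigations):

    Content-Security-Policy: default-src 'none'; script-src 'self'; style-src 'self';
      img-src 'self'; font-src 'self'; connect-src 'none'; form-action 'none'; base-uri 'none'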

chongli 9 hours ago [-]
It’s an interesting question: is it possible for JavaScript to be turing complete, able to read/write the DOM, and somehow prevent fingerprinting / tracking?

Yes, of course: restrict its network access. If JS can't phone home, it can't track you. This obviously lets you continue to write apps that play in a DOM sandbox (such as games) without network access.

You could also have an API whereby users can allow the JS application to connect to a server of the user's choosing. If that API works similarly to an open/save dialog (controlled entirely by the browser) then the app developer has no control over which servers the user connects to, thus cannot track the user unless they deliberately choose to connect to the developer's server.

This is of course how desktop apps worked back in the day. An FTP client couldn't track you. You could connect to whatever FTP server you wanted to. Only the server you chose to connect to has any ability to log your activity.

adrr 6 hours ago [-]
There's no point. If you disable JS, they can track you in other ways: fingerprint your DNS packets via timestamp clock skew and other signals. With IPv6, a DNS lookup can hand you a unique IP address that then functions like a cookie.

Don't want to be tracked? Don't go on the internet.

HumanOstrich 5 hours ago [-]
Websites can't fingerprint my dns packets by their clock skew, nor can they assign me a unique IP address for a dns lookup (what?). "Don't go on the internet" isn't a great starting point to improve things.
waynesonfire 8 hours ago [-]
Why does it have to be a technological solution? That's what the media industry tried to do with DRM and it failed. The solution is legislation. We need the equivalent of DMCA for our privacy. Make it illegal to fingerprint.
chongli 3 hours ago [-]
I’m completely unsold on legislation. Another headline that recently hit the top of HN is about how Apple flagrantly ignored a court order. The judge has recommended the case for criminal contempt prosecution [1].

The comments on the story are completely unconvinced that anyone at Apple will ever be convicted. Any fines for the company are almost guaranteed to be a slap on the wrist since they stand to lose more money by complying with the law.

I think the same could be said about anti-cookie/anti-tracking legislation. This is an industry with trillions of dollars at stake. Who is going to levy the trillions of dollars in fines to rein it in? No one.

With a technological solution at least users stand a chance. A 3rd party browser like Ladybird could implement it. Or even a browser extension with the right APIs. Technology empowers users. Legislation is the tool of those already in power.

[1] https://news.ycombinator.com/item?id=43856795

chii 6 hours ago [-]
> The solution is legislation. We need the equivalent of DMCA for our privacy

and how does one know their privacy has been invaded? How does the user know to enforce the DMCA law for privacy?

I think the solution has to be technological. Just like encryption, we need some sort of standard to ensure all browsers are identical and unidentifiable (unless the user _chooses_ to be identified - like logging in). Tor-browser is on the right track.

jenadine 8 hours ago [-]
That'd be the GDPR
cluckindan 7 hours ago [-]
Which is only applicable in the EU
6510 9 hours ago [-]
I don't know what it is called, but if you try to open a window from a timeout it won't work. The user has to click on something, and then the click event grants the permission.

You could make something similar where fingerprint-worthy information can't be posted or used to build a URL. For example, you read the screen size and then add it to an array. The array is "poisoned" and can't be posted anymore. If you use the screen size for anything, those things and everything affected may stay readable but are poisoned too. New fingerprinting methods can be added as they are found. Complex calculations and downloads might temporarily make time into a sensitive value too.

degamad 7 hours ago [-]
In the old days, something similar to what you're calling "poisoned" was called "tainted" [0].

In those scenarios, tainted variables were ones which were read from untrusted sources, so could cause unexpected behaviour if made part of SQL strings, shell commands, or used to assemble html pages for users. Taint checking was a way of preventing potentially dangerous variables being sent to vulnerable places.

In your scenario, poisoned variables function similarly, but with "untrusted" and "vulnerable" being replaced with "secret" and "public" respectively. Variables read from privacy-compromising sources (e.g. screen size) become poisoned, and poisoned values can't be written to public locations like urls.

There's still some potential to leak information without using the poisoned variables directly, based on conditional behaviour - some variation on

    if poisoned_screenwidth < poisoned_screenheight then load(mobile_css) else load(desktop_css)
is sufficient to leak some info about poisoned variables, without specifically building URLs with the information included.

[0] https://en.wikipedia.org/wiki/Taint_checking
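A toy sketch of the poisoning/tainting idea in TypeScript, purely illustrative (no browser implements anything like this):

    // Toy taint tracking: values read from privacy-sensitive sources are wrapped,
    // and any attempt to put a wrapped value into a URL is refused.
    class Tainted<T> {
      constructor(readonly value: T) {}
    }

    function readScreenWidth(): Tainted<number> {
      return new Tainted(window.screen.width); // sensitive source => tainted
    }

    function buildUrl(base: string, params: Record<string, unknown>): string {
      const url = new URL(base);
      for (const [key, val] of Object.entries(params)) {
        if (val instanceof Tainted) {
          throw new Error(`refusing to put tainted value "${key}" into a URL`);
        }
        url.searchParams.set(key, String(val));
      }
      return url.toString();
    }

    // As noted above, conditionals can still leak information, e.g. choosing
    // which CSS file to request based on a comparison of two tainted values.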

deadbolt 10 hours ago [-]
Just tried this with Brave and it didn't seem to work, assuming the site working means that it can remember me in an incognito browser. I gave the site a name, and then opened it in incognito (still using brave), and it acts as if I visited the site for the first time.

What am I supposed to witness?

cptskippy 9 hours ago [-]
It didn't work on Firefox mobile either... Why are all these browser companies breaking the web!
hi_hi 6 hours ago [-]
I think this is a bit overblown. Brave and Safari were both private when I just tested. Chrome not so much, but that's expected.
antihipocrat 8 hours ago [-]
Unmodified server request headers contain enough information for tracking even if JS is disabled. If you're keen to modify http headers while browsing, then you could also modify any JS run on your system that snoops system information (or strip the info from any request sent to the server) and continue with JS enabled.
emsign 9 hours ago [-]
Web Browsers Must Be Removed

They run arbitrary code from sketchy servers called "websites" on people's hardware with way too many privileges. Meanwhile, free and open source standalone web applications exist that only use minimal JS code to access the same web resources with a much better user experience. Without trackers, without ads and third parties.

Kiro 8 hours ago [-]
I want a browser to be able to run arbitrary code. That's the whole point. I want to play a game or use a complex application in the browser without having to install anything.
afavour 9 hours ago [-]
It won’t happen because people don’t care enough.

I don’t mean to sound glib. But people derive a ton of utility from the web as it stands today. If they were asked if they supported the removal of web browsers they would absolutely say no. The privacy costs are worth the gains. If you want change you have to tackle that perception.

myHNAccount123 10 hours ago [-]
Works as advertised on Edge but not on Safari
kstrauser 8 hours ago [-]
I can't get that site to work on Safari on my Mac, with JS enabled.
sensanaty 3 hours ago [-]
The more egregious and frankly disgusting one is https://fingerprint.com

IMO this service should straight up be made illegal. I love the tagline they have of supposedly "stopping fraud" or "bots", when it's obvious it's just privacy invasive BS that straight up shouldn't exist, least of all as an actual company with customers.

alkonaut 6 hours ago [-]
I have almost no hope that this is a matter that has a technical solution. The GDPR shows that law - even if not global, and even if not widely enforced - is pretty good at getting people to act. And most importantly, it will make the largest players the most afraid, as they have the most to lose. And if just a handful of the largest players online are looking after people's privacy, then that is a huge win for privacy.

Doing what this demo shows, is clearly a violation of the GDPR if it works the way I assume it does (via fingerprints stored server side).

matheusmoreira 9 hours ago [-]
They can track you just fine via CSS and countless other ways. They'll even fingerprint the subtle intricacies of your network stack.

What we need to do is turn the hoarding of personal information into a literal crime. They should be scrambling to forget all about us the second our business with them is concluded, not compiling dossiers on us as though they were clandestine intelligence agencies.

hobs 10 hours ago [-]
I by default block JS on the web and only allow it for domains I accept. It's a tiny bit of work for a whole lot of safety.
jeroenhd 5 hours ago [-]
Google won't implement this spec. Currently, they're legally not allowed to, because advertisers called in the industry watchdog, asserting that without third party cookies to stalk users, they could not compete. Google extended their privacy sandbox, opened and closed it, talked about it, and eventually backed down from their plan to block third party cookies ASAP.

Maybe Chrome can get away with "the spec says it, sorry advertisers" but I doubt the courts will accept that.

nine_k 4 hours ago [-]
That is, Firefox can reject third-party cookies because it's not made by a company that deals in online advertising, but Chrome cannot, because Google is the biggest online ads dealer and thus would have an unfair advantage over other ads dealers, correct?
RainyDayTmrw 8 hours ago [-]
This is kinda hollow while Google controls Chrome, and Chrome has majority market share[1]. And, if regulators get their way, and Google divests Chrome[2], I'm not expecting that the new highest bidder would do any better with it.

[1] The exact figure may depend on which source you use, and there is some indication that ad and tracker blocking may artificially deflate Firefox and friends. https://gs.statcounter.com/browser-market-share

[2] https://www.wired.com/story/the-doj-still-wants-google-to-di...

JoshTriplett 3 hours ago [-]
As long as the new steward of Chrome is not an advertising company, they will no longer be restricted from removing third-party cookies.
j16sdiz 9 hours ago [-]
> Some of the use cases that are important enough to justify the creation of purpose-specific solutions include federated identity, authorizing access to cross-site resources, and fraud mitigation.

Unpopular opinion: There is no privacy-preserving way to do "fraud mitigation".

Either you accept fraud as a cost of doing business, or you do away with privacy. Most business owners don't want the fraudulent user to come back, ever. If we value the privacy of users, we need to harm some businesses.

omeid2 8 hours ago [-]
In theory it is possible via "blind attestations" by a 3rd party. In an indirect way, that is what you get with Cloudflare, where they monitor traffic from an "agent" using their own heuristics for identity, without sharing that identity with you.
oliwarner 1 hours ago [-]
Sure, but this neither attempts to list the valid uses of third-party cookies, nor suggests what magic definitely-not-a-third-party-cookie unicorn is going to ride in and offer us the safety we need. Pretty fluffy through and through.

I suggest that we just need to keep third-party cookies but make them explicitly opt-in. That could be allowing (once) a third party to be present everywhere (like an SSO) and browsers making it known when a third party is accessing data.

xnx 10 hours ago [-]
Careful what you wish for. Removing third party cookies without a replacement will make aggressive fingerprinting ubiquitous.
Springtime 10 hours ago [-]
I've always assumed fingerprinting was already ubiquitous. I look at the absolute absurdity of tracking/fingerprinting permission dialogs on sites, stating up-front their data sharing with 'trusted partners' in the hundreds (thingiverse.com with over 900, theverge.com on mobile with over 800), and find it more surprising that the default state of all clients isn't to block everything.

Edit: for clarity, I believe anything with the ability to analyze the user environment via Javascript/etc on major sites is likely fingerprinting regardless. Blocking, environment isolation and spoofing is already necessary to mitigate this.

deadbolt 9 hours ago [-]
Do you believe that while third party cookies exist, tracking companies aren't using other fingerprinting methods?
xenator 10 hours ago [-]
I have a feeling that it is all related. When you see a request to accept cookies with a list of over 9000 trackers, it doesn't mean that this page will have zillions of JavaScripts included on the page. It just means that the site owners fingerprint the user and pass user interactions to third parties server-side.

The only reason we see this movement is because advertisers feel confident about removing third-party cookies.

bennettnate5 10 hours ago [-]
...thus raising the bar for privacy-preserving techniques in client-side browsing. Aggressive fingerprinting arrived years ago; if we can move beyond cookies altogether and focus on it as the next issue to tackle, I would think that's a net win. Saying that we should keep 3rd-party cookies alive and healthy because it will keep websites using them against users rather than fingerprinting is just throwing the majority of users, who don't know to block them, under the bus. Plus it still leaves the door open for even privacy-conscious users to be defeated by fingerprinting anyway if a server is keen on tracking particular individuals.
xnx 32 minutes ago [-]
Fingerprinting-defeating technology is just the kind of thing I wish Firefox spent its effort developing, instead of reimplementing features from Chrome like tab groups.
Terr_ 9 hours ago [-]
Yeah, the only way third-party cookies will block creepier fingerprinting crap is if the creepy stuff is prohibitively more expensive.

But once anyone gets a creepy fingerprinting system working, the barriers drop, and it becomes cheaper to resell the capability as a library or service.

It may offer some minor benefits in terms of enabling companies that "want to be more ethical than the competition", but that too seems like a long-shot. :p

johnmiroki 11 hours ago [-]
Replacement solutions must be provided before it's mandatory to remove third party cookies. Otherwise, it's doomed to fail.
recursive 10 hours ago [-]
Replacement for what use case? The whole point is to eliminate the behavior, not provide another feature that has the same problems. What does failure mean? It's a problem for ad networks, not for regular humans.
svieira 10 hours ago [-]
The use case of not having to log in to system A which is being embedded within system B because you already logged in to system A? Without needing to introduce a third party SSO C? That's pretty "regular human", even if it's "medium sized corporation" instead of "Joe Regular" (but even Joe likes it if he doesn't have to log into the comment box on every site that uses THE_COMMENT_SYSTEM_HE_LIKES.)
koolba 10 hours ago [-]
This exists already. You can have cookies at a higher level of the same domain. So foo.example.com and bar.example.com can share cookies at example.com. You can also use CORS to interact with a truly third-party site. None of these require third-party cookies.
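For example, a response from foo.example.com can set a cookie that bar.example.com also sees as first-party (standard Set-Cookie, nothing exotic):

    Set-Cookie: session=abc123; Domain=example.com; Path=/; Secure; SameSite=Lax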
nwalters512 10 hours ago [-]
A use case this doesn't address is embedding across two completely different domains, which is pretty common in the education space with LMS platforms like Canvas (https://www.instructure.com/canvas) embedding other tools for things like quizzes, textbooks, or grading. I ended up in a Chrome trial that disabled third-party cookies which broke a lot of these embeds because they can no longer set identity cookies that they rely on from within their iframe.
svieira 10 hours ago [-]
As nwalters also points out, this isn't the same at all. System A and System A' both from Source Α are not the same as System A (Source Α) and System B (Source Β).

Which you know, because you say "you can also use CORS to interact with a truly third party site". But now, I invite you to go the rest of the way - what if the third party site isn't Project Gutenberg but `goodreads.com/my-reading-lists`? That is, what if the information that you want to pull into System A from System B should only be available to you and not to anyone on the net?

cuu508 8 hours ago [-]
Use OAuth2 to get system B's access token, then use authenticated server-to-server API requests to pull needed information from system B.
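Roughly like this; a hedged sketch where the endpoint URLs, field names, and environment variables are placeholders, since the exact flow depends on what system B exposes:

    // Sketch: exchange an authorization code for system B's access token, then
    // pull the user's data with a server-to-server call. No third-party cookies involved.
    async function fetchReadingLists(authCode: string): Promise<unknown> {
      const tokenRes = await fetch("https://systemb.example/oauth/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({
          grant_type: "authorization_code",
          code: authCode,
          client_id: process.env.B_CLIENT_ID ?? "",
          client_secret: process.env.B_CLIENT_SECRET ?? "",
          redirect_uri: "https://systema.example/oauth/callback",
        }),
      });
      const { access_token } = await tokenRes.json();

      const apiRes = await fetch("https://systemb.example/api/my-reading-lists", {
        headers: { Authorization: `Bearer ${access_token}` },
      });
      return apiRes.json();
    }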
jfengel 10 hours ago [-]
The use case is web sites that want to earn income with as little user overhead as possible. Targeted ads have many downsides but they do pay websites without any money at all from the user, or even having to create an account.

So the problem for regular humans is the disappearance of features that they've grown used to having without paying any money. Finding a better way to support themselves has proven remarkably difficult.

deadbolt 9 hours ago [-]
I feel like many people here wouldn't care if those websites simply stopped existing.
bittercynic 8 hours ago [-]
Many people would, though.

For a long time I thought pinterest was search spam that no human could possibly want to see, but then I met real people in the world who like it and intentionally visit the site. I bet there are people who like ehow and the rest, too.

int_19h 6 hours ago [-]
The viability of their business model shouldn't be everyone's problem.
etchalon 7 hours ago [-]
People made money on advertising before the existence of cookies and ubiquitous tracking. Nature will heal.
JoshTriplett 3 hours ago [-]
And people had websites before the existence of Internet advertising. Let's set our expectations higher for how much healing is needed.
petesergeant 10 hours ago [-]
The article explicitly calls out that there are valid use cases (although doesn’t enumerate them). Federated sign-on and embedded videos seem like obvious examples
p_ing 10 hours ago [-]
Google/Chrome just declared that they won't be moving forward with removing 3rd party cookie support.

https://privacysandbox.com/news/privacy-sandbox-next-steps/

> Taking all of these factors into consideration, we’ve made the decision to maintain our current approach to offering users third-party cookie choice in Chrome, and will not be rolling out a new standalone prompt for third-party cookies.

svieira 10 hours ago [-]
Ah, now _that_ explains why this got published. Glad to see that common sense prevailed. The day may come when all the use cases for third-party cookies that aren't "track Joe Regular all around the web" can be satisfied with other widely available web features, but until we have all those features I think taking a page from Linus' book and ensuring "we don't break userland" is important (and something I've always loved about the web and am glad to see continuing).
somenameforme 9 hours ago [-]
Which use cases? I use Brave, which has a built-in toggle to disable 3rd-party cookies, which I have set as the default, and at least in my experience 'the entire internet' works fine.
asddubs 5 hours ago [-]
Embedded iframes that need to authenticate logins but don't trust the parent domain to store the login data are a problem. You can somewhat work around it with the Storage Access API if the browser supports it (Brave doesn't), but it does mean every embed requires a click by the user before it works properly.
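A sketch of that Storage Access API dance from inside the embedded iframe; the API is real, but support and prompting behavior vary by browser, and it must be triggered from a user gesture:

    // Inside the embedded iframe, e.g. in a click handler on a "Sign in" button.
    async function enableEmbedLogin(): Promise<void> {
      if (await document.hasStorageAccess()) {
        return; // already allowed to use our own (unpartitioned) cookies
      }
      try {
        await document.requestStorageAccess(); // may show a browser prompt
        // the embed's own cookies are now available to it
      } catch {
        // the user or browser declined; fall back to e.g. a login-in-popup flow
      }
    }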
hedora 9 hours ago [-]
Same here, but other browsers. I’ve had zero issues since well before the dot com crash.
Nevermark 10 hours ago [-]
"Company whose market cap reflects pervasive non-requested surveillance announces that, after serious consideration, they won't be removing technologies that enable pervasive non-requested surreptitious surveillance."

It is going to be interesting to see if antitrust enforcement manages to separate Google from its financial and practical hold on web standards/browsers.

The opportunity to increase ethical norms of web browsing would be welcome to me.

pests 10 hours ago [-]
Google wants to remove third-party cookies but they can't, as the government sees it as anticompetitive toward their competition. They don't need third-party cookies; everyone else does.
svieira 9 hours ago [-]
Precisely - removing third-party cookies doesn't stop Google from tracking anyone. It just prevents anyone who doesn't own a browser and have one of the three major email providers from tracking everyone.

Well, it doesn't prevent them, but it does make it a little bit harder ...

pests 9 hours ago [-]
I personally think this decision hurts users more than anything else. We must let Google's competitors continue tracking us or else it won't be fair to them?

I don't even understand how being forced to divest Chrome will even help. Once another company owns Chrome and can remove third party cookies, Google gets the same benefit.

Nevermark 9 hours ago [-]
Google has remarkable financial influence across the four major commercially backed browsers.

So limiting Google's control over browsers will create more competition. More competition on implementations. And also more competition in terms of features and user centric service.

--

Question: Does Google really not gather information from anything but its search engine and first party apps? That would seem financially non-optimal for any advertising funded business.

I would think that sure, they log everything people use their search for.

But that they would also find a way to track post-search behavior as well. Google leaving money on the table seems ... unusual if there isn't some self-serving reason they would forgo that.

I am happy to become better informed.

nemothekid 7 hours ago [-]
There are only 3 effective browsers - Chrome, Safari and Firefox. I don't see how limiting Google's control will create competition. The barrier to more browsers is the massive investment needed to create one, not any action that Google is doing.
VladStanimir 18 minutes ago [-]
You are correct, although it's more accurate to say there are only 3 major browser engines: Blink (used by all Chromium derivatives), WebKit (used by Safari and some minor browsers), and Gecko (used by Firefox and its derivatives). Creating a browser engine is hard, so hard that even a multi-billion-dollar company like Microsoft gave up on doing it. And we may soon witness Gecko going away as a side effect of the Google antitrust lawsuit.
youngtaff 4 hours ago [-]
Google could have removed third-party cookies ten years ago as Safari did…

Their long wait to do it is part of why we ended up in a regulatory mess

driverdan 10 hours ago [-]
We don't need a replacement, they're not needed today. I've been blocking them for years and I can't remember the last time it caused a problem.
jeroenhd 5 hours ago [-]
Google has set up a replacement that puts the user in control of their ad interest tracking. It has its upsides and downsides, but I think it's pretty balanced. Anti-tracking features are embedded into the API so the API can't be abused by advertisers.

Of course, ad companies scream bloody murder, and the UK market watchdog had to step in so Google wouldn't turn off third party cookies by default.

hiccuphippo 10 hours ago [-]
Do not worry, the ad networks will come up with ways to circumvent it as soon as it becomes mandatory.
tejtm 10 hours ago [-]
done. third parties can be replaced with legally culpable first parties.
kgwxd 10 hours ago [-]
I've had them turned off since Firefox added the feature. Looks like that was around 2018, though I could have sworn it was much earlier than that. I've never had an issue where I had to make an exception for a site. Is there still some environment where it's common for them to be needed?
g-b-r 10 hours ago [-]
I don't recall a browser that didn't let you disable third-party cookies; given how long ago cookies were introduced, I could have forgotten about it, but I'm at least sure that Mozilla always supported it.

Firefox, especially in the first versions, permitted much less control on cookies than Mozilla did, but I think it still always allowed disabling third party cookies.

codeqihan 10 hours ago [-]
I have always blocked third-party cookies. The only problem I've encountered (there may be others, but I haven't come across them) is that some embedded videos on certain web pages won't play and prompt me to enable cookies.
badmonster 9 hours ago [-]
Third-party cookies have done more harm than good, and it's time to fully remove them from the web platform. It is refreshing to see the acknowledgment that replacements must not just be privacy-washed clones of the old model: purpose-built alternatives need to prove they don't recreate the same surveillance infrastructure.
kazinator 9 hours ago [-]
> Some features of the web that people have come to expect, and which greatly improve user experience, currently depend on third-party cookies.

Idea: domains should be able to publish a text record in their DNS (similar to an SPF record for mail domains) designating other domains which are allowed to peek at their cookies.

Suppose I operate www.example.com. My cookie record could say that foo.com and bar.com may ask for example.com cookies (in addition to example.com, of course). A website from any other domain may not. As the operator of example.com, I can revoke that at any time.

Whenever a page asks for a cookie outside of its domain, the browser will perform a special DNS query for that cookie's domain. If that query fails, or returns data indicating that the page does not have access, then it is denied.
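A hypothetical record for example.com might look like this; the syntax is invented for illustration, loosely following SPF's style:

    example.com.   IN TXT   "v=cookieallow1 allow=foo.com allow=bar.com"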

int_19h 6 hours ago [-]
But then all the ad-supported websites will whitelist the ad tracking cookies, which is precisely what they are trying to avoid here.
kazinator 4 hours ago [-]
Ah, but in so doing they will have to publish their whitelist, which will exhaustively have to list every single affiliated domain.

Browsers and browser extensions will be able to use that info to identify shit sites, turning the whitelist around into blacklisting uses, like ad blocking and whatnot.

One simple mechanism would be for the browser to deny the cookie request if the requested domain's cookie DNS record contains more than, say, three affiliated domains. (At the discretion of the browser developer, and user settings.) The proliferation of that sort of config would discourage domains from being overly promiscuous with their tracking cookie access.

Plus, existing cookie control mechanisms don't go away.

j16sdiz 8 hours ago [-]
Not a bad idea, TBH.

Just feeling uncomfortable putting more data into DNS. DNS is not encrypted. DNSSEC is easy to bypass (or breaks so often that nobody wants to enforce it).

-- but these are not W3C's problem.

kazinator 8 hours ago [-]
Yes; if someone hijacks example.com's main A record, that gets caught at the SSL level.

If someone hijacks example.com's cookie record, that won't be caught; they just write themselves permission to have their page access example.com's cookies.

The same info could just be hosted by example.com (at some /.well-known path or whatever). The web could generate a lot of hits against that.

The DNS records could be (optionally?) signed. You'd need the SSL key of the domain to check the signature.

pabs3 7 hours ago [-]
When you say bypass, do you mean disable DNSSEC on your own computer? Or are there known vulnerabilities in DNSSEC cryptography or software?
tptacek 8 hours ago [-]
DNSSEC isn't encrypted either.
chii 6 hours ago [-]
I don't think DNS should be overloaded with a security measure.
kazinator 4 hours ago [-]
It's already used in a similar way for SPF records, in the context of e-mail.

Using an SPF record, a domain indicates which hosts are allowed to deliver mail on its behalf (meaning using an envelope sender address from that domain).
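For example, a typical SPF record is just a TXT record like:

    example.com.   IN TXT   "v=spf1 mx a ip4:192.0.2.0/24 -all"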

ordu 3 hours ago [-]
How about third-party JS? The site doesn't render properly without third-party JS from www.w3.org.
Animats 9 hours ago [-]
I haven't allowed third party cookies in a decade. No problem.
kstrauser 8 hours ago [-]
I had a little trouble when Safari rolled out ITP a while back. SSO providers scrambled to figure out how to fix federated logins, and because it affected every iPhone, they managed to do it with a quickness. I haven't had a single problem since.
dankwizard 9 hours ago [-]
Using a custom-built interception layer, I decouple session tokens from identifiable browser states, rotating my signature footprint every few requests via controlled entropy injection. “No more third-party cookies” sounds like a big shift, but it’s functionally irrelevant if your presence is already undetectable.
aligundogdu 6 hours ago [-]
This is actually a somewhat inconvenient wish, because the alternative would be increased investment in fingerprinting so that we can still be recognised in every browser.
lofaszvanitt 29 minutes ago [-]
Has anyone noticed the pattern where, for some pulled-out-of-their-arse explanation, these standards groups and Google suddenly remove features that would be useful to people, having decided they're not OK anymore? Like HTTP referers now only showing the domain, not the full URL, because of some complete BS explanation. And now 3rd-party cookies too...
noduerme 5 hours ago [-]
I block almost all 3rd party cookies, but at this point isn't it kind of nice to just have your google login follow you around, so you don't constantly have to login on other sites? Sure, it sucks for privacy, which is why your google account should never be tied to your phone number or your actual identity, but it's super convenient. Oh wait. It's tied to your real identity? Go back to square one and start a fake identity with all the root info. Buy a burner with a prepaid card, use it to set up a yahoo mail account, use that to set up a mail server you pay for in bitcoin, use that to verify a gmail account, and never let down your VPN. You're going to be tracked; the right move isn't to waste time worrying about that, it's to be someone invisible and untethered in the real world.
AdmiralAsshat 11 hours ago [-]
Fine. All that will happen is we'll see more sites switching to requiring a login to do anything on their website, so that they can track you with first-party cookies, and sell your information that way. Nothing will meaningfully change.

The only distinction is that I can do a decent job of blocking third-party cookies today with my existing solutions like uBlock Origin, but I will probably have a much more difficult time getting around login/paywalls.

recursive 10 hours ago [-]
First party cookies can't build a profile on you across multiple origins.
jmb99 10 hours ago [-]
They absolutely can. They have, at minimum, your account information and your IP address. Maybe you use a burner email address and/or phone number, and maybe a VPN, but chances are you’re not cycling your VPN IP constantly so there’s going to be some overlap there. And if you do cycle your IP, 99%+ of users probably aren’t clearing session cookies when doing so, which means you’re now tracked across IP/VPN sessions. Same deal if you ever connect without a VPN - that IP is tracked too. There’s tons of ways to fingerprint without third party cookies, they just make it easier (and also easier to opt out of if they exist, just disable third party cookies; if no one has third party cookies, sites are going to start relying on more intrusive tracking methods).

You can also easily redirect from your site to some third party tracking site that returns back to your successful login page - and fail the login if the user is blocking the tracking domain. The user then has to choose whether to enable tracking (by not blocking the tracking domain) or not seeing your website at all. Yes the site might lose viewers, but if they weren’t making the site any money, that might be a valid trade off if there’s no alternative.

Not saying I agree with any of this, btw, I hate ads and tracking with a passion - I run various DNS blocking solutions, have ad blockers everywhere possible, etc. Just stating what I believe these sort of sites would and can do.

hiccuphippo 10 hours ago [-]
All they need to do is redirect you through a central hub after login.
ear7h 10 hours ago [-]
Can't you just work around all of this by proxying to the third party site(s) with a subdomain?
crummy 10 hours ago [-]
I think you're right. I imagine if third party cookies were ever banned, we'd quickly see googleads.whatever.com become a common sight.
g-b-r 9 hours ago [-]
There's no need for a login to track you with "first-party cookies"; looking at the IP is perfectly adequate, at most adding some fingerprinting if you really want.

The only problem is that the tracking companies then have to place more trust in the first party that they're giving them real data.

But they're doing it, actually, see confection.io for example

anothernewdude 7 hours ago [-]
uMatrix blocks those by default. Blocking third-party cookies very rarely breaks anything. I can only think of one instance in the past five years, and that wasn't really a third-party cookie, but one website using two different domains.
jeroenhd 5 hours ago [-]
You don't even need uMatrix for that. Every major browser has a toggle for it in the settings.
nurettin 9 hours ago [-]
Sounds like a diversion. Websites can use local storage and fingerprinting to do anything they want at this point.
Svoka 10 hours ago [-]
So, the web ad market is being monopolized by the platforms. Google and Facebook make overwhelming revenue from their own websites.

Now, down with the rest.

candiddevmike 10 hours ago [-]
Facebook pixel works just fine without third party cookies.
nolroz 11 hours ago [-]
Here we go again!