
Identifying users when they have JS and cookies disabled

Name: Anonymous 2018-06-18 2:02

Some web users disable cookies and JS because of concerns like tracking or deanonymization. But how about this:

Tracking by the performance of cross-site requests.
The trick: 1x1 transparent PNGs, but served as big files with a lot of padding. You can easily make an image file bigger by opening it in a text editor and adding random, meaningless data after the official EOF (PNG decoders ignore anything past the IEND chunk). Then save it.

So you can end up with a pretty big file even though it's still a 1x1 transparent/invisible image, and nothing is noticeable to the person browsing the site.
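Rough sketch of the padding step (Python; the pixel.png filename and the 512 KB size are just examples):

import os

# Append random bytes after the real image data. PNG decoders stop at the IEND
# chunk, so the file still renders as a 1x1 transparent pixel, it just got heavier.
PAD_BYTES = 512 * 1024  # example size, ~512 KB of junk

with open("pixel.png", "rb") as f:
    original = f.read()

with open("pixel_padded.png", "wb") as f:
    f.write(original)
    f.write(os.urandom(PAD_BYTES))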

Now let's say you had 100+ of these padded image files, each hosted on a different server in a different location, but all served under the same domain name (because some people use browser add-ons to track or block cross-site requests). The server can also keep updating these images, so that there is a difference when the user hits refresh and the browser reloads them because its cached copies are outdated.
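The re-padding could be as dumb as a loop that rewrites the junk bytes now and then, so the file content (and therefore its ETag/Last-Modified) changes and cached copies go stale. Python sketch; the /srv/pixels path and the 10-minute interval are made up:

import glob, os, time

PAD_BYTES = 512 * 1024  # example padding size

def refresh_padding(path):
    with open(path, "rb") as f:
        data = f.read()
    end = data.index(b"IEND") + 8        # end of the real PNG (IEND type + CRC)
    with open(path, "wb") as f:
        f.write(data[:end])              # keep the actual image
        f.write(os.urandom(PAD_BYTES))   # fresh junk -> new ETag/Last-Modified

while True:
    for path in glob.glob("/srv/pixels/*.png"):   # hypothetical location
        refresh_padding(path)
    time.sleep(600)   # arbitrary interval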

With all these images, you can get the performance information from backend metrics shit: latency, speed, jitter, packet loss (depending on the transport protocol), and so on. Because the servers the images are hosted on will be in different parts of the world, you can be sure that most people will have very different results. Someone in the US will load the images faster if they're on US servers, with lower latency and less packet loss too. Someone who lives in China will load images from servers in China faster. Some client-side stuff also affects performance, but that would be the same across multiple visits.
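A bare-bones sketch of one image host logging per-request timing (Python stdlib; the port and filename are arbitrary, and the send time is only a rough proxy for throughput, which is part of why the big padding helps):

import json, time
from http.server import BaseHTTPRequestHandler, HTTPServer

PIXEL = open("pixel_padded.png", "rb").read()

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        start = time.monotonic()
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)
        self.wfile.flush()
        elapsed = time.monotonic() - start
        # One metrics line per request; correlation/aggregation happens elsewhere.
        print(json.dumps({"ts": time.time(), "path": self.path,
                          "client": self.client_address[0],
                          "send_seconds": elapsed}))

HTTPServer(("", 8080), PixelHandler).serve_forever()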

It's not just about location, it's about having a set of identifiers that, taken together, make that particular user look unique. It doesn't matter if they're using Tor or a VPN, or if they block cookies or JavaScript. You can still log the performance of these hidden blank images. The more images there are, and the more places they're spread across, the more accurate it gets, because the odds of someone having the same performance for 100+ of these things are really low. Sort of like cell tower triangulation, if you're familiar with that. I think some of the Snowden leak documents mentioned that shit. Stingrays too. But that's a little off-topic.
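In other words, each visit boils down to a vector of per-server timings, and you compare vectors with some distance metric. Toy sketch, all numbers invented:

import math

def distance(visit_a, visit_b):
    """visit_a/visit_b: dict mapping server id -> median load time in seconds."""
    shared = visit_a.keys() & visit_b.keys()
    if not shared:
        return float("inf")
    return math.sqrt(sum((visit_a[k] - visit_b[k]) ** 2 for k in shared) / len(shared))

visit_1 = {"us-east": 0.11, "eu-west": 0.34, "tokyo": 0.52}
visit_2 = {"us-east": 0.12, "eu-west": 0.33, "tokyo": 0.55}
print(distance(visit_1, visit_2))   # small distance -> probably the same user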

Of course, on any given load there could be a performance anomaly (something being way slower than usual), but that's why the server keeps changing the post-EOF padding, to make the user's browser reload it. So the measurement happens again and again. Then you can use Bayesian stats or some shit to come up with confidence intervals, so your tracking software would be like "83% confidence +-5% rDev that this is user 234234234234" or something. Or maybe some neural network/deep learning shit.
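The matching step could look something like this Gaussian toy model (uniform prior over known users, every number invented):

import math

def gaussian_logpdf(x, mean, std):
    std = max(std, 1e-3)   # floor so very stable links don't divide by zero
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

def posterior(observation, profiles):
    """observation: {server: seconds}; profiles: {user: {server: (mean, std)}}."""
    log_scores = {}
    for user, profile in profiles.items():
        log_scores[user] = sum(gaussian_logpdf(observation[s], *profile[s])
                               for s in observation if s in profile)
    top = max(log_scores.values())
    weights = {u: math.exp(v - top) for u, v in log_scores.items()}
    total = sum(weights.values())
    return {u: w / total for u, w in weights.items()}

profiles = {"user_234234234234": {"us-east": (0.11, 0.02), "tokyo": (0.53, 0.05)},
            "someone_else":      {"us-east": (0.40, 0.04), "tokyo": (0.20, 0.03)}}
print(posterior({"us-east": 0.12, "tokyo": 0.55}, profiles))
# heavily favors user_234234234234, something like {"user_234234234234": 0.99..., ...}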

The only way to mitigate this would be to block images entirely, or to randomize your network speed.

This method isn't perfect, but it assumes you don't have more traditional methods of tracking available, so it wouldn't be a first resort. Really though, you might want to just measure canvas rendering performance if JS is enabled.

Name: Anonymous 2018-06-21 4:13

>>17
moot didn't invent trolling
it was a thing even back in the days of Usenet and dial-up BBSes, though it was usually called "flaming"
