Browser fingerprinting in depth
Canvas, WebGL, fonts, audio, viewport geometry, and why hiding your IP does not standardize your browser.
The transport-layer modules in this track established that low-latency anonymity overlays leak through traffic shape and that mixnets pay latency to address it. This module is about the application-layer leakage that all transport-level anonymity must compose with: browser fingerprinting. No matter how good your transport, if your browser is uniquely identifiable from JavaScript-readable surfaces, the destination service can recognize you across sessions, link your "anonymous" Tor visit to your identifiable home-network visit, and undermine the anonymity the transport bought you.
Browser fingerprinting is the process of identifying a browser instance — and by extension a user — from the implementation details and environment characteristics that scripts running on a webpage can read. It works without cookies, without login, without tracking pixels in the conventional sense. It works by combining many small implementation differences (some of which are unavoidable, some of which exist only because browsers haven't bothered to hide them) into a stable identifier that survives cookie deletion, IP rotation, and most casual privacy measures.
This module is the deep architectural treatment. We'll walk through what a fingerprint actually is (statistical identifier, not deterministic ID), enumerate the major feature surfaces that leak entropy (User-Agent, viewport, fonts, canvas, WebGL, audio, hardware-exposure APIs), explain why canvas and WebGL are particularly powerful, walk through Tor Browser's resist-fingerprinting architecture and Mullvad Browser's "Tor's hardening without Tor" experiment, and end with the paradox that user customization — even privacy-motivated customization — can shrink the anonymity set rather than enlarge it. The day-to-day hardening advice (which extensions to use, which settings to flip) lives in browser-fingerprint-hardening; this module is about why those choices matter at the architectural level.
Prerequisites
- http-evolution-1-1-to-3 — for understanding what HTTP request headers look like and why their shape matters.
- tls-1-3-handshake-byte-by-byte — TLS handshakes themselves are fingerprintable; see ja3-ja4-tls-fingerprinting for the TLS-layer extension of this module.
- threat-models-for-network-anonymity — for the adversary-first thinking this module's evaluations require.
- traffic-analysis-fundamentals — for context on how application-layer leakage composes with transport-layer leakage.
Learning objectives
- Explain why browser fingerprinting persists even when cookies are cleared and the client is behind a VPN or Tor.
- Distinguish the high-entropy feature surfaces — User-Agent, viewport, fonts, canvas, WebGL, audio, hardware-exposure APIs, locale, timezone — and quantify roughly which leak how much identifying information.
- Compare the defensive philosophies of Tor Browser and Mullvad Browser as "hide in the crowd" systems, and explain why this approach differs from "block all tracking."
- Evaluate why anti-fingerprinting is a population-level standardization problem rather than a per-user gadget-permission checklist.
What a browser fingerprint actually is
A browser fingerprint is a statistical identifier built from many small observable properties of a browser instance. Each property contributes some entropy (information content); combined, they produce an identifier that's unique enough to recognize the same browser across visits.
Critically, a fingerprint isn't a single deterministic ID like a cookie. It's a vector of features. A fingerprinting service computes the vector for an incoming visitor, hashes it (or compares it more carefully), and asks "have I seen this vector before?" If yes, it's the same browser. If no, it's a new browser (or the existing browser has changed in some observable way).
The math: if you measure 10 features, each with 8 possible values, you have 8^10 ≈ 1.07 billion possible vectors. With a population of 1 million users, most users have a unique vector; the chance that any given user shares a vector with someone else is small (on the order of 0.1%). With more features and more values per feature, uniqueness becomes overwhelming. Real fingerprinting services measure dozens to hundreds of features, producing identifiers that can distinguish billions of unique browsers.
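A back-of-the-envelope version of this math, assuming independent and uniformly distributed features (a simplification; real feature values are correlated and heavily skewed, so real-world entropy is lower):

```javascript
// Rough uniqueness math for a fingerprint vector.
// Assumption: independent, uniform features -- real-world entropy is lower.
function fingerprintBits(features, valuesPerFeature) {
  return features * Math.log2(valuesPerFeature); // bits of identifying information
}

// Expected number of users in a population sharing any given vector.
function expectedCrowdSize(population, bits) {
  return population / 2 ** bits;
}

const bits = fingerprintBits(10, 8);              // 10 features x 3 bits = 30 bits
const crowd = expectedCrowdSize(1_000_000, bits); // far below 1: vectors are almost all unique
console.log(bits, crowd.toFixed(5));              // 30 0.00093
```

A crowd size below 1 means the expected number of users per vector is fractional: almost everyone's vector is theirs alone.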
Fingerprinting is passive in the sense that it doesn't require consent or cookies; the browser exposes the features through normal Web APIs. It's active in the sense that the fingerprinting script chooses what to measure and how to combine the measurements. From the user's perspective, fingerprinting is invisible — no cookies are set, no obvious tracking pixels are loaded, no settings need to be adjusted by the tracking party.
What makes fingerprinting particularly stubborn against anonymity overlays:
- Network privacy doesn't help. A VPN or Tor changes what IP the destination sees but doesn't change what the browser tells the destination about itself. Same browser, different IPs — fingerprint matches; sessions get linked.
- Cookie deletion doesn't help. Fingerprinting is cookie-less by design.
- Incognito mode doesn't help. Incognito mode prevents persistent storage but doesn't change fingerprintable features.
- User-Agent spoofing alone doesn't help. UA is one feature among many; the rest of the fingerprint usually exposes the lie.
The defense is structural: make many users look identical at the fingerprint level, so the identifier loses uniqueness. This is the "hide in the crowd" model that drives Tor Browser and Mullvad Browser.
The feature surfaces that leak entropy
What does a fingerprinting script actually measure? The major surfaces, with rough entropy contributions:
User-Agent string — the HTTP User-Agent header. Identifies browser name, version, OS, sometimes device type. Modern browsers are converging on a "reduced UA" that exposes less detail (Chrome's User-Agent Reduction project), but legacy formats remain widely deployed. Entropy: 5-15 bits depending on browser+OS uniqueness in the population.
Accept headers — Accept, Accept-Language, Accept-Encoding. The order of values, the specific languages preferred, the encoding methods all leak. Entropy: 2-10 bits.
Viewport size and screen dimensions — window.innerWidth, window.innerHeight, screen.width, screen.height, screen.colorDepth, screen.pixelDepth. The user's window is usually unique to them — exact pixels of width and height are rarely identical between users. Entropy: 10-15 bits if window size is exposed precisely.
Timezone — Intl.DateTimeFormat().resolvedOptions().timeZone. The user's timezone, exposed via JavaScript. Entropy: 3-6 bits (many users in popular timezones, but the long tail is identifying).
Locale and language list — navigator.languages. Ordered list of preferred languages. Entropy: 3-8 bits.
Installed fonts — fonts available on the system, detectable through CSS measurement tricks (offsetWidth of test text in a fallback font reveals which fonts are actually installed). Entropy: 10-20+ bits in older browsers; reduced in modern privacy-aware browsers via font allowlists.
Canvas fingerprint — see next section; one of the highest-entropy surfaces.
WebGL fingerprint — see next section; complementary to canvas.
Audio fingerprint — minor variations in audio-API output (AudioContext rendering of a known input) reveal the audio stack and hardware. Entropy: 5-15 bits.
Plugins and MIME types — navigator.plugins, navigator.mimeTypes. Less informative on modern browsers (Chromium now reports a fixed, hardcoded PDF-viewer list rather than the real plugin set), but legacy reporting still leaks.
Hardware concurrency and device memory — navigator.hardwareConcurrency (CPU cores), navigator.deviceMemory (RAM, rounded to 0.25/0.5/1/2/4/8). Entropy: 3-5 bits each.
Browser-specific behavior — feature detection via JavaScript reveals which browser engine and version is running. Different browsers handle edge cases differently; testing a hundred edge cases produces a fingerprint of the engine. Entropy: 5-15 bits depending on test set.
Permissions API state — whether camera, microphone, geolocation, notifications are granted, denied, or default. Entropy: 5-10 bits.
Battery API (now deprecated) — exposed battery level and charging state, which were uniquely combinable for short-term tracking. Firefox and Safari removed it; Chromium still exposes navigator.getBattery().
Network info — navigator.connection exposes effective connection type (4g, 3g, slow-2g) and round-trip estimates. Entropy: 2-5 bits.
WebRTC IP — the user's actual local IP, accessible via WebRTC even when behind a VPN, unless WebRTC is disabled or the browser hides local IPs. See webrtc-ip-leak-fix. This is one of the most identifying features that survives most privacy measures.
Touch support and pointer details — navigator.maxTouchPoints, pointer event characteristics. Distinguishes mobile from desktop and reveals input device specifics.
The total entropy: combining 20-30 features as described above easily produces 50+ bits of identifying information, more than enough to single out individuals in any plausible population. An attacker who measures 50+ such features has a near-unique identifier for each browser instance.
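In practice, entropy is estimated per observed value, Panopticlick-style: a value seen in fraction p of the population contributes -log2(p) bits of surprisal, and contributions add under an independence assumption. The frequencies below are illustrative placeholders, not measured data:

```javascript
// Surprisal-based entropy estimate for one visitor's feature values.
// Frequencies are made-up stand-ins for how common each observed value is.
const observedFrequency = {
  userAgent: 0.002,   // this exact UA string seen in 0.2% of the population
  timezone: 0.11,     // common timezone
  languages: 0.18,    // common language list
  canvasHash: 0.0001, // rare canvas output
};

// Each value contributes -log2(p) bits; rare values contribute more.
const totalBits = Object.values(observedFrequency)
  .reduce((sum, p) => sum - Math.log2(p), 0);
console.log(totalBits.toFixed(1)); // ~27.9 bits from just four features
```

Note how one rare feature (the canvas hash, at 0.01%) contributes more bits than the other three combined.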
Canvas, WebGL, and audio as concrete examples
Three feature surfaces deserve special attention because they leak more entropy than most casual readers expect.
Canvas fingerprinting. The HTML5 Canvas API lets pages draw images programmatically. The "Pixel Perfect" paper (Mowery and Shacham, 2012) demonstrated that the same drawing instructions produce subtly different pixel output on different systems — different fonts (anti-aliasing differs by font, OS, and rendering hint settings), different graphics drivers, different sub-pixel rendering, different OS-level color profiles all combine to produce per-system pixel-level differences in canvas output.
The attack is simple: draw a string of text and a complex shape on a canvas, then read back the pixel data via canvas.toDataURL() or canvas.getImageData(). The resulting bytes are an identifier of the system that produced them. Two browsers on the same OS with the same fonts and the same graphics card will produce nearly-identical canvas output; two systems with any difference (different OS, different fonts installed, different driver version) produce different output.
The canvas attack works because every layer of the rendering stack contributes variation:
- The font rasterizer (FreeType on Linux, Core Text on macOS, GDI/DirectWrite on Windows).
- The font itself (Apple's San Francisco, Microsoft's Segoe, Linux's Noto).
- The hinting settings (full hinting, slight hinting, none).
- Sub-pixel rendering (RGB, BGR, none).
- Anti-aliasing strategy.
- The graphics card and driver.
- The compositor (Wayland vs. X11, macOS Metal vs. older).
The fingerprint is the cumulative output of all these layers. To make canvas not be a fingerprint, the browser would have to standardize the rendering stack — an enormous undertaking that would also break performance optimizations and platform-specific features.
Canvas can leak 10-20 bits of entropy in normal-population browsers; potentially more for systems with unusual hardware or fonts.
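Once the pixel bytes are read back, turning them into a compact identifier is plain hashing. A sketch using FNV-1a over simulated pixel data — in a browser the bytes would come from getImageData; the arrays here are stand-ins for two systems whose rendering differs by a single sub-pixel value:

```javascript
// Hash read-back pixel bytes into a short identifier, as a fingerprinting
// script would. FNV-1a over a byte array; 32-bit arithmetic via Math.imul.
function fnv1a(bytes) {
  let h = 0x811c9dc5; // FNV offset basis
  for (const b of bytes) {
    h = (h ^ b) >>> 0;
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime
  }
  return h.toString(16).padStart(8, "0");
}

// Simulated pixel data: the two systems differ in one anti-aliased
// sub-pixel value, which is enough to produce a different identifier.
const systemA = Uint8Array.from([255, 0, 102, 153, 255, 10, 64, 200]);
const systemB = Uint8Array.from([255, 0, 102, 154, 255, 10, 64, 200]);
console.log(fnv1a(systemA) === fnv1a(systemB)); // false
```

The hash is stable across visits on the same system (same stack, same bytes) but diverges across systems, which is exactly the property a tracker wants.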
WebGL fingerprinting. Similar to canvas but for the GPU-accelerated 3D-rendering path. The WebGL API exposes:
- The graphics card vendor and renderer (UNMASKED_VENDOR_WEBGL, UNMASKED_RENDERER_WEBGL).
- Shading-language precision details.
- Maximum texture sizes, vertex attribute counts, etc.
- The pixel-level output of rendering a known scene.
The combination is highly identifying. The vendor and renderer strings alone contain substantial entropy ("ANGLE (Intel, Intel(R) UHD Graphics 770 Direct3D11 vs_5_0 ps_5_0)" is much more identifying than a User-Agent), and adding rendered-pixel comparison amplifies it.
WebGL can leak 15-25+ bits of entropy in normal-population browsers; modern browsers have started masking the renderer string (returning a generic placeholder) but not all do.
Audio fingerprinting. The Web Audio API includes an AudioContext for processing audio, and an OfflineAudioContext for processing without real-time playback. The fingerprinting attack: run a known signal through an audio-processing pipeline (oscillator, filter, etc.) and compare the output to expected values. Floating-point arithmetic, audio-stack quirks, and DSP implementation differences produce small variations.
Audio fingerprinting is lower-entropy than canvas or WebGL — typically 5-10 bits — but it's a useful additional discriminator when combined with the others.
The reason these three surfaces are so powerful: they expose hardware-and-OS-level details that browsers can't easily hide without breaking the legitimate uses of the underlying APIs. Canvas is legitimately useful for image processing, WebGL for 3D graphics, audio for music apps. Removing the fingerprint vector while preserving the API utility is hard.
Browser fingerprinting is not just one hash
A naive fingerprinting setup hashes the feature vector and compares hashes. Real fingerprinting services do better:
Stable vs. unstable features. Some features change frequently (window size when resized, battery level over time, network conditions). Smart fingerprinting services treat these as soft features — useful for short-term linkage but ignored for cross-session identity. Stable features (User-Agent, hardware concurrency, canvas output) are the identity backbone.
Linkability across visits. Even when a user "changes" their fingerprint (browser update, new font installed, different timezone), the change is usually small. A fingerprint that matches 95% of a previously-seen vector is probably the same browser with one update. Trackers maintain probabilistic models that link fingerprints across changes.
Probabilistic matching. Rather than exact-match hashing, sophisticated trackers compute fingerprint similarity. Two vectors that differ in 1 of 30 features are probably the same browser; two vectors that differ in 15 features are probably different browsers.
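A minimal sketch of that similarity logic — the feature names and the 0.7 linkage threshold are illustrative, not from any particular tracker:

```javascript
// Fraction of features on which two fingerprint vectors agree.
function similarity(a, b) {
  const keys = Object.keys(a);
  const matches = keys.filter((k) => a[k] === b[k]).length;
  return matches / keys.length;
}

const lastSeen = { ua: "Firefox/115", cores: 8, canvas: "a91f", tz: "UTC", fonts: "h77" };
const current  = { ua: "Firefox/116", cores: 8, canvas: "a91f", tz: "UTC", fonts: "h77" };

const score = similarity(lastSeen, current); // 0.8: one feature changed (browser update)
console.log(score >= 0.7 ? "link: probably the same browser" : "treat as new browser");
```

A real matcher would weight stable features (canvas, hardware) more heavily than volatile ones, but the threshold idea is the same.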
Behavior over time. Time-correlated patterns reveal usage habits that supplement the fingerprint. A user who always visits the site at 9am Pacific is identifiable by that pattern even if their fingerprint changes.
Server-side model updates. Tracker databases learn over time. New browser versions get added; old browsers get retired; known-bot patterns get filtered; regional differences get accounted for. The fingerprint is computed against a current model, not a static one.
The result: fingerprinting is a moving target on both ends. Trackers update their models; browsers add resist-fingerprinting features; users adopt new tools that change their fingerprint. The arms race continues.
Why VPNs do not solve this problem
A common confusion: "I'm behind a VPN, so I'm anonymous." VPN provides network-location privacy — the destination service sees a VPN exit IP rather than your real IP. It does not provide application-layer privacy:
- The browser still tells the destination its User-Agent.
- The browser still has the same fingerprint surfaces.
- The browser still has the same fonts, graphics card, audio stack.
- The browser still leaks the same canvas, WebGL, and audio outputs.
From the destination's perspective: "User U is visiting from VPN-IP-X. User U's browser fingerprint matches Browser-Fingerprint-F. I've seen Browser-Fingerprint-F before from non-VPN-IP-Y. The same browser is using a VPN today and not yesterday."
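That destination-side reasoning amounts to indexing visits by fingerprint rather than by IP; a sketch (the IPs and fingerprint label are illustrative):

```javascript
// Index visits by fingerprint; IP changes then link sessions
// together instead of separating them.
const visitsByFingerprint = new Map(); // fingerprint -> set of source IPs

function recordVisit(fingerprint, ip) {
  if (!visitsByFingerprint.has(fingerprint)) visitsByFingerprint.set(fingerprint, new Set());
  visitsByFingerprint.get(fingerprint).add(ip);
}

recordVisit("fp-F", "203.0.113.7");  // yesterday: home ISP address
recordVisit("fp-F", "198.51.100.9"); // today: VPN exit address
console.log(visitsByFingerprint.get("fp-F").size); // 2 network identities, one browser
```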
The VPN bought you network privacy from the local ISP, the network in between, and observers who can only see IP-level information. The VPN did not buy you privacy from the destination service or any service that does fingerprinting. To get application-layer privacy, you need application-layer defenses — which is what Tor Browser and Mullvad Browser provide and what generic VPNs cannot.
This isn't a flaw in VPNs; it's a scope difference. VPNs solve a network-layer problem. Browser fingerprinting is an application-layer problem. The right tool for each is different. Combining a VPN with a fingerprint-resistant browser addresses both layers; using only one leaves the other unaddressed.
Tor Browser's design philosophy
Tor Browser's approach is the reference implementation of "hide in the crowd." The core principle: every Tor Browser instance should look identical to every other Tor Browser instance, so that fingerprinting cannot distinguish individuals within the population.
The defenses, all documented in the Tor Browser design specification:
User-Agent uniformity. All Tor Browser instances of the same version report the same User-Agent string, regardless of the underlying OS. The string typically claims to be a generic Firefox-on-Windows-10 — a deliberately popular configuration to maximize the crowd.
Standardized window size. When you open Tor Browser, the window starts at a fixed size. If you resize, Tor Browser lies about the actual window size to scripts (rounding to a fixed grid) and adds letterboxing — gray bars around the content area — to prevent unique window dimensions from leaking. The set of reported sizes is small and discrete; many users share each size.
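A sketch of that grid rounding, using a 100-pixel step for illustration (actual granularity varies across Tor Browser versions):

```javascript
// Report window dimensions snapped down to a coarse grid so that many
// users share each reported size. Step size is an illustrative assumption.
function reportedSize(actualWidth, actualHeight, step = 100) {
  return {
    width: Math.max(step, Math.floor(actualWidth / step) * step),
    height: Math.max(step, Math.floor(actualHeight / step) * step),
  };
}

console.log(reportedSize(1379, 742)); // { width: 1300, height: 700 }
```

The letterboxing bars cover the difference between the actual window and the reported content area, so the lie is visually consistent.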
Limited fonts. Tor Browser ships with a fixed set of fonts and bundles them. The browser refuses to use system-installed fonts, removing the "which fonts are installed" surface entirely. All Tor Browser users have the same font set.
Canvas and audio gating. Sites that try to read canvas pixels or generate audio fingerprints get a permission prompt. The user can grant the site canvas access (for legitimate use cases like graphic apps), or refuse. By default, fingerprinting attempts produce empty or randomized output.
WebGL restrictions. WebGL is disabled by default for non-allowlisted sites. The sites that get WebGL get the standard Tor-Browser-bundled GPU info, not the user's actual GPU.
Locale and timezone uniformity. Tor Browser reports a single locale (en-US by default; configurable) and a fixed timezone (UTC). Users with different actual locales and timezones all look the same on the wire.
No hardware concurrency / device memory. Those APIs return fixed values to scripts.
No WebRTC local IP exposure. WebRTC is configured to not reveal local network IPs.
Disabled or restricted APIs. Battery API, network information API, sensor APIs are restricted or disabled to prevent fingerprinting through those vectors.
JavaScript is restricted by default. Tor Browser ships with NoScript and a default security level that disables JavaScript on sites that don't need it. This eliminates many fingerprinting opportunities at the cost of breaking sites that depend on JavaScript.
Letterboxing. When the user resizes the window, instead of reporting the actual size, the rendered area is snapped down to a coarse grid (multiples of 100 pixels per dimension) with gray bars filling the remainder. The window dimensions reported to scripts are therefore coarse-grained.
The total effect: a Tor Browser user's fingerprint is largely standardized to "the Tor Browser fingerprint." Within the population of Tor Browser users (a few million people globally), individual fingerprints are not unique; they fall into a small number of equivalence classes.
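The equivalence-class idea can be made concrete: standardized users collapse into shared buckets, while a customized browser sits in a bucket of one. The vectors below are illustrative:

```javascript
// Group users by their full fingerprint vector; each distinct vector is
// one equivalence class. Values are simplified stand-ins.
const users = [
  { ua: "TB-13", size: "1000x600", fonts: "bundled" }, // Tor Browser users
  { ua: "TB-13", size: "1000x600", fonts: "bundled" },
  { ua: "TB-13", size: "1200x700", fonts: "bundled" },
  { ua: "Chrome/120 Win", size: "1379x742", fonts: "hash:9f2e" }, // stock Chrome user
];
const classes = new Set(users.map((u) => JSON.stringify(u)));
console.log(classes.size); // 3 classes for 4 users; two Tor Browser users share a bucket
```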
The cost: sites break. Many sites assume browser features that Tor Browser disables; canvas-dependent apps don't work; WebGL games don't run; window-size-dependent layouts may look strange. Tor Browser users accept this cost in exchange for the anonymity. For users whose threat model justifies the breakage, it's worth it; for users browsing routinely, it's friction that may push them back to a less-private browser.
Mullvad Browser as Tor Browser's crowd-without-Tor experiment
Mullvad Browser was launched in 2023 as a collaboration between Mullvad VPN and the Tor Project. The idea: take Tor Browser's anti-fingerprinting hardening and ship it without the Tor network underneath. Users get the fingerprint-resistance benefits over their normal connection (or over Mullvad VPN if they're a Mullvad user).
The motivation: Tor's transport adds latency and breaks some sites; many users want the application-layer anti-fingerprinting without the transport-layer changes. Mullvad Browser fills that niche.
The architecture is largely the same as Tor Browser's: same User-Agent uniformity, same letterboxing, same canvas/WebGL gating, same font restrictions, same security-level system with the same defaults. The differences:
- No Tor circuit; traffic goes through whatever the OS routing says.
- Slightly different default search engine (DuckDuckGo without onion fallback).
- Some Tor-specific UI removed.
The interesting question: does Mullvad Browser actually achieve fingerprint-resistance comparable to Tor Browser? In principle yes — the same browser hardening is in place. In practice, the population matters: Tor Browser has millions of users globally; Mullvad Browser has fewer. The "crowd" you're hiding in is smaller, which means the equivalence classes (groups of users who all look the same) are smaller, which means individual identifiability within the smaller population is higher.
The crowd-population issue is the central tradeoff. Mullvad Browser's anti-fingerprinting is identical to Tor Browser's at the per-instance level; the anonymity-set size depends on the user population. As Mullvad Browser adoption grows, the crowd grows; as it stays small, the crowd is the thing limiting anonymity rather than the technical hardening.
The other consideration: Mullvad Browser doesn't address transport-level surveillance. A Mullvad Browser user behind their normal ISP is still observable by their ISP, by destinations that see their real IP, by anyone who can correlate timing or volume. Mullvad Browser is for users who want application-layer privacy more than transport-layer privacy; users who want both should still use Tor Browser over Tor.
The paradox of customization
Privacy-conscious users often install browser extensions, tweak settings, customize fonts, install ad-blockers, disable JavaScript, adjust permissions. Each individual choice may seem privacy-positive. The aggregate effect can be the opposite.
Reasoning: anti-fingerprinting works through population uniformity. If you customize your browser, you may move yourself out of the standard population's equivalence class into a smaller, more-identifiable class.
Examples:
Installing a rare extension. If 1% of users have your specific extension installed, you're in a 1% group. If 0.01% have your specific combination of three extensions, you're in a 0.01% group. The combination of customizations is what shrinks the anonymity set.
Disabling specific features. If 5% of users disable canvas, you're in a 5% group. If you disable canvas AND change User-Agent AND install certain extensions, the combinations multiply.
Using a non-default font configuration. Installing rare fonts (or disabling common ones) makes your font fingerprint distinctive even if individual font choices seem unremarkable.
Custom CSS. User stylesheets that change rendering produce different canvas outputs.
Tweaking browser preferences. Changes to the browser preferences file (about:config in Firefox, chrome://flags in Chrome) can be exposed through behavior testing.
Privacy-extension fingerprints. Some privacy extensions are themselves fingerprintable — they intercept browser APIs in specific ways that scripts can detect.
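The way these fractions compound is simple multiplication, assuming the customizations are independent (the percentages below are illustrative):

```javascript
// Each customization keeps only the fraction of the population that
// shares it; the survivors are your anonymity set.
function anonymitySetSize(population, fractions) {
  return fractions.reduce((remaining, f) => remaining * f, population);
}

// 10M users -> 5% disable canvas -> 1% spoof UA -> 0.1% run a rare extension.
const crowd = anonymitySetSize(10_000_000, [0.05, 0.01, 0.001]);
console.log(Math.round(crowd)); // 5 -- a near-unique combination
```

Three individually defensible choices leave a crowd of about five people; a fourth choice would likely make the combination unique.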
The principle: the only way to be in a large anonymity set is to be in the modal configuration. Tor Browser ships a single recommended configuration precisely so all users converge on the same fingerprint. Diverging from the recommended configuration — even toward "more private" choices — usually shrinks your anonymity set.
This produces a counterintuitive result for privacy-conscious users: the right answer is usually "use Tor Browser as configured" or "use Mullvad Browser as configured" rather than "use your favorite browser plus a stack of privacy extensions." The "favorite browser plus extensions" path is highly customized and therefore highly identifiable; the "stock anti-fingerprinting browser" path is intentionally generic.
The right hardening advice for daily use is in browser-fingerprint-hardening — but the architectural rule is: customization is identifying, even when the customization is privacy-motivated.
Measurement, breakage, and the real cost of resistance
Anti-fingerprinting defenses cost site compatibility. Websites that depend on the disabled features break. Canvas-based image editors don't work. WebGL-based games don't render. Sites that use precise window dimensions for layout look wrong. Sites that require JavaScript at all may not function under default Tor Browser.
This is the deep operational tension: privacy-resistant browsers must standardize to provide their privacy property, but standardization breaks websites that legitimately need the features. There's no clean resolution. The compromise:
- Permission prompts for features that some sites legitimately need (canvas access, audio context, microphone, etc.). Default-deny; user can allow specific sites.
- Per-site exceptions. Some browsers let users escalate certain trust to specific sites — granting WebGL only on game.example.com, etc.
- Multiple security levels. Tor Browser has Standard, Safer, and Safest levels; users choose how aggressive the defenses are. The setting applies browser-wide rather than per-site.
- Fallback rendering. Canvas without specific fonts may render text in a generic substitute, breaking pixel-perfect designs but not breaking functionality.
The accessibility tension: assistive technology often depends on browser features that fingerprinting targets. Screen readers may need access to layout that fingerprinting code can also read. Custom-color-scheme users may have unique CSS state. The defense must consider that some users have legitimate reasons to be different from the modal user.
For daily browsing, the cost of running Tor Browser or Mullvad Browser exclusively is real. Many users adopt a multi-browser strategy: a privacy-resistant browser for sensitive activity (research, dissident communication, medical lookups), a normal browser for trivia (food delivery, streaming services). The compartmentalization keeps the privacy benefits where they matter without making everyday browsing miserable.
The honest engineering practice: set the threat model first (which sites are sensitive?), then choose the browser per-context. "Always use Tor Browser for everything" is a strict policy that delivers strong privacy at a real usability cost; "always use Chrome with Adblock" delivers a cosmetic privacy improvement that doesn't address fingerprinting at all. The middle path requires per-context decisions.
Hands-on exercise
Inspect a few exposed surfaces in the browser.
Tools: any browser with developer tools open. Runtime: 15 minutes.
Open a fresh browser window, open the developer console (F12 or Cmd+Option+I), and paste:
// Read a small set of fingerprintable properties.
const fingerprint = {
userAgent: navigator.userAgent,
language: navigator.language,
languages: navigator.languages,
platform: navigator.platform,
hardwareConcurrency: navigator.hardwareConcurrency,
deviceMemory: navigator.deviceMemory,
timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
screen: { width: screen.width, height: screen.height,
colorDepth: screen.colorDepth, pixelRatio: window.devicePixelRatio },
viewport: { width: window.innerWidth, height: window.innerHeight },
};
// Compute a minimal canvas fingerprint.
const canvas = document.createElement("canvas");
canvas.width = 280; canvas.height = 60;
const ctx = canvas.getContext("2d");
ctx.font = '18px "Arial"';
ctx.fillStyle = "#069";
ctx.fillText("Hello, fingerprint!", 4, 24);
ctx.font = '20px "Times New Roman"';
ctx.fillStyle = "rgba(102, 204, 0, 0.7)";
ctx.fillText("Browser ID", 4, 50);
fingerprint.canvasHash = canvas.toDataURL().slice(0, 64);
console.log(JSON.stringify(fingerprint, null, 2));
Read the output. For each property, ask:
- Is this property unique to me, or common across many users?
- Does my browser version show a reduced UA, or a full one?
- Does my screen size match common defaults (1920x1080, 1366x768) or is it unusual?
- Is my canvas output different from what a colleague's browser would produce?
Stretch: open the same page in Tor Browser (download from torproject.org if you don't have it). Run the same script. Compare the outputs. Tor Browser should report the same User-Agent regardless of your real OS, no detailed hardwareConcurrency, fixed window size on grid points, and a canvas output that's either standardized or refused.
Evaluate a browser configuration for crowd compatibility.
Tools: notes. Runtime: 10 minutes.
Consider this list of customizations a privacy-conscious user might apply:
- Install uBlock Origin
- Install a tracker-blocking extension
- Disable WebRTC via a preference flip
- Install three rare fonts for typography preference
- Set the browser to use a custom User-Agent
- Install a tab-manager extension
- Custom dark-mode CSS via a stylesheet extension
- Disable JavaScript on most sites via NoScript
- Configure DNS over HTTPS
For each, ask: does this increase or decrease anonymity-set size? Some answers:
- uBlock Origin: very common; minimal customization fingerprint.
- Tracker-blocking extension: depends on the extension; some have fingerprintable behavior.
- Disable WebRTC: small fraction of users; reduces a leak but adds a customization fingerprint.
- Three rare fonts: increases fingerprintability substantially.
- Custom User-Agent: small fraction of users; the lie is detectable through behavior testing.
- Tab manager: depends; some are detectable through DOM inspection.
- Custom dark-mode CSS: detectable through canvas reads.
- NoScript with custom rules: detectable through network behavior.
- DoH: doesn't affect browser fingerprinting; affects DNS-layer privacy.
The answers depend on the population. uBlock Origin is so common that its fingerprint is the modal one. A custom dark-mode CSS is rare and identifying. The right intuition: privacy customizations should be popular enough to maintain the crowd, not unique enough to mark you out.
Common misconceptions and traps
"A VPN stops browser fingerprinting." No. A VPN changes network location, not browser behavior. The destination still sees the same User-Agent, canvas output, fonts, and other application-layer features. VPN + non-fingerprint-resistant browser leaves application-layer identification intact.
"Blocking cookies is enough." Fingerprinting is cookie-less by design. Cookies are one tracking vector; fingerprinting is a different one that survives cookie deletion. A site that fingerprints can recognize you across sessions even if you wipe cookies between every visit.
"More privacy extensions always help." Privacy extensions add to the customization fingerprint. A user with a unique combination of extensions may be more identifiable than a user with the default browser configuration, even though each extension individually addresses a tracking vector. The right approach is to use a few popular extensions or to use a fingerprint-resistant browser, not to maximize the extension stack.
"Canvas blocking solves the whole problem." Canvas is one surface among 20+. Blocking canvas without addressing User-Agent, fonts, WebGL, audio, and the rest leaves the fingerprint largely intact. Real defenses standardize many surfaces simultaneously.
"Anti-fingerprinting is just a browser setting." It requires population-level uniformity. A single user setting a "resist fingerprinting" flag in Firefox produces a slightly-different-from-default fingerprint; that's still distinguishable. The flag works only if many users set it (and Firefox's default Resist Fingerprinting setting moves you toward the Tor Browser fingerprint, which is helpful as long as enough users use it). The defense is sociotechnical, not just per-user.
"My fingerprint changes when I update my browser, so it's not stable." Probabilistic linking handles small changes. A fingerprint that matches 95% of a previously-seen vector is treated as the same browser with one update. Trackers don't require exact matches.
"Fingerprinting is illegal under GDPR/CCPA." Some jurisdictions have laws that limit fingerprinting (especially for advertising), but enforcement is uneven and many fingerprinting practices continue regardless. Even where regulated, the technical measurements still happen; the legal restriction is on storing and using the result. Don't rely on legal compliance as your fingerprinting defense.
"I use private/incognito mode, so I'm safe." Private mode prevents persistent storage (cookies, history, cache). It does not change fingerprintable features. Private mode does not provide fingerprint resistance.
Wrapping up
Browser fingerprinting identifies browser instances through statistical combinations of small implementation details — User-Agent, viewport, fonts, canvas, WebGL, audio, hardware-exposure APIs, locale, timezone, and dozens of others. It works without cookies or login, persists across IP changes, and survives normal privacy measures like incognito mode and ad-blocking.
The defense is structural standardization: make many users look identical at the fingerprint level so the identifier loses uniqueness. Tor Browser implements this aggressively — uniform User-Agent, fixed window sizes, restricted fonts, canvas/WebGL gating, locale uniformity, JavaScript restrictions. The cost is site breakage; the benefit is that all Tor Browser users fall into a small number of equivalence classes rather than being individually identifiable.
Mullvad Browser is the same hardening without Tor underneath. It addresses application-layer fingerprinting for users who don't want the transport-level cost of Tor; the open question is whether the user population is large enough for the crowd to be meaningfully large.
The customization paradox: privacy-motivated customizations can shrink the anonymity set if they move the user out of the modal configuration. The right answer for most threat models is "use a fingerprint-resistant browser as configured" rather than "customize a normal browser with privacy extensions." Day-to-day hardening choices live in browser-fingerprint-hardening; the architectural principle is to converge on the crowd, not to diverge in the name of privacy.
The next module (os-and-tcpip-stack-fingerprinting — coming soon) extends this analysis below the browser to the OS and TCP/IP stack, where similar identification works on different features (TCP option ordering, OS-specific quirks, TLS JA3/JA4 fingerprints) and the defenses are even harder.
Further reading
- Browser Fingerprinting: An Introduction and the Challenges Ahead — The Tor Project, 2019 — concise systems-level explanation from the team most focused on resisting browser fingerprinting in practice.
- The Design and Implementation of the Tor Browser — detailed primary-source rationale for concrete defenses.
- Pixel Perfect: Fingerprinting Canvas in HTML5 — Mowery and Shacham, 2012 — the canonical canvas-fingerprinting paper.
- Browser fingerprinting — tracking behind the curtain — Mullvad — practitioner framing that complements the Tor literature.
- All together as one: This is how the Mullvad Browser works — Mullvad — useful for understanding crowd-based anti-fingerprinting outside the Tor network context.