Some websites bypass the hosts file in Safari
An answer on Stack Overflow to the (unfortunately closed) question
“Why does /etc/hosts not work anymore for some websites on Big Sur?”
identifies the cause: a new DNS feature, added to speed up access to
sites that support HTTPS, which can unfortunately bypass the hosts
file.
I quote the answer here:
macOS 11 added support for what is officially called “Service binding and parameter specification via the DNS (DNS SVCB and HTTPSSVC)”.
Now, when you visit a website, it’s not just the typical DNS `A`
host-to-IP-address record that’s consulted; a brand-new `HTTPS`
DNS record is checked too. It’s not just a name entry; it’s a brand-new record type (#65), to go along with the more familiar `A`, `CNAME`, and `MX`.

These new `HTTPS` DNS records can indicate that the site supports HTTPS, including protocol versions and IP addresses. That way, typing in a bare domain name gives the `https://` version of the site right away, maybe even on HTTP/2 or HTTP/3, skipping the old-fashioned HTTP redirect. There’s even a draft option for domain operators to tell your computer to bypass any local DNS settings and use a specific server for all future DNS queries involving their domain.

There are many pro-performance intentions here, and some pro-privacy ones too. But there is a fatal privacy and security flaw in both the specification and implementation: it removes the ability for users to override domain name lookups in `/etc/hosts`, even when faced with actively malicious domain name operators.

To see how this is working in action:
The version of `dig` that comes with macOS doesn’t directly support these new records, but you can see whether they exist with:

```
$ dig -t type65 www.politico.com
…
;; QUESTION SECTION:
;www.politico.com.              IN      TYPE65

;; ANSWER SECTION:
www.politico.com.       53806   IN      CNAME   www.politico.com.cdn.cloudflare.net.
www.politico.com.cdn.cloudflare.net. 300 IN TYPE65 \# 58 0001000001000302683200040008681210CA681211CA000600202606
        47000000000000000000681210CA2606470000000000000000006812
        11CA
…
```
I don’t know how to parse that, but Wireshark does if I packet-capture it:
```
Domain Name System (response)
    Queries
        www.politico.com.cdn.cloudflare.net: type HTTPS, class IN
    Answers
        www.politico.com.cdn.cloudflare.net: type HTTPS, class IN
            Name: www.politico.com.cdn.cloudflare.net
            Type: HTTPS (HTTPS Specific Service Endpoints) (65)
            Class: IN (0x0001)
            Time to live: 300 (5 minutes)
            Data length: 58
            SvcPriority: 1
            TargetName: <Root>
            SvcParams
                SvcParam: ALPN
                    SvcParamKey: ALPN (1)
                    SvcParamValue length: 3
                    ALPN length: 2
                    ALPN: h2
                SvcParam: IPv4 Hint
                    SvcParamKey: IPv4 Hint (4)
                    SvcParamValue length: 8
                    IP: 104.18.16.202
                    IP: 104.18.17.202
                SvcParam: IPv6 Hint
                    SvcParamKey: IPv6 Hint (6)
                    SvcParamValue length: 32
                    IP: 2606:4700::6812:10ca
                    IP: 2606:4700::6812:11ca
```
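As an aside, the `\# 58` blob that `dig` prints isn’t as opaque as it looks: it’s the SVCB wire format (a priority, a target name, then key/length/value parameters, as specified in RFC 9460). Here is my own short Python sketch of a decoder for it — not part of the original answer, and only handling the three parameters that appear in this record:

```python
import struct
import ipaddress

# The 58-byte RDATA exactly as dig printed it for
# www.politico.com.cdn.cloudflare.net, with whitespace removed.
RDATA_HEX = (
    "0001000001000302683200040008681210CA681211CA000600202606"
    "47000000000000000000681210CA2606470000000000000000006812"
    "11CA"
)

def parse_https_rdata(data: bytes) -> dict:
    """Decode SvcPriority, TargetName, and common SvcParams of an HTTPS RR."""
    priority = struct.unpack_from("!H", data, 0)[0]
    pos = 2
    # TargetName is a DNS wire-format name; a single 0x00 byte means the root.
    labels = []
    while data[pos] != 0:
        length = data[pos]
        labels.append(data[pos + 1 : pos + 1 + length].decode("ascii"))
        pos += 1 + length
    pos += 1  # skip the terminating zero byte
    params = {}
    while pos < len(data):
        key, vlen = struct.unpack_from("!HH", data, pos)
        value = data[pos + 4 : pos + 4 + vlen]
        pos += 4 + vlen
        if key == 1:  # ALPN: sequence of length-prefixed protocol IDs
            alpns, i = [], 0
            while i < len(value):
                n = value[i]
                alpns.append(value[i + 1 : i + 1 + n].decode("ascii"))
                i += 1 + n
            params["alpn"] = alpns
        elif key == 4:  # ipv4hint: concatenated 4-byte addresses
            params["ipv4hint"] = [
                str(ipaddress.IPv4Address(value[i : i + 4]))
                for i in range(0, len(value), 4)
            ]
        elif key == 6:  # ipv6hint: concatenated 16-byte addresses
            params["ipv6hint"] = [
                str(ipaddress.IPv6Address(value[i : i + 16]))
                for i in range(0, len(value), 16)
            ]
    return {"priority": priority,
            "target": ".".join(labels) or ".",
            "params": params}

result = parse_https_rdata(bytes.fromhex(RDATA_HEX))
print(result)
# priority 1, target ".", alpn ["h2"], plus the same IPv4/IPv6 hints
# that Wireshark shows above.
```

Running it recovers exactly the fields in the Wireshark dissection: the ALPN `h2` and the Cloudflare IPv4/IPv6 address hints that Safari can use instead of consulting `/etc/hosts`.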
So that’s what’s happening:

- Safari on Big Sur can load some websites you’ve blocked in `/etc/hosts`, because it gets their IP addresses from these new `HTTPS` records.
- It can only do that for some sites, because most domain name operators haven’t set this up yet. It looks like Cloudflare has done this for everyone on their platform; fortunately most domain name operators, including the advertising/tracking/malware giants, haven’t caught on to this yet.
For now, you can keep using `/etc/hosts` for domain names that you fully control. In the meantime, for other domains, you have some options:
- you could run a local DNS server or firewall on your home network that blocks these requests
- you could configure a local DNS resolver daemon on your Mac, and use it to block these requests
- you could switch to a Linux distribution where a configurable local resolver daemon is the default
- you could stop using Safari, although other apps using the default macOS networking stack may continue silently bypassing `/etc/hosts`
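The local-resolver option can be sketched with dnsmasq: recent versions (2.90 and later, if I’m reading the changelog right) added a `--filter-rr` option that strips answers for arbitrary record types, including type 65. This is an untested illustration, not a recommended setup, and the upstream server is just an example:

```shell
# Minimal dnsmasq.conf sketch: drop HTTPS (type 65) answers so clients
# fall back to plain A/AAAA lookups, which /etc/hosts can still override.
# Requires dnsmasq 2.90+ for filter-rr.
filter-rr=65            # strip all type-65 (HTTPS/SVCB) answers
no-resolv               # ignore /etc/resolv.conf
server=9.9.9.9          # forward everything else upstream (example server)
listen-address=127.0.0.1

# Then point macOS at the local daemon, e.g.:
#   sudo networksetup -setdnsservers Wi-Fi 127.0.0.1
```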
Chrome has run some trials for this but does not appear to have implemented it yet. Firefox has started implementing it but doesn’t seem to have gotten too far.