Why can't the browser send a gzipped request?

Solution 1:

The client and server have to agree on how to communicate; part of this is whether the communication can be compressed. HTTP was designed as a request/response model, and the original design almost certainly envisioned small requests and potentially large responses. Compression is not required to implement HTTP; there are both servers and clients that don't support it.

HTTP compression is implemented by the client saying it can accept compressed content (via the Accept-Encoding request header); if the server sees this in the request and supports compression, it can compress the response (and say so with the Content-Encoding header). To compress the request, the client would either need a "pre-request" that negotiated sending the request compressed, OR compression would have to be a required encoding for ALL requests.
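Here is a minimal sketch of that response-side negotiation using only Python's standard library; the URL is just a placeholder:

```python
import gzip
import urllib.request

req = urllib.request.Request(
    "https://example.com/",
    headers={"Accept-Encoding": "gzip"},  # client: "I can accept gzip"
)

with urllib.request.urlopen(req) as resp:
    body = resp.read()
    # The server compresses only if it chose to honour the header.
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)

print(len(body), "bytes after decoding")
```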

* UPDATE Feb '17 * It's been 8 years, but as @Phil_1984_ notes, a third possible solution would be for the client and server to negotiate compression support once and then use it for subsequent requests. In fact, things like HSTS work just this way, with the client caching the fact that the server expects to speak only TLS and ignoring any unencrypted links. HTTP was explicitly designed to be stateless, but we've moved beyond that at this point.
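As a purely illustrative sketch of that "negotiate once, remember it" idea (none of this is a standard): a client could optimistically try one gzipped request per origin and cache the outcome, falling back to an uncompressed body if the server rejects it. The cache, the retried status codes and the endpoint are all assumptions.

```python
import gzip
import urllib.error
import urllib.request

_gzip_ok: dict[str, bool] = {}   # origin -> last known answer, like a per-host HSTS entry

def post(origin: str, path: str, payload: bytes) -> bytes:
    use_gzip = _gzip_ok.get(origin, True)      # optimistically try gzip first
    headers = {"Content-Type": "application/octet-stream"}
    body = gzip.compress(payload) if use_gzip else payload
    if use_gzip:
        headers["Content-Encoding"] = "gzip"   # "this request body is gzipped"
    req = urllib.request.Request(origin + path, data=body, headers=headers, method="POST")
    try:
        with urllib.request.urlopen(req) as resp:
            _gzip_ok[origin] = use_gzip        # remember what worked for this origin
            return resp.read()
    except urllib.error.HTTPError as err:
        if use_gzip and err.code in (400, 415, 501):
            _gzip_ok[origin] = False           # server choked on gzip: fall back and remember
            return post(origin, path, payload)
        raise
```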

Solution 2:

A client can't know in advance that a server would understand a gzipped request, but the server can know that the client will accept a gzipped response, because the client says so up front in its Accept-Encoding request header.
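For illustration, this is what a blind gzipped request would look like; a server that doesn't understand it may answer with an error such as 415 Unsupported Media Type, or simply treat the compressed bytes as the real payload. The URL and endpoint are placeholders.

```python
import gzip
import urllib.error
import urllib.request

payload = gzip.compress(b"field=value")
req = urllib.request.Request(
    "https://example.com/upload",          # hypothetical endpoint
    data=payload,
    headers={
        "Content-Encoding": "gzip",        # "this request body is gzipped"
        "Content-Type": "application/x-www-form-urlencoded",
    },
    method="POST",
)
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
except urllib.error.HTTPError as err:
    print("server rejected it:", err.code)  # e.g. 415 if unsupported
```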

Solution 3:

It could, provided it could guarantee that the server would accept it. This might mean using an OPTIONS request.
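A rough sketch of such a probe, assuming the server advertises the request codings it accepts via an Accept-Encoding response header (RFC 7694 defines this, but very few servers send it, so treat it as an illustration rather than something to rely on):

```python
import urllib.request

def server_accepts_gzip_requests(origin: str) -> bool:
    # Probe with OPTIONS and look for an Accept-Encoding response header.
    req = urllib.request.Request(origin + "/", method="OPTIONS")
    try:
        with urllib.request.urlopen(req) as resp:
            return "gzip" in (resp.headers.get("Accept-Encoding") or "")
    except OSError:
        return False   # no answer, or the probe itself was rejected
```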

There are a lot of things that web browsers could do (for example, pipelining) that they don't do. Web browser developers consider the compatibility implications of a change.

In a heterogeneous environment, there are a lot of different web servers and configurations. Making a change to the way a client works could break some of them.

Perhaps only 1% of web servers would accept gzipped requests, and some of those might advertise that they do but not handle them correctly - so users would be unable to upload files to those sites.

Historically there have been a lot of broken client/server implementations - for a long time, gzipped responses were broken in major web browsers (thankfully those are now mostly gone).

So you'd end up with blacklists of user-agents or servers (or domain names) where those options were automatically turned off, which is nasty.