How can an application use port 80/HTTP without conflicting with browsers?

There are two ports involved: a source port (on the browser's side) and a destination port (on the server's side). The browser asks the OS for an available source port (say it receives 33123) and then opens a socket connection to the destination port (usually 80 for HTTP or 443 for HTTPS).

When the web server receives the request, it sends back a response with 80 as the source port and 33123 as the destination port.
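
To make that concrete, here is a minimal Python sketch (the hostname example.com is only a placeholder; any reachable web server will do). It opens one connection to port 80 and prints the source port the OS handed out alongside the destination port:

    import socket

    # Minimal sketch: open one HTTP connection and look at the two ports involved.
    sock = socket.create_connection(("example.com", 80))

    local_ip, local_port = sock.getsockname()    # source port, chosen by the OS
    remote_ip, remote_port = sock.getpeername()  # destination port, 80 for HTTP

    print(f"source      {local_ip}:{local_port}")
    print(f"destination {remote_ip}:{remote_port}")

    sock.close()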

So if you have two browsers concurrently accessing stackoverflow.com, you'd have something like this:

Firefox (localhost:33123) <-----------> stackoverflow.com (69.59.196.211:80)
Chrome  (localhost:33124) <-----------> stackoverflow.com (69.59.196.211:80)
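
You can reproduce that picture yourself by opening two sockets to the same server from one process. This is only a sketch (any host serving on port 80 will do): both connections report destination port 80, but each one gets its own source port.

    import socket

    # Sketch of the diagram above: two simultaneous connections to the same
    # server. Both use destination port 80, but the OS gives each one its own
    # source port, so the replies can be told apart.
    HOST = "stackoverflow.com"   # any host serving on port 80 works here

    a = socket.create_connection((HOST, 80))
    b = socket.create_connection((HOST, 80))

    print("connection A:", a.getsockname(), "->", a.getpeername())
    print("connection B:", b.getsockname(), "->", b.getpeername())
    # The remote side is identical for A and B; only the local port differs.

    a.close()
    b.close()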

Outgoing HTTP requests don't originate from port 80. When an application opens a client socket, the OS assigns it an ephemeral port, effectively at random. This is the Source port.

Port 80 is for serving HTTP content (by the server, not the client). This is the Destination port.

Each browser uses a different Source port for its requests. That way, the return packets make it back to the correct application.
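
Seen from the server side, the same mechanism looks like the sketch below. It binds to port 8080 rather than 80 (ports below 1024 usually require elevated privileges), but the idea is identical: the address returned by accept() contains the client's ephemeral Source port, and that is where the response goes.

    import socket

    # Minimal server sketch; the Destination port is the one bound here.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 8080))
    srv.listen()

    conn, (client_ip, client_port) = srv.accept()
    # client_port is the ephemeral Source port the client's OS picked;
    # everything written to conn goes back to exactly that ip:port.
    print(f"reply path: 8080 -> {client_ip}:{client_port}")
    conn.close()
    srv.close()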


It is the 5-tuple of (IP protocol, local IP address, local port, remote IP address, remote port) that identifies a connection. Multiple browsers (or in fact a single browser loading multiple pages simultaneously) will each use destination port 80, but the local port (which is allocated by the OS) is distinct in each case. Therefore there is no conflict.
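
Just to make that tuple concrete, a small sketch (with example.com standing in for any web server): for a connected TCP socket, these five values are what distinguishes it from every other connection on the machine.

    import socket

    sock = socket.create_connection(("example.com", 80))

    five_tuple = (
        "TCP",                 # IP protocol
        *sock.getsockname(),   # local IP address, local port
        *sock.getpeername(),   # remote IP address, remote port
    )
    print(five_tuple)

    sock.close()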


Clients usually pick a source port between 1024 and 65535. How that port is chosen depends on the operating system: I believe Windows clients increment the value for each new connection, while Unix clients pick a random port number from that range.

Some services rely on a fixed client port, for example NTP (UDP port 123).
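
As a rough illustration of a fixed client port, the sketch below binds a UDP socket to an arbitrary local port (12345 here) before sending, instead of letting the OS choose; classic NTP clients do the same with port 123, which would need elevated privileges. The destination (example.com, port 9, the discard service) is just a placeholder.

    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 12345))             # pin the Source port ourselves
    sock.sendto(b"ping", ("example.com", 9))  # port 9 = discard service
    print("sending from", sock.getsockname())
    sock.close()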