How come sites like Google/Facebook/etc. don't get DDoS'd even though they receive so many requests?

Something I don't understand:

Tens (or hundreds?) of thousands of people simultaneously try to connect to a site like facebook.com or google.com.

From what I understand, they must all connect to the same initial server, because DNS returns the same IP to all of them, so every request goes to the same destination.

So a single machine/router must handle all the initial requests, even if it plans to forward them to other machines.

How come that single device doesn't get overloaded when this happens?


Solution 1:

Your understanding that they all connect to the same server is wrong, although the details of how those results are achieved are complex. http://highscalability.com/ has reference material on how some of these scalability solutions are put into practice.

They have far more than just "one" server that clients connect to, even if the public IP address looks the same. Google, for example, makes heavy use of anycast addressing, so the same IP address is routed to whichever of its many locations is nearest to the client. On top of that, the DNS answer itself usually isn't one fixed address for everyone: different clients are often handed different addresses, even if any single query appears to return just one.
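
If you want to see part of this from the client side, here is a minimal sketch in Python (standard library only) that asks your local resolver for every address it will hand back for a hostname. The hostnames and the port are only illustrative, and the exact records you get back depend on your resolver, your location, and when you ask.

    import socket

    def resolve_all(hostname: str, port: int = 443) -> list[str]:
        """Return every distinct address the local resolver gives for hostname."""
        infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
        # Each entry is (family, type, proto, canonname, sockaddr);
        # sockaddr[0] is the IP address string (works for IPv4 and IPv6).
        return sorted({info[4][0] for info in infos})

    if __name__ == "__main__":
        # Illustrative hostnames; the answers are not fixed.
        for host in ("www.google.com", "www.facebook.com"):
            print(host, resolve_all(host))

Run it from two different networks (say, your home connection and a phone hotspot) and the address sets will often differ, which is part of how the load is spread around before a single packet ever reaches a server.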