Testing network throughput on a server with Gigabit Ethernet

Aria2 is a command-line tool, similar to wget, that supports multiple simultaneous downloads over HTTP, BitTorrent, FTP, and so on.

aria2c -d /dev -o null --allow-overwrite=true --file-allocation=none -x 15 <url>

This downloads the file with 15 connections (-x 15) straight to /dev/null.

--allow-overwrite=true prevents aria2 from trying to rename /dev/null.

--file-allocation=none skips preallocating space before the download; allocation takes time, and I prefer the download to start immediately.
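
For a quick single-connection baseline to compare against, plain wget or curl pointed at /dev/null also works (a sketch; the URL is a placeholder):

wget -O /dev/null http://example.com/testfile
curl -o /dev/null http://example.com/testfile   # curl shows the average speed in its progress meter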


You will be limited to less than the speed of the slowest link in the path. You could have a 10-gigabit connection, but if your Internet connection is dial-up, you are going to be waiting. Even on a LAN that can support 1 Gbit/s end to end, you may see a bottleneck in the read speed of the source server or the write speed of the destination server.
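
To check whether the disks rather than the network are the limit, a rough dd test on each end helps; dd prints its throughput when it finishes (a sketch; the test file path is arbitrary, and the direct flags bypass the page cache):

# write speed on the destination:
dd if=/dev/zero of=/tmp/ddtest bs=1M count=1024 oflag=direct
# read speed on the source:
dd if=/tmp/ddtest of=/dev/null bs=1M iflag=direct
rm /tmp/ddtest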


There are many factors that contribute to this:

For one thing, you're downloading over the Internet. Let's assume you truly have a gigabit down connection at your disposal:

TCP overhead can eat anywhere from 5% to 10% of your bandwidth; for simplicity's sake, let's say 10%. So you're down to 900 Mbit/s.
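
Back-of-the-envelope, that 900 Mbit/s is roughly 112 MB/s of actual payload, which is the best case you should expect a download meter to show:

echo "scale=1; 1000 * 0.9 / 8" | bc   # -> 112.5 (MB/s after 10% overhead)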

Remote server load is a major factor, and you can neither see nor control it. Many servers can easily sustain 200 MB/s reads, but under load those speeds can drop sharply.

Routing is a factor in speed too. If any hop along your route is saturated, throughput will suffer.
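
If you suspect a congested route, mtr (or plain traceroute) shows per-hop loss and latency; the hostname here is a placeholder:

mtr --report --report-cycles 10 example.com
traceroute example.com   # fallback if mtr isn't installed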

And finally: do you really have a gigabit connection to the Internet, or is it just your port speed? Speeds are limited by the slowest link you cross. Also, if you have a hosted server with a gigabit link, that link is often shared with other clients, so you don't get a dedicated gigabit to begin with.
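
You can at least confirm what your NIC negotiated with ethtool (the interface name is an assumption; substitute your own):

ethtool eth0 | grep -i speed   # e.g. "Speed: 1000Mb/s" means the port negotiated gigabit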

Edit: The reason I didn't recommend a tool is that there are tons of them, and they're a Google search away.