Testing a website from the Linux command line
I am looking for a tool to test a website from the Linux command line.
From the output, I need to know the HTTP response (status codes), but I also need to benchmark the time it takes to download the different elements of the site.
Thank you in advance.
Solution 1:
You can try wget with the -p option:
wget -p http://site.com
It will tell you how long it takes to download each element and the return codes for each request.
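If you also want the full HTTP headers for each request rather than just the status line, wget has a --server-response (-S) flag that prints them; a minimal sketch (site.com is a placeholder, and the grep pattern is just one way to pull out the status lines, since wget writes the headers to stderr):
wget -p --server-response http://site.com 2>&1 | grep "HTTP/"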
Solution 2:
Take a look at Apache Bench (ab), the benchmarking tool that ships with the Apache HTTP server.
This should give you an overview of your page's performance.
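For example, to fire 100 requests at the page, 10 at a time (the -n and -c values here are arbitrary, and ab requires a path component in the URL, hence the trailing slash):
ab -n 100 -c 10 http://site.com/
The summary it prints includes the response time distribution and, when applicable, the number of non-2xx responses.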
Solution 3:
You may want to look at the following options of curl (a combined example follows the list):
- --write-out - displays any of several time-related variables
- --trace-time - prepends a time stamp to each trace or verbose line
- --verbose
- --include - (HTTP) include the HTTP headers in the output
- --trace-ascii <file> - enables a full trace dump of all incoming and outgoing data, including descriptive information
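A minimal sketch putting --write-out to work (the variable names are standard curl --write-out variables; -o /dev/null discards the body and -s hides the progress meter, so only the timings are printed):
curl -s -o /dev/null -w "code: %{http_code}  dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n" http://site.com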
And the following option of wget:
- --timestamping - turn on time-stamping
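With time-stamping on, a repeated run only re-downloads files the server reports as newer than the local copies, for example (the URL is a placeholder):
wget --timestamping http://site.com/index.html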
Solution 4:
Selenium and curl are good options depending on what your goal is. Also, a utility that I've come to like quite a bit is twill. More information is available at http://twill.idyll.org/.
It's nice as it has its own little specialized language for filling out forms, validating links, and checking response codes. Since it's just Python code, you can easily import the libraries and automate your tests yourself if you'd like to do something different.
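As a rough sketch of what a script in that language can look like (go, code, and follow are documented twill commands; the URL and link text here are made up):
go http://site.com/
code 200
follow "About"
code 200
You can run such a script with twill-sh, or call the same commands from Python via the twill.commands module.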