Possible to catch URLs in Linux?
There is a great program for Windows (URL Snooper: http://www.donationcoder.com/Software/Mouser/urlsnooper/index.html) that lets you view all URLs being requested on the machine.
Does any such program exist for Linux (preferably command line)?
Solution 1:
It seems that URL Snooper does not only apply to URLs being requested on the machine, but also to URLs hidden in the HTML source of some page, which are not necessarily requested yet. For the latter see also "How to download list of files from a file server?" here at Super User. Or, in Firefox see menu Tools » Page Info » Media, or use add-ons like Video DownloadHelper or UnPlug. The following applies to seeing all URLs that are actually requested.
The command-line tool ngrep could do it, but gives far more detail than you'd probably want. For example: it will not simply show you the URL as typed into the location bar of a browser, but the whole HTTP request (so: the IP address as resolved by the browser before actually making the request, and then the HTTP request the browser sends to that IP address). And it will also show this for every image etcetera used in the resulting page.
You might need to install ngrep first, like on a default installation of Ubuntu:
sudo apt-get install ngrep
To capture all HTTP GET requests to port 80:
sudo ngrep -W byline -qilw 'get' tcp dst port 80
Still, that would show you the whole request. (Try for yourself, if you're a Super User!) To limit that output some more, to show only lines containing ->, get or host:

sudo ngrep -W byline -qilw 'get' tcp dst port 80 \
  | grep -i " -> \|get\|host"
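As a side note, the lines that survive that grep can be stitched back into full URLs. A minimal awk sketch (the sample request below is made up for illustration, not captured traffic):

```shell
# Illustrative sample request only; real HTTP header lines end in CRLF,
# so strip the carriage returns with tr before matching.
printf 'GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n' |
  tr -d '\r' |
  grep -i 'get\|host' |
  awk 'toupper($1) == "GET"   { path = $2 }
       tolower($1) == "host:" { print "http://" $2 path }'
# prints: http://example.com/index.html
```

The same awk filter can be appended after the grep in the ngrep pipelines above to get one URL per request.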
Or, to capture all requests to port 80 but ignore those with the Referer header set (which is set when requesting embedded images etcetera, but also when clicking a link in a web page; this thus only shows requests that are typed into a browser's location bar directly, opened in a new window, or opened from a bookmark or email):

sudo ngrep -W byline -qilwv 'referer' tcp dst port 80 \
  | grep -i " -> \|get\|host"
Also, sniffer tools like Wireshark have command line options. And, just as an aside and far more basic, tcpdump is installed on most Linux distributions:

sudo tcpdump -Alfq -s 1024 \
  'tcp dst port 80 and ip[2:2] > 40 and tcp[tcpflags] & tcp-push != 0' \
  | grep -i " > \|get\|host"

(Here ip[2:2] is the IP total-length field; requiring it to exceed 40 skips segments with nothing beyond the minimal 20-byte IP and 20-byte TCP headers, such as bare ACKs, and the tcp-push test keeps only segments that actually carry data.)
Solution 2:
I can also recommend url-sniff by Pawel Pawilcz. It is a lightweight Perl script that wraps nicely around ngrep, and it supports colorized output as well. It gives you a simple interface for sniffing all requested URLs.
Solution 3:
You could use an HTTP proxy such as Privoxy, but you'll have to configure your browser to use it; it doesn't snoop network traffic itself. It keeps a log of the URLs accessed, which you can view with a text editor.
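For reference, a minimal Privoxy logging setup might look like this in its configuration file (the path /etc/privoxy/config is the usual Debian/Ubuntu location, but may differ on your system):

```
# /etc/privoxy/config (location may vary by distribution)
logdir /var/log/privoxy    # directory where log files are written
logfile logfile            # file name for the log, relative to logdir
debug 1                    # log the destination of each request
```

After restarting Privoxy and pointing your browser at it (by default it listens on localhost:8118), the requested URLs show up in that log file.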