How should I check a site for dead links? [closed]

Solution 1:

The IIS SEO Toolkit is great for this; it does a lot more than just search for broken links. http://www.iis.net/extensions/SEOToolkit

You can run reports on the data and also track reports over time.

Solution 2:

I have a Linux machine with a cron job that runs linkchecker and sends me a report.

http://wummel.github.io/linkchecker/

If you are running Ubuntu, it is in the package manager:

sudo aptitude install linkchecker
man linkchecker

It has lots of options, works well for me, and can save the report in various formats.
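To automate it the way described above, a crontab entry along these lines does the job (the site URL and mail address are placeholders; `--check-extern` additionally verifies outbound links):

```crontab
# Mail the output of each run to this address (cron does this automatically
# for any stdout/stderr the job produces).
MAILTO=you@example.com
# Run linkchecker every Monday at 06:00 against the site (placeholder URL).
0 6 * * 1  linkchecker --check-extern http://example.com/ 2>&1
```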

Solution 3:

I'll vote for Xenu. It's blindingly fast and gives you all kinds of other features.

Solution 4:

I haven't tried this, but I came across it last night while I was trying to beat wget into doing something else. May or may not be helpful in your case.

   --spider
       When invoked with this option, Wget will behave as a Web spider,
       which means that it will not download the pages, just check that
       they are there.  For example, you can use Wget to check your
       bookmarks:

               wget --spider --force-html -i bookmarks.html

       This feature needs much more work for Wget to get close to the
       functionality of real web spiders.
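For a whole-site check rather than a bookmarks file, `--spider` can be combined with wget's recursive mode; unreachable URLs are flagged in the log. A minimal sketch (example.com is a placeholder, and the log excerpt below is fabricated purely to illustrate the message format wget uses):

```shell
# Crawl the whole site without downloading anything, writing a log:
#   wget --spider -r -o spider.log http://example.com/
# (Commented out here; the fabricated excerpt below stands in for spider.log.)
cat > spider.log <<'EOF'
Spider mode enabled. Check if remote file exists.
http://example.com/good.html ... 200 OK
http://example.com/missing.html ... 404 Not Found
Remote file does not exist -- broken link!!!
EOF

# wget marks dead URLs with "broken link"; grep pulls them out,
# with -B1 showing the URL line just above each marker.
grep -B1 'broken link' spider.log
```

This is only a sketch: on a real crawl you would point the commented `wget` line at your site and then grep its actual log the same way.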