How to avoid being scraped?
We have a searchable database (DB); we limit results to 15 per page and cap any query at 100 results, yet we still get people trying to scrape the site.
We already ban clients that hit the site too fast. Is there anything else we can do? Render the results in Flash, maybe?
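Since you mention banning fast clients, here is a minimal sketch of one common way to do that: a sliding-window rate limiter keyed by client IP. The class name, limits, and window size are all illustrative choices, not anything from your setup.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Allow at most `limit` requests per `window` seconds per client IP."""

    def __init__(self, limit=30, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Evict timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the limit: throttle or ban this client
        q.append(now)
        return True
```

You would call `allow(request_ip)` before serving each search request and return a 429 (or a ban) when it comes back `False`. Note that determined scrapers rotate IPs, so this only raises the cost rather than stopping them.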
You could make it a bit more difficult by retrieving the records via AJAX, and using an authentication ID (like an API key) for the AJAX calls.
Of course, a scraper can get around this by reading the ID out of the page and then making the AJAX request with it.
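To make that point concrete, one way to issue such an ID is to HMAC-sign a session identifier server-side and embed the result in the rendered page; the AJAX endpoint then rejects requests without a valid token. This is a hypothetical sketch (the function names and token format are mine), and as noted above it only deters casual scraping, since the token is visible in the page source.

```python
import hashlib
import hmac
import secrets

# Kept server-side only; never sent to the client.
SERVER_SECRET = secrets.token_bytes(32)

def issue_token(session_id: str) -> str:
    """Sign the session id; embed the result in the page for AJAX calls."""
    sig = hmac.new(SERVER_SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    return f"{session_id}:{sig}"

def verify_token(token: str) -> bool:
    """Reject AJAX requests whose token we did not issue."""
    try:
        session_id, sig = token.rsplit(":", 1)
    except ValueError:
        return False
    expected = hmac.new(SERVER_SECRET, session_id.encode(), hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(sig, expected)
```

A scraper that parses the token out of the HTML can still replay it, but combining this with per-token rate limits at least forces them to behave like a logged-in browser session.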
Rendering with Flash, as you suggest, is an alternative (though still not 100% scrape-proof), as is rendering to PDF.
Since there is obviously demand for your database, have you thought about turning this around and providing what the scrapers want? Form a business relationship with them and encourage appropriate use through an API.