How to request Google to re-crawl my website?
Solution 1:
There are two options. The first (and better) one is using the Fetch as Google option in Webmaster Tools that Mike Flynn commented about. Here are detailed instructions:
- Go to: https://www.google.com/webmasters/tools/ and log in
- If you haven't already, add and verify the site with the "Add a Site" button
- Click on the site name for the one you want to manage
- Click Crawl -> Fetch as Google
- Optional: if you want to do a specific page only, type in the URL
- Click Fetch
- Click Submit to Index
- Select either "URL" or "URL and its direct links"
- Click OK and you're done.
With the option above, as long as every page can be reached from some link on the initial page or a page that it links to, Google should recrawl the whole thing. If you want to explicitly tell it a list of pages to crawl on the domain, you can follow the directions to submit a sitemap.
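That reachability condition is easy to check yourself before asking Google to recrawl. Below is a minimal sketch (not part of the original answer) that follows internal links breadth-first from a start page, using only the Python standard library; `https://example.com/` is a placeholder for your own homepage. Any page the walk never reaches is a page a link-following crawler would likely miss too, and a candidate for your sitemap.

```python
import urllib.parse
import urllib.request
from collections import deque
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def reachable_pages(start_url, limit=200):
    """Breadth-first walk of same-host links starting from start_url."""
    host = urllib.parse.urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue  # skip images, PDFs, etc.
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # a page we can't fetch is one a crawler can't fetch either
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urllib.parse.urljoin(url, href).split("#")[0]
            if urllib.parse.urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    for page in sorted(reachable_pages("https://example.com/")):
        print(page)
```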
Your second (and generally slower) option is, as seanbreeden pointed out, submitting here: http://www.google.com/addurl/
Update 2019:
- Log in to Google Search Console.
- Add a site and verify it with one of the available methods.
- After verification, click on URL Inspection.
- In the search bar at the top, enter your website URL (or a specific page URL) and press Enter.
- After the inspection finishes, it'll show a Request Indexing option.
- Click it and Googlebot will add the URL to its crawl queue. (For a programmatic alternative, see the sketch below.)
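If you'd rather do this programmatically than click through the console, Google's Indexing API exposes a similar "please re-crawl this URL" call. A hedged sketch follows: note that Google officially limits this API to pages carrying JobPosting or BroadcastEvent structured data, and the service-account filename and page URL below are placeholders, not anything from the original answer.

```python
# Requires: pip install google-auth
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# "service-account.json" is a placeholder for your own key file,
# created for a service account added as an owner in Search Console.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
session = AuthorizedSession(credentials)

response = session.post(ENDPOINT, json={
    "url": "https://example.com/some-page",  # placeholder page to re-crawl
    "type": "URL_UPDATED",                   # URL_DELETED for removed pages
})
print(response.status_code, response.json())
```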
Solution 2:
The usual way is to either resubmit your site in Google Webmaster Tools or submit it here: http://www.google.com/addurl/
Solution 3:
Google says that you can't control when your site is re-crawled. Regardless, you could also check this post on "forcing recrawls"; I haven't tried it myself, but it's worth a shot if you're desperate.
On another note, make sure you have a sitemap.xml up, as this will also help with SEO.
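A sitemap.xml is just an XML file listing the URLs you want crawled, per the sitemaps.org protocol. Here is a minimal sketch that generates one with the Python standard library; the two URLs are placeholders for your own pages.

```python
import xml.etree.ElementTree as ET

def write_sitemap(urls, path="sitemap.xml"):
    """Write a bare-bones sitemap: one <url><loc> entry per page."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

write_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
```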
Solution 4:
As far as I know, resubmitting a sitemap will trigger a re-crawl of your site.
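Outside Search Console, the HTTP version of "resubmitting" was Google's sitemap ping endpoint, sketched below with a placeholder sitemap URL. Treat this as historical: Google deprecated the ping endpoint in June 2023, so resubmitting from the Sitemaps report in Search Console is the supported route today.

```python
import urllib.parse
import urllib.request

sitemap = "https://example.com/sitemap.xml"  # placeholder sitemap URL
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap, safe="")
with urllib.request.urlopen(ping, timeout=10) as resp:
    print(resp.status)  # 200 meant the ping was accepted
```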