Schedule Your Site Crawls Regularly



What does it mean to crawl a website?
A website crawler is a program that visits every page of a website and generates an XML Sitemap of the site's contents, much like the spiders that crawl the entire World Wide Web, only on a smaller scale. Completing a Sitemap crawl uncovers broken links, records how frequently URLs change, and gives the site manager a current snapshot of the site's pages, which search engines later use to index the content.
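For illustration, here is a minimal sketch in Python of what a sitemap-generating crawler does under the hood. The start URL and the 500-page limit are placeholders, and the downloadable tools listed later in this post handle all of this for you.

```python
# Minimal sketch of a crawler that walks one site and writes a sitemap.
# "https://www.example.com/" is a placeholder; swap in your own domain.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from xml.sax.saxutils import escape

START_URL = "https://www.example.com/"

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, limit=500):
    """Breadth-first crawl of internal pages, up to `limit` URLs."""
    site = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception as err:          # broken link or unreachable page
            print(f"Error fetching {url}: {err}")
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]   # drop fragments
            if urlparse(absolute).netloc == site:         # stay on this site
                queue.append(absolute)
    return sorted(seen)

def write_sitemap(urls, path="sitemap.xml"):
    """Writes the discovered URLs in the standard sitemaps.org XML format."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

if __name__ == "__main__":
    write_sitemap(crawl(START_URL))
```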

Why should your site be crawled?
Your site should be crawled regularly to uncover errors such as broken links and duplicate content, but more importantly to generate an XML Sitemap. The Sitemap tells search engines where all of your content lives and how often it changes, so they know when, and whether, to return and recrawl the site for indexing. Many site crawlers also list SEO elements such as title, description, and keyword meta tags, which your site manager can review for duplication and update as needed. These elements are important for both user experience and organic search optimization.
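As a rough illustration of those checks, the sketch below (plain Python, with a hypothetical URL list) flags unreachable pages and pages that share the same <title> tag; a full crawler report covers description and keyword tags the same way.

```python
# Sketch of the kind of checks a crawl report surfaces: broken links and
# duplicate SEO elements. The URL list would come from a crawl of your site.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

class TitleExtractor(HTMLParser):
    """Captures the text inside the page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(urls):
    titles = defaultdict(list)
    for url in urls:
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except (HTTPError, URLError) as err:
            print(f"Broken link: {url} ({err})")   # e.g. 404 Not Found
            continue
        parser = TitleExtractor()
        parser.feed(html)
        titles[parser.title.strip()].append(url)
    for title, pages in titles.items():
        if len(pages) > 1:
            print(f"Duplicate title '{title}' on: {', '.join(pages)}")

if __name__ == "__main__":
    # Hypothetical URLs; in practice this list comes from the crawl itself.
    audit(["https://www.example.com/", "https://www.example.com/about/"])
```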

How often should my website be crawled?
How often depends on the size of your website and how frequently it is updated: complete a site crawl each time significant content is added, removed, or edited, then generate the XML Sitemap and submit it to webmaster tools such as those from Google, Bing, or Yahoo. Depending on the amount of activity on a site, that could mean daily, weekly, or monthly.
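If you build the Sitemap yourself, each URL entry can also carry <lastmod> and <changefreq> elements that tell search engines when a page last changed and how often to expect changes. The values below are purely illustrative.

```python
# Sketch of a richer sitemap entry; dates use the W3C format (e.g. 2014-06-05)
# and changefreq accepts always/hourly/daily/weekly/monthly/yearly/never.
from xml.sax.saxutils import escape

def sitemap_entry(url, lastmod, changefreq="weekly"):
    return (
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        "  </url>\n"
    )

# Illustrative values only: a blog index that changes daily.
print(sitemap_entry("https://www.example.com/blog/", "2014-06-05", "daily"))
```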

In most cases, the person or team that maintains your website will complete the site crawl and fix any errors. There are also Sitemap generators you can download to run your own crawls.

Here is a list of downloadable Sitemap generators that Google has kindly compiled:

  • GSiteCrawler (Windows)
  • GWebCrawler & Sitemap Creator (Windows)
  • G-Mapper (Windows)
  • Inspyder Sitemap Creator (Windows) $
  • IntelliMapper (Windows) $
  • Microsys A1 Sitemap Generator (Windows) $
  • Rage Google Sitemap Automator (OS X) $
  • Site Map Pro (Windows) $
  • Sitemap Writer (Windows) $
  • Sitemap Generator by DevIntelligence (Windows)
  • Sorrowmans Sitemap Tools (Windows)
  • TheSiteMapper (Windows) $
  • Vigos Gsitemap (Windows)
  • WebDesignPros Sitemap Generator (Java Webstart Application)
  • Weblight (Windows/Mac) $
  • WonderWebWare Sitemap Generator (Windows)

Remember, a regularly crawled site improves usability and helps build trust with potential leads. Scheduling regular website crawls and fixing the errors they uncover lets you maintain your site proactively, creating a better experience for visitors and improving your organic search visibility. When was the last time your site was crawled?

See how Hall can help increase your demand.