There are many different facets of SEO that help search engines navigate websites and figure out what they’re about. We’ve previously discussed many of the on-page elements, such as title tags, H1s, meta descriptions, and image alt attributes, where keywords are used to help a site increase relevancy and rank for those terms. However, there are other technical areas that also play a part in organic search. Here are three items to consider that will help your site “be seen” and get indexed by search engines:
1. Generate an XML Sitemap
Think of an XML Sitemap as the road map to all of the pages on your site. It’s basically a list of all your site’s URLs, which helps ensure that search engines know about every page on your site, including ones that might not be discovered during the normal crawling process. Google, Yahoo, and Bing all have information on how to create and submit a Sitemap to their search engine – and they’re all nice enough to accept the same format, so only one has to be generated. Your webmaster (or whoever maintains your site) should at least be using Google’s Webmaster Tools for Sitemap submission and as a tool to diagnose problems that Google is having when accessing your site.
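To make this concrete, here’s a minimal example of what a sitemap.xml file looks like in the shared format all three engines accept. The URLs and dates are placeholders; your own pages would go in their place:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2012-05-20</lastmod>
  </url>
</urlset>
```

Only the `<loc>` tag is required for each entry; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints to the crawler.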
2. Use a Robots.txt File
The robots.txt file is a set of instructions that a search engine looks for and follows when it gets to your site. This is an opportunity to tell the search engine which parts of the site you don’t want it to crawl, so it can focus on the pages that you do want shown in the search results. This file is useful for areas such as login pages, images, include files, etc. – anything that doesn’t need to show up in search results. It’s a standard best practice that we use, and part of any SEO audit I perform on a site. Google provides more in-depth information on this file to instruct your webmaster on how to create one and add it to the appropriate area of your website.
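A robots.txt file is plain text placed at the root of your domain (e.g., site.com/robots.txt). The directory names below are just examples to illustrate the syntax; you’d list whatever areas of your own site you want crawlers to skip:

```txt
# Applies to all crawlers
User-agent: *
Disallow: /login/
Disallow: /includes/
Disallow: /images/

# Optional: point crawlers at your XML Sitemap
Sitemap: http://www.example.com/sitemap.xml
```

Note that robots.txt is a request, not an access control – it keeps well-behaved crawlers out, but it doesn’t hide or secure those pages.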
3. Create a www/non-www Forward or Redirect
Isn’t it great that we don’t have to type the “www” in a web address to get to a website anymore? Now that there are two ways to type in a URL, it’s important to make sure that a redirect or forward is set up to resolve to “www.site.com” when “site.com” is used to access your site. Without this, the search engine could potentially index two copies of your website (both www.site.com and site.com), which means that you’ll have duplicate content issues. This may force the search engine to determine which page is more relevant, and possibly remove pages of your website from the index altogether. You can use a redirect checker to figure out if this redirect is in place on your site, and if not, have your webmaster implement it as soon as possible. The standard way to do this is a 301 (permanent) redirect. Oftentimes, your DNS provider can set this up for you with web forwarding.
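If your site runs on Apache with mod_rewrite enabled (a common setup, but an assumption here), your webmaster can add a 301 redirect with a few lines in the site’s .htaccess file. The example.com domain is a placeholder for your own:

```apache
RewriteEngine On
# If the request came in without the "www" prefix...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...permanently (301) redirect it to the www version, keeping the path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The same rules work in reverse if you’d rather standardize on the non-www version – what matters is picking one and redirecting the other to it.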
How Does This Affect SEO?
Have I left you wondering what these technical aspects have to do with SEO? Think about it this way: if a search engine can’t crawl and subsequently index all of the pages of your site that you want it to, or pages are being omitted from a search engine’s index due to duplicate content, how will you ever have a chance to compete for your keyword rankings? Making your site as hospitable as possible to the spiders that come crawling along will ensure that all of your on-page SEO efforts aren’t in vain.