SITEMAP & ROBOTS
Sitemaps are an easy way for you to inform search engines about pages on your site that are available for crawling. The storefront system automatically generates a sitemap. In its simplest form, a sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs on the site) so that search engines can crawl the site more intelligently.
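As a rough illustration (the page address and dates below are placeholders, not pages from your actual store), a sitemap entry in the standard XML format looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; lastmod, changefreq and priority are the optional metadata described above -->
      <url>
        <loc>http://www.yourwebsite.com/products/example-page</loc>
        <lastmod>2024-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>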
Search engines send out small programs called spiders or robots to crawl your site and bring information back so that your pages can be indexed in search results and found by web users. When there are files and directories you do not want search engines to index, we use the "robots.txt" file to define where the robots should and should not go. This is a very simple text file placed in the root folder of your website: www.yourwebsite.com/robots.txt. Search engines like it because it saves them time when indexing your site, and you benefit because pages that exist only for site administration or your control panel stay out of the search results.
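To give you an idea (the folder names here are only examples, not your store's real folders), a robots.txt file can be as short as this:

    # Rules for all robots
    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/
    Sitemap: http://www.yourwebsite.com/sitemap.xml

The Disallow lines keep robots out of the listed folders, and the Sitemap line points them straight at the sitemap described above.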
We will submit the sitemap automatically generated by the storefront system to the major search engines that accept sitemaps, such as Google and Bing. We will also create a robots.txt file that tells the search engine robots which pages to crawl and index and which to skip, so your site is indexed more quickly and smoothly than it would be without these files.