by kumarsatish » Tue Sep 06, 2016 11:51 am
Googlebot's crawl process begins with a list of webpage URLs, generated from previous crawls and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these sites, it detects links (SRC and HREF attributes) on every page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
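The process described above is essentially a breadth-first crawl: a frontier seeded with known URLs that grows as links are discovered on each fetched page. Here is a minimal sketch in Python; the `fetch` callback and the toy in-memory "web" are assumptions for illustration, not how Googlebot actually fetches pages.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects HREF and SRC URLs from a page, resolved against the page URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(self.base_url, value))

def extract_links(base_url, html):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def crawl(seed_urls, fetch):
    # Frontier seeded from known URLs (e.g. a Sitemap), grown with
    # links discovered on each fetched page.
    seen = set(seed_urls)
    frontier = list(seed_urls)
    while frontier:
        url = frontier.pop(0)
        html = fetch(url)          # fetch() stands in for an HTTP GET (assumption)
        if html is None:           # dead link: note it and move on
            continue
        for link in extract_links(url, html):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return seen

# Usage with a toy in-memory "web" instead of real HTTP requests:
pages = {
    "http://example.com/": '<a href="/a">A</a><img src="/logo.png">',
    "http://example.com/a": '<a href="http://example.com/">home</a>',
}
found = crawl(["http://example.com/"], pages.get)
# found now holds every URL discovered, including the dead /logo.png link
```

A real crawler adds politeness delays, robots.txt checks, and deduplication of near-identical URLs, but the seed-fetch-extract-enqueue loop is the core idea.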