Googlebot
Googlebot is Google's web crawler, the bot that fetches pages from across the web to keep the index behind Google Search up to date. As Googlebot crawls, it discovers new pages through links and sitemaps and adds them to its crawl list. Algorithmic rules determine which sites to crawl, how often to revisit them, and how many pages to fetch from each.
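As a rough illustration of how link discovery feeds a crawl list, the sketch below fetches a page, extracts its links, and queues any URLs it has not seen before. This is a toy breadth-first crawler in Python, not Google's implementation; the seed URL, the page limit, and the absence of politeness controls such as robots.txt handling are simplifications assumed for the example.

```python
# Minimal sketch of a crawl list ("frontier"): fetch a page, extract its
# links, and queue newly discovered URLs for a later visit.
# Illustrative only; the seed URL and page limit are arbitrary assumptions.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: visit queued URLs and enqueue newly found links."""
    frontier = deque([seed_url])   # the crawl list: pages queued for fetching
    seen = {seed_url}              # URLs already discovered
    fetched = 0
    while frontier and fetched < max_pages:
        url = frontier.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue               # unreachable pages are simply skipped
        fetched += 1
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)   # resolve relative links
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)   # newly discovered page joins the queue
    return seen


if __name__ == "__main__":
    # Example usage with a placeholder seed URL.
    discovered = crawl("https://example.com/", max_pages=5)
    print(f"Discovered {len(discovered)} URLs")
```

A real crawler layers scheduling on top of this loop, which is where decisions about crawl frequency and per-site page budgets come in.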