Whether you call them "spiders," "bots," or "crawlers," these are the programs that Google uses to find content on the Internet. One of the most important things you can do is make sure these crawlers can do their job as effectively as possible. For example, check that your robots.txt file is doing exactly what you want it to do: it lets you decide which parts of your site crawlers may visit and which they should skip. (Note that robots.txt controls crawling, not indexing; to keep a page out of search results entirely, use a noindex directive instead.)
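As an illustration, here is a minimal robots.txt sketch. The paths shown (`/admin/`, `/tmp/`) are hypothetical examples, not recommendations for any particular site; the file lives at the root of your domain (e.g. `https://example.com/robots.txt`).

```
# Rules for all crawlers
User-agent: *
# Block crawling of example private/temporary areas (hypothetical paths)
Disallow: /admin/
Disallow: /tmp/
# Everything else is crawlable by default

# Optional: point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

A `Disallow` line with an empty value (or no `Disallow` at all) permits crawling of the whole site, while `Disallow: /` blocks it entirely, so a small typo here can have large consequences; it is worth verifying the file with a robots.txt testing tool after any change.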
From our very good friends over at business.com