Spiders, robots, and crawlers are different names for the same thing: software programs that search engines use to explore the internet and automatically download the available web content. A crawler starts from a set of known pages, follows (or “crawls”) the links it finds on each page to discover new ones, grabs the content from the sites it visits, and adds that content to the search engine's index.
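To make that crawl loop concrete, here is a minimal sketch in Python. It assumes the third-party requests and beautifulsoup4 libraries, and the URL and page limit are placeholders; a real search-engine crawler would also respect robots.txt, throttle its requests, and store pages in a proper index rather than a dictionary.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from seed_url, collecting page text."""
    seen = {seed_url}
    queue = deque([seed_url])
    index = {}  # url -> extracted text (a stand-in for a search index)

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=5)
            response.raise_for_status()
        except requests.RequestException:
            continue  # skip pages that fail to download

        soup = BeautifulSoup(response.text, "html.parser")
        # "Grab the content": store the page's visible text in the index.
        index[url] = soup.get_text(separator=" ", strip=True)

        # "Crawl": queue every new link found on the page.
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if urlparse(link).scheme in ("http", "https") and link not in seen:
                seen.add(link)
                queue.append(link)

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=5)  # hypothetical seed URL
    print(f"Indexed {len(pages)} pages")
```

The breadth-first queue mirrors how crawlers expand outward from pages they already know about: each downloaded page both contributes content to the index and supplies fresh links for the crawler to visit next.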