A web crawler (also known as a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner. Other, less frequently used names for web crawlers are ants, automatic indexers, bots, and worms (Kobayashi and Takeda, 2000). The process a web crawler performs is called web crawling or spidering. Many sites, particularly search engines, use spidering as a means of providing up-to-date data. Web crawlers are largely used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
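At its core, the crawling process described above is a graph traversal: start from a seed page, record it, discover its outgoing links, and repeat for each newly found page while skipping pages already seen. The sketch below is a minimal, hypothetical illustration of that loop in Python. The `fetch` parameter is an assumption introduced here for illustration; it stands in for the real download-and-parse step, so the example stays self-contained.

```python
from collections import deque

def crawl(seed_url, fetch, max_pages=100):
    """Breadth-first crawl starting from seed_url.

    `fetch` is a caller-supplied function mapping a URL to a list of
    outgoing link URLs. A real crawler would download the page over
    HTTP and parse its HTML for links; that step is abstracted away
    here. Returns the URLs in the order they were visited.
    """
    visited = []           # pages already processed, in crawl order
    seen = {seed_url}      # URLs queued or processed, to avoid repeats
    frontier = deque([seed_url])
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        visited.append(url)            # hand the page to the indexer
        for link in fetch(url):        # discover outgoing links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return visited

# Usage with a tiny in-memory "web" of three linked pages:
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
order = crawl("a", lambda u: web.get(u, []))  # ["a", "b", "c"]
```

A production crawler would add politeness delays, robots.txt handling, and URL normalization, but the visited-set plus frontier-queue structure shown here is the common core.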