(ICT) A crawler is a program that searches the Web for new links, new content, and changes in order to keep search-engine results up to date. A crawler may also be called a bot (short for robot) or spider. Crawlers within search engines perform a useful indexing function, but there are also crawlers or bots with more sinister motives, such as harvesting e-mail addresses to be targeted by spammers. See: Spam, Spambot, Spyware
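The link-following step described above can be illustrated with a minimal sketch in Python. This example parses an inline HTML sample (a stand-in for a fetched page) and collects its hyperlinks, which a real crawler would queue for later visits; fetching over HTTP, deduplication, and robots.txt handling are omitted.

```python
# Minimal sketch of a crawler's core step: extract the hyperlinks
# from a page so they can be queued for later visits.
# Uses only the standard library; the HTML is an inline sample
# standing in for a page fetched over HTTP.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects the href targets of every <a> tag seen."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


# Hypothetical page content for illustration only.
sample_html = (
    '<html><body>'
    '<a href="/about">About</a> '
    '<a href="https://example.com">External</a>'
    '</body></html>'
)

parser = LinkExtractor()
parser.feed(sample_html)
print(parser.links)  # → ['/about', 'https://example.com']
```

A full crawler would repeat this step for each discovered link, typically keeping a visited set to avoid revisiting pages.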