When you hear people talk about Crawlers in the context of SEO, they are referring to the programs that search engines use to scan and analyze websites in order to determine their importance, and thus their ranking in the results of internet searches for certain keywords. Crawlers are also often referred to as spiders or robots.
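At its core, a crawler fetches a page, records its content, and follows the links it finds to discover more pages. The sketch below is a toy illustration of that link-discovery step using only Python's standard library; it is not how Googlebot actually works, and the `LinkExtractor` class and sample page are invented for this example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page -- the basic
    way a crawler discovers new pages to visit next."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A tiny stand-in for a fetched web page.
page = '<html><body><a href="/about">About</a> <a href="/contact">Contact</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', '/contact']
```

A real crawler would repeat this over every discovered URL, track which pages it has already visited, and respect each site's robots.txt rules.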
Crawlers are very active, and often account for a great deal of the traffic that websites all over the internet receive. The Google crawler, known as Googlebot, is particularly active, and will often visit a website several times a day, checking for updates or new information. It is generally considered far more active than other crawlers, such as those run by Yahoo and other search engines.
Search engine optimization is aimed at understanding how crawlers work and what they look for to determine the importance and ranking of certain sites. The idea is then to implement SEO marketing strategies that will fill websites with the kind of information that the crawlers will determine to be of high value.
Crawlers are on the lookout for sites that are rich with the kinds of keywords that people search for, and sites that contain those keywords in high density are seen as being more relevant, and thus may be awarded higher rankings. However, crawlers also gather other information important in determining rankings, including link popularity and the structure of filenames and URLs.
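Keyword density is simply the share of a page's words that match a given keyword. A minimal sketch, assuming a plain-text page and a single-word keyword (real ranking algorithms weigh many more signals than this), might look like:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` that match `keyword`,
    case-insensitively. Returns 0.0 for empty text."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

text = "SEO tips: good SEO starts with content, and SEO rewards relevance."
print(round(keyword_density(text, "seo"), 3))  # 3 of 11 words -> 0.273
```

The function name and the sample sentence are illustrative assumptions, but the calculation itself shows why repeating a keyword raises density, and why stuffing a page with it (a black hat tactic discussed below) is easy for a search engine to detect.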
Some forms of search engine marketing are deliberately aimed at deceiving crawlers into thinking a site is more important than it really is. These are known as black hat techniques, and they are frowned upon by most search engine optimizers, as they can incur penalties from search engines. There are all kinds of SEO tools out there to help you better understand crawlers and how they work. The Google keyword tool is a good place to start.