
A crawler based on libevent

http://monkey.org/~provos/crawl/


crawl - a small and efficient HTTP crawler     

The crawl utility starts a depth-first traversal of the web at the specified URLs. It stores all JPEG images that match the configured constraints. Crawl is fairly fast and allows for graceful termination. After terminating crawl, it is possible to restart it at exactly the same spot where it was terminated. Crawl keeps a persistent database that allows multiple crawls without revisiting sites.
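
To make the traversal concrete, here is a small stand-alone sketch of a depth-first crawl with a visited set and a depth limit. It runs over a hard-coded link table instead of real HTTP fetches, and none of it is taken from crawl's sources.

/*
 * Conceptual sketch only: depth-first traversal with a visited set
 * and a depth limit, over a toy in-memory "web" instead of HTTP.
 * Not crawl's actual code.
 */
#include <stdio.h>
#include <string.h>

#define MAX_SEEN 64

struct page {
    const char *url;
    const char *links[3];   /* pages this page links to, NULL-terminated */
};

static const struct page web[] = {
    { "http://a.example/", { "http://b.example/", "http://c.example/", NULL } },
    { "http://b.example/", { "http://c.example/", NULL, NULL } },
    { "http://c.example/", { "http://a.example/", NULL, NULL } },
};

static const char *seen[MAX_SEEN];
static int nseen;

static int visited(const char *url)
{
    int i;

    for (i = 0; i < nseen; i++)
        if (strcmp(seen[i], url) == 0)
            return 1;
    return 0;
}

static void crawl_url(const char *url, int depth, int maxdepth)
{
    int i, j;

    if (visited(url) || nseen >= MAX_SEEN)
        return;                     /* the visited set prevents revisits */
    seen[nseen++] = url;
    printf("%*sfetch %s\n", depth * 2, "", url);

    if (depth >= maxdepth)          /* depth limit, like the -m option in the example below */
        return;
    for (i = 0; i < (int)(sizeof(web) / sizeof(web[0])); i++) {
        if (strcmp(web[i].url, url) != 0)
            continue;
        for (j = 0; j < 3 && web[i].links[j] != NULL; j++)
            crawl_url(web[i].links[j], depth + 1, maxdepth);
    }
}

int main(void)
{
    crawl_url("http://a.example/", 0, 2);
    return 0;
}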

The main reason for writing crawl was the lack of simple open source web crawlers. Crawl is only a few thousand lines of code and fairly easy to debug and customize.

Features
- Saves encountered images or other media types
- Media selection based on regular expressions and size constraints (see the sketch below)
- Resume of a previous crawl after graceful termination
- Persistent database of visited URLs
- Very small and efficient code
- Asynchronous DNS lookups
- Support for robots.txt
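
To make the media selection feature concrete, the following sketch filters candidate URLs with a POSIX regular expression and a size constraint. The pattern, the 512 KB limit, and the example URLs are made up for illustration; this is not crawl's actual code.

/*
 * Illustrative sketch only: media selection by regular expression
 * and size constraint, in the spirit of crawl's filtering.
 * The pattern and the size limit are arbitrary example values.
 */
#include <regex.h>
#include <stddef.h>
#include <stdio.h>

#define MAX_MEDIA_SIZE (512 * 1024)     /* example constraint: 512 KB */

/* keep the object only if the URL matches and it is small enough */
static int want_media(const regex_t *re, const char *url, size_t length)
{
    if (regexec(re, url, 0, NULL, 0) != 0)
        return 0;
    return length <= MAX_MEDIA_SIZE;
}

int main(void)
{
    regex_t re;

    /* example pattern: JPEG images */
    if (regcomp(&re, "\\.jpe?g$", REG_EXTENDED | REG_ICASE | REG_NOSUB) != 0)
        return 1;

    printf("%d\n", want_media(&re, "http://example.com/a.jpg", 40000));   /* 1 */
    printf("%d\n", want_media(&re, "http://example.com/a.html", 40000));  /* 0 */

    regfree(&re);
    return 0;
}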

The current version of crawl identifies itself as Crawl/0.4 libcrawl/0.1 to web servers. Its default configuration also limits how often a fetch can happen against the same web server.
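
A per-server limit of this kind can be implemented by remembering when each host was last contacted and deferring any fetch that would come too soon. The sketch below illustrates the idea with an arbitrary 30-second interval; it is not crawl's implementation.

/*
 * Conceptual sketch only: allow at most one fetch per host every
 * MIN_INTERVAL seconds.  Linear search is enough for an illustration.
 */
#include <stdio.h>
#include <string.h>
#include <time.h>

#define MIN_INTERVAL 30                 /* example: one fetch per 30 s */
#define MAX_HOSTS    1024

struct hostinfo {
    char   name[256];
    time_t last_fetch;
};

static struct hostinfo hosts[MAX_HOSTS];
static int nhosts;

/* returns 1 if a fetch from this host may go out now, 0 if it must wait */
static int fetch_allowed(const char *host)
{
    time_t now = time(NULL);
    int i;

    for (i = 0; i < nhosts; i++) {
        if (strcmp(hosts[i].name, host) == 0) {
            if (now - hosts[i].last_fetch < MIN_INTERVAL)
                return 0;
            hosts[i].last_fetch = now;
            return 1;
        }
    }
    if (nhosts < MAX_HOSTS) {           /* first contact with this host */
        strncpy(hosts[nhosts].name, host, sizeof(hosts[nhosts].name) - 1);
        hosts[nhosts].last_fetch = now;
        nhosts++;
    }
    return 1;
}

int main(void)
{
    printf("%d\n", fetch_allowed("www.example.com"));   /* 1: first fetch */
    printf("%d\n", fetch_allowed("www.example.com"));   /* 0: too soon */
    return 0;
}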

 

Download
- crawl-0.4.tar.gz - Release 2003-05-17
- crawl-0.3.tar.gz - Release 2002-01-28
- crawl-0.2.tar.gz - Release 2001-12-12
- crawl-0.1b.tar.gz - Release 2001-07-03

The crawl utility is distributed under a BSD license and is completely free for any use, including commercial use.

Building
In order to build crawl, you need libevent, a library for asynchronous event notification. You also need Berkeley DB compiled with --enable-compat185 for 1.85 compatibility.
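
The 1.85 compatibility mode provides the old dbopen() interface through <db_185.h>, which is presumably what crawl uses for its persistent database of visited URLs. The sketch below shows that interface storing and checking one URL; the database file name and key layout are illustrative only, not taken from crawl's sources. It is typically linked with -ldb.

/*
 * Illustrative sketch only: a visited-URL store through the Berkeley
 * DB 1.85 dbopen() interface that --enable-compat185 provides.
 */
#include <sys/types.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <db_185.h>

int main(void)
{
    DB *db;
    DBT key, data;
    const char *url = "http://www.w3.org/";
    char seen = 1;

    db = dbopen("visited.db", O_CREAT | O_RDWR, 0644, DB_BTREE, NULL);
    if (db == NULL) {
        perror("dbopen");
        return 1;
    }

    memset(&key, 0, sizeof(key));
    memset(&data, 0, sizeof(data));
    key.data = (void *)url;
    key.size = strlen(url);

    /* has this URL been crawled before? */
    if (db->get(db, &key, &data, 0) == 0) {
        printf("already visited: %s\n", url);
    } else {
        /* mark it as visited so a restarted crawl will skip it */
        data.data = &seen;
        data.size = sizeof(seen);
        db->put(db, &key, &data, 0);
        printf("new URL: %s\n", url);
    }

    db->close(db);
    return 0;
}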

Example
$ crawl -m 0 http://www.w3.org/
This searches for images on the index page of the World Wide Web Consortium without following any other links.

Acknowledgements
This product includes software developed by Ericsson Radio Systems.
This product includes software developed by the University of California, Berkeley and its contributors.

Support
If you are inclined, you can leave a tip for me with PayPal.
