# Periodic Crawler

A project that integrates Celery to run periodic tasks.

The tasks are simple web-crawling functions; you can add to and configure the task list to meet your needs.
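A periodic crawl task might be wired up roughly as follows. This is a minimal sketch, not the project's actual code: the module name `tasks`, the broker URL, the target URL, and the five-minute interval are all illustrative assumptions, and the snippet uses the modern `beat_schedule` configuration style.

```python
import urllib.request

from celery import Celery

# Assumed broker URL; a local RabbitMQ instance with default credentials.
app = Celery("tasks", broker="amqp://guest@localhost//")


@app.task
def crawl(url):
    """Fetch a page and return its body; a stand-in for a real crawl step."""
    with urllib.request.urlopen(url) as response:
        return response.read()


# Celery Beat schedule: run the crawl every 300 seconds.
# Task names and arguments here are illustrative.
app.conf.beat_schedule = {
    "crawl-every-5-minutes": {
        "task": "tasks.crawl",
        "schedule": 300.0,
        "args": ("http://example.com",),
    },
}
```

With this in place, `celery -A tasks worker` runs the worker and `celery -A tasks beat` dispatches the schedule.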

## How To Configure

1. Clone the project.
2. Install the dependencies: `pip install -r requirements.txt`
3. Follow the [Celery installation guide](http://celery.github.com/celery/getting-started/introduction.html#installation) to install and configure a broker. I used RabbitMQ, but you can configure any broker Celery supports.

For details, see my [blog post](http://mushfiq.com/2012/08/03/build-periodic-crawler-with-celery/).