In my business I need to create many low-volume Django sites. I do this on Webfaction, where I am limited in how much memory I can use.
In a recent project, I needed to mirror a portion of another database (Viewpoint) in my Django database. Every night a cron job would launch a custom Django management command to update the Django database. Nothing too tricky about that… until the customer said he needed to be able to launch that sync command occasionally during the day, maybe once or twice a week.
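The nightly setup is just a crontab entry pointing at the management command. A sketch of what that entry might look like (the paths and the command name `sync_viewpoint` are my placeholders, not the actual project's):

```shell
# Hypothetical crontab entry: run the Viewpoint sync nightly at 2 a.m.
# Adjust the virtualenv and project paths for your own Webfaction account.
0 2 * * * /home/myuser/envs/mysite/bin/python /home/myuser/webapps/mysite/manage.py sync_viewpoint
```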
A quick search of the Internets pulled up Celery. I knew it would be overkill, but I thought it might be useful in other projects, so maybe it was time to give it a try. Ugh. What a mistake. It took forever to get Celery, Redis and Supervisor all working. Then, adding insult to injury, all that code used about 300 MB of memory. That’s about three times as much as a Django site running under Apache.
Next I moved on to django-cron. A quick read of the docs revealed this:
“The 2 most common ways in which most people go about this is either writing custom python scripts or a management command per cron (leads to too many management commands!).”
But wait! I already had the Django command that runs under cron written and working. I just added a cron job that runs every five minutes and checks whether a manual sync has been requested.
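The post doesn’t show how the “check for a manual sync” works, so here is one minimal way it could be done: a sentinel file that a web view touches when the customer clicks a sync button, which the five-minute cron job then consumes. The file path and function names are my assumptions, not the actual implementation.

```python
# Hypothetical sketch: signal a manual sync via a sentinel file.
import os
import tempfile

# Assumed location for the flag file; a real site might keep this
# inside the project directory instead of the system temp dir.
SYNC_FLAG = os.path.join(tempfile.gettempdir(), "viewpoint_sync_requested")


def request_sync():
    """Called from a Django view when the customer asks for a manual sync."""
    open(SYNC_FLAG, "a").close()  # create (or touch) the flag file


def check_and_sync(run_sync):
    """Run from cron every five minutes.

    If the flag file exists, remove it and run the sync; otherwise do
    nothing. Returns True if a sync was performed.
    """
    if os.path.exists(SYNC_FLAG):
        os.remove(SYNC_FLAG)  # consume the request before the long sync
        run_sync()
        return True
    return False
```

Inside the existing management command, `run_sync` would just be the sync routine that already runs nightly, so the same code serves both the scheduled and the on-demand case.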
Here is a post where I dive deeper into memory use.