Monitoring Django RQ

In my use case, users used a form to put a long-running process on the queue. I wanted to make a page that would allow each user to see the status of the jobs they had queued. This turned out to be slightly more difficult than it should be.

The first step involved saving the job information to the user’s session:

from datetime import datetime

from django.conf import settings
from django.views.generic import FormView
from django.http import HttpResponseRedirect
from django.core.urlresolvers import reverse

import pytz
import django_rq

class MyView(FormView):
    def form_valid(self, form):
        queue = django_rq.get_queue('low')
        # generate_report stands in for whatever long-running task function you enqueue
        job = queue.enqueue(generate_report, form.cleaned_data['year'])

        # Save queued job to session as a list so that order is preserved
        now = pytz.timezone(settings.TIME_ZONE).localize(datetime.now())
        enqueued_jobs = self.request.session.get('enqueued_jobs', [])
        enqueued_jobs.append({
            'job_id': job.id,
            'name': 'Revenue and Open POs Report {}'.format(form.cleaned_data['year']),
            'started': now.isoformat(),
            'started_for_display': now.strftime('%Y-%m-%d %H:%M'),
            'queue': 'low'
        })
        self.request.session['enqueued_jobs'] = enqueued_jobs
        self.request.session.modified = True
        return HttpResponseRedirect(reverse('on_queue'))
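The session entry itself is just a small dict, so its shape can be checked in isolation. A minimal sketch, using the Python 3 stdlib timezone in place of pytz; make_job_entry, the job id, and the report name here are illustrative, not part of the original code:

```python
from datetime import datetime, timezone

def make_job_entry(job_id, name, queue_name, now):
    """Build the session dict used to track one queued job."""
    return {
        'job_id': job_id,
        'name': name,
        'started': now.isoformat(),
        'started_for_display': now.strftime('%Y-%m-%d %H:%M'),
        'queue': queue_name,
    }

now = datetime(2017, 9, 1, 13, 30, tzinfo=timezone.utc)
entry = make_job_entry('abc123', 'Revenue Report 2017', 'low', now)
print(entry['started'])              # 2017-09-01T13:30:00+00:00
print(entry['started_for_display'])  # 2017-09-01 13:30
```

Storing 'started' as an ISO string matters because Django's default session serializer is JSON, which cannot store datetime objects directly.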

Here is the code for the view that allows the user to see the status of each job:

from datetime import datetime

import pytz
import django_rq
from dateutil.parser import parse
from redis import Redis
from rq.registry import StartedJobRegistry

from django.conf import settings
from django.views.generic import TemplateView

class OnQueueView(TemplateView):
    template_name = 'on_queue.html'

    def get_context_data(self, **kwargs):
        kwargs = super(OnQueueView, self).get_context_data(**kwargs)

        # Make a list of queued jobs
        queue = django_rq.get_queue('low')
        job_status = {x.id: x.status for x in queue.jobs}

        # Make a list of running jobs
        redis_conn = Redis()
        registry = StartedJobRegistry('low', connection=redis_conn)
        for job_id in registry.get_job_ids():
            job_status[job_id] = 'running'

        # Make a list of failed jobs
        for job_id in django_rq.get_failed_queue().job_ids:
            job_status[job_id] = 'failed'

        # Insert status into list of jobs, remove old jobs
        all_jobs = self.request.session.get('enqueued_jobs', [])
        kwargs['jobs'] = []
        now = pytz.timezone(settings.TIME_ZONE).localize(datetime.now())
        self.request.session.modified = False
        for job in all_jobs:
            dt = (now - parse(job['started'])).total_seconds()
            if dt < 3600 * 4:
                if job['job_id'] in job_status:
                    job['status'] = job_status[job['job_id']]
                else:
                    # Not queued, running, or failed, so it must have completed
                    job['status'] = 'completed'
                kwargs['jobs'].append(job)
            else:
                # Job is more than four hours old; drop it from the session
                self.request.session.modified = True

        if self.request.session.modified:
            self.request.session['enqueued_jobs'] = kwargs['jobs']

        return kwargs
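The four-hour pruning test in the view is just timestamp arithmetic on the ISO strings stored in the session. A standalone sketch of that step, using the stdlib fromisoformat (Python 3.7+) instead of dateutil and pytz; prune_old_jobs is my name for the extracted helper:

```python
from datetime import datetime, timedelta, timezone

def prune_old_jobs(jobs, now, max_age_hours=4):
    """Keep only session job entries newer than max_age_hours."""
    keep = []
    for job in jobs:
        age = now - datetime.fromisoformat(job['started'])
        if age < timedelta(hours=max_age_hours):
            keep.append(job)
    return keep

now = datetime(2017, 9, 1, 12, 0, tzinfo=timezone.utc)
jobs = [
    {'job_id': 'a', 'started': '2017-09-01T10:00:00+00:00'},  # 2 hours old
    {'job_id': 'b', 'started': '2017-09-01T07:00:00+00:00'},  # 5 hours old
]
print([j['job_id'] for j in prune_old_jobs(jobs, now)])  # ['a']
```

Both datetimes must be timezone-aware, otherwise the subtraction raises a TypeError; that is why the view localizes now before comparing.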

Django Sessions in StaticLiveServerTestCase

This is for Django 1.8.

Here was the scenario. I have a Django site where most of the pages require the user to log in. I have a form that gets some initial values from the Django session. The form has lots of javascript, so live testing seemed best. The problem was that I could not set the session values in the test code.

Initially, this seemed pretty easy because the test code has access to a session variable:

self.client.session

However, changes to that session were not present when the test code called the view. Careful inspection showed that the session ids in self.client.session and view.request.session were different.

Adding the following method to StaticLiveServerTestCase solved the problem:

    def save_to_session(self, key, value):
        cookies = self.selenium.get_cookies()
        session_key = None
        for cookie in cookies:
            if cookie[u'name'] == u'sessionid':
                session_key = cookie[u'value']
                break

        if session_key:
            from django.contrib.sessions.backends.cached_db import SessionStore
            s = SessionStore(session_key)
            s[key] = value
            # Without save(), the change never reaches the session backend
            s.save()

Testing Django When Using @cached_property

I use the @cached_property decorator quite a bit. It’s pretty straightforward. Usually, it’s set it and forget it. As noted in the docs, it persists as long as the instance persists.

However, it can cause problems during testing if you want to change its value. Here is how to change the value (from SO):

from django.utils.functional import cached_property

class SomeClass(object):

    @cached_property
    def expensive_property(self):
        return compute_value()  # placeholder for the real expensive computation

obj = SomeClass()
print obj.expensive_property
print obj.expensive_property  # outputs the same value as before
del obj.expensive_property
print obj.expensive_property  # outputs new value
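The same cache-then-del behavior can be demonstrated end to end with the standard library's functools.cached_property (Python 3.8+), which caches in the instance dict the same way Django's decorator does; the Report class here is a made-up example:

```python
from functools import cached_property

class Report:
    def __init__(self):
        self.calls = 0

    @cached_property
    def total(self):
        # Stand-in for an expensive computation; counts invocations
        self.calls += 1
        return self.calls * 100

r = Report()
print(r.total)   # 100 -- computed
print(r.total)   # 100 -- cached, the method is not called again
del r.total      # removes the cached value from the instance dict
print(r.total)   # 200 -- computed again
```

Deleting the attribute works because the decorator stores the result under the property's name in obj.__dict__; del simply removes that entry so the next access recomputes.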

Django, Ajax and HTML Select

This happens all the time; one field in a form determines the options in another select field. One way to handle this is to do an Ajax post. It is not difficult, but there are lots of parts to remember. Here is one way to do it (using jquery).

The Django Ajax handler:

from django.http import JsonResponse

def get_options(request):
    project_id = int(request.POST['project_id'])
    project = models.Project.objects.get(id=project_id)

    choices = get_select_choices(project)
    options = []
    for choice_id, choice_label in choices:
        option = '<option value="{}">{}</option>'.format(choice_id, choice_label)
        options.append(option)
    return JsonResponse({'options': options})
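Since the labels end up in raw HTML, it is worth escaping them before they go into the option tags. A standalone sketch of the option-building step; build_options is my name for the extracted loop, and the escaping is an addition to the handler above:

```python
from html import escape

def build_options(choices):
    """Render (id, label) pairs as HTML <option> elements,
    escaping labels so user-supplied text cannot break the markup."""
    options = []
    for choice_id, choice_label in choices:
        options.append('<option value="{}">{}</option>'.format(
            choice_id, escape(str(choice_label))))
    return options

print(build_options([(1, 'Design'), (2, 'R&D')]))
# ['<option value="1">Design</option>', '<option value="2">R&amp;D</option>']
```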

The javascript

<script type="text/javascript">
    var project_field, phase_select;

    var get_phases_for_project = function(){
        var project = project_field.val();

        if (project === ""){
            phase_select.html('');
            return;
        }

        $.post(
            "{% url 'my_url' %}",
            {project_id: project, csrfmiddlewaretoken: '{{ csrf_token }}'},
            /**
             * @param data
             * @param data.options
             */
            function (data) {
                phase_select.html('');
                $.each(data.options, function (index, value) {
                    phase_select.append(value);
                });
            }
        );
    };

    $(document).ready(function () {
        project_field = $('#id_project');
        phase_select = $("#id_phase");
        project_field.change(get_phases_for_project);
    });
</script>

SSL, Django Development Server and Chrome

I am developing a Django site that uses Stripe. Even for testing, Stripe requires HTTPS. In the past, I used django-sslserver version 0.19 and ignored Chrome's complaints about the certificate being self-signed. Today (Sept 2017), none of that worked.

The first thing I did was upgrade django-sslserver to 0.20. This crashed with an error about a constant missing from the ssl module. It turns out ssl is built into Python, and that constant is not defined in the ssl module that ships with Python 2.7.6. Reverting django-sslserver to 0.19 solved that problem.

Next, Chrome/Stripe will no longer let you ignore the SSL certificate warnings. This blog post by Alexander Zeitler does a pretty good job explaining how to solve this problem. If you run into this error:

error on line -1 of /dev/fd/11
140736435860488:error:02001009:system library:fopen:Bad file descriptor:bss_file.c:175:fopen('/dev/fd/11','rb')
140736435860488:error:2006D002:BIO routines:BIO_new_file:system lib:bss_file.c:184:
140736435860488:error:0E078002:configuration file routines:DEF_LOAD:system lib:conf_def.c:197:

remove sudo from inside the script and instead run the whole script with sudo.

When all of that is done, you need to tell Chrome to trust your Certificate Authority by going to “Advanced Settings -> Manage Certificates”, then “Authorities -> Import”. Select the rootCA.pem in the ssl directory created by the scripts above.

This is probably already set up on your machine, but you need to check the file /etc/hosts to make sure localhost points to the IP address django-sslserver is using (most likely 127.0.0.1). Then browse to the site over https.
Launch django-sslserver using something like:

python manage.py runsslserver --certificate ~/ssl/server.crt --key ~/ssl/server.key


Daemonizing Django-RQ using Supervisor

I was trying to set up a task to be run from Django-RQ. The task involved scraping a webpage using Selenium and Google Chrome. It worked great in development, but not in production. The error message indicated that there were problems starting Chrome.

One big difference between dev and production was in production I was daemonizing Django-RQ using Supervisor. Some queued tasks would run. Just not the ones involving Selenium. The clue came when I stopped Django-RQ using supervisorctl and then started it from the command line. Now the Selenium tasks worked.

I solved the problem by adding this snippet to the top of the task that used Selenium:

import os
import json

json.dump(os.environ['PATH'].split(':'), open('debug_file.json', 'w'))

This revealed that the environment PATH when running from the command line was much different than when running Django-RQ from Supervisor. Adding some of those paths to the Supervisor config solved the problem.
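Comparing the two PATH dumps is just a set difference. A sketch of that comparison; missing_paths is my name for the helper, and the example path lists are made up, not my actual environments:

```python
def missing_paths(shell_path, supervisor_path):
    """Return entries present in the shell PATH but missing from the
    Supervisor PATH, preserving the shell's ordering."""
    supervisor_set = set(supervisor_path)
    return [p for p in shell_path if p not in supervisor_set]

shell = ['/usr/local/bin', '/usr/bin', '/bin', '/opt/chrome']
supervisor = ['/usr/bin', '/bin']
print(missing_paths(shell, supervisor))  # ['/usr/local/bin', '/opt/chrome']
```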

[program:django_rq]
command = {{ virtualenv_path }}/bin/python manage.py rqworker high default low
directory = {{ django_manage_path }}
environment = DJANGO_SETTINGS_MODULE="{{ django_settings_import }}",PATH="{{ virtualenv_path }}/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
user = vagrant
stdout_logfile = /var/log/redis/redis_6379.log