Thursday, August 20, 2009

Integrated Beaker Cache with Frisky

Last night I enjoyed writing some performance code for Frisky. Beaker Cache is a very nice tool that covers a lot of ground in one package. The key features I wanted to use it for are:
  • Dog-pile prevention (avoiding the cache race condition where many clients regenerate an expired value at once)
  • Multiple cache storage systems
  • Cache regions (different cache configuration based on namespace)
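The cache regions feature maps naturally onto a config file. A minimal sketch of what the Beaker configuration might look like for Frisky (the region names, expiry times, and memcached URL here are my own placeholder values, not anything Frisky actually ships):

```ini
; Default backend plus two named regions with their own settings.
beaker.cache.type = ext:memcached
beaker.cache.url = 127.0.0.1:11211
beaker.cache.regions = short_term, long_term

beaker.cache.short_term.type = ext:memcached
beaker.cache.short_term.url = 127.0.0.1:11211
beaker.cache.short_term.expire = 60

beaker.cache.long_term.type = ext:memcached
beaker.cache.long_term.url = 127.0.0.1:11211
beaker.cache.long_term.expire = 3600
```

Each region can then be referenced by name from code, so different namespaces get different storage and expiry behaviour without touching the call sites.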
At the moment, the fastest the existing system could serve just the string "hello" on my laptop was around 500 req/sec. If I run only the WSGI setup code and have the server return the "hello" string directly, I get 3,000 req/sec, which I think is the theoretical maximum (and is brain-bustingly fast). With Beaker backed by memcached I was able to get around 1,100 req/sec, which more than doubles the performance. This is amazingly fast, but still only about a third of what is theoretically possible, so for now this is OK and I can move on to other features. One issue is that dog-pile prevention requires some locking, so I am not sure how much more I will be able to squeeze out of Beaker.
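The locking cost mentioned above comes from how dog-pile prevention has to work: when a value expires, one caller takes a lock and recomputes while the others keep serving the stale copy instead of stampeding the backend. Beaker does this internally; the sketch below is just the bare idea in stdlib Python, with names I made up for illustration, not Beaker's internals.

```python
import threading

class DogpileCache:
    """Minimal dog-pile sketch: one recomputer, stale reads for the rest."""

    def __init__(self, creator):
        self.creator = creator      # the expensive function being cached
        self.value = None
        self.fresh = False
        self.lock = threading.Lock()

    def get(self):
        if self.fresh:
            return self.value
        # Block only if there is no value at all; otherwise try the lock
        # without waiting so stale readers do not pile up behind it.
        if self.lock.acquire(blocking=(self.value is None)):
            try:
                if not self.fresh:
                    self.value = self.creator()
                    self.fresh = True
            finally:
                self.lock.release()
            return self.value
        return self.value           # someone else is recomputing; serve stale
```

Even in this toy form you can see where the throughput goes: every cache miss serializes on a lock, which is why the Beaker-backed numbers sit well below the no-cache theoretical maximum.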

The next feature I wanted to get working was autoreload of Python code. Since I already have worker processes, I decided the best way to do this is simply to create new processes after files have been modified. Then, in development mode, I can set the per-process request limit to 1, so every request gets a new process and therefore freshly reloaded Python code. I wanted a request limit on processes anyway.
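The request-limit idea above can be sketched very simply: the worker counts requests and signals that it should exit once it hits the limit, leaving the supervisor to fork a replacement that re-imports the application. The class and attribute names here are hypothetical, not Frisky's actual API.

```python
class Worker:
    """Handles requests until a per-process limit is reached."""

    def __init__(self, app, max_requests=1000):
        self.app = app                  # the WSGI-style callable to serve
        self.max_requests = max_requests
        self.handled = 0

    def handle(self, request):
        response = self.app(request)
        self.handled += 1
        return response

    @property
    def should_exit(self):
        # With max_requests=1 in development, every request retires the
        # process, so the replacement picks up any modified Python files.
        return self.handled >= self.max_requests
```

The supervisor loop would check `should_exit` after each request and respawn; in production the limit just caps memory growth per process, which is the other reason to have it.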