This is a quick post about benchmarking nginx and showing what a performance boost caching provides.
We host the page you are reading on a micro instance on EC2. It is pretty cheap (we bought a reserved instance, since we know we will be running the server all year round), working out to around 10 to 15 USD per month. All our sites are Rails based, and we serve them through nginx and Passenger.
The experiment I wanted to run was simple: see how many requests the server can cope with:
- When a page is cached
- When the same page is not cached
I selected a page with text and some graphics (the benchmark seems to have ignored the graphics). The total size was about 1.7 kB. The page was almost entirely static HTML, except for a couple of links that the application had to put together (and, of course, the layout).
I decided that benchmarking should be done from another EC2 instance in the same availability zone (us-east-1) in order to minimize network latency. For benchmarking I used ab. I am sure there are more sophisticated tools out there, but for a simple test this is fine.
So I first ran the test with the cached page:
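The command itself did not survive in the post; from the description that follows, it would have been along these lines (the URL is a placeholder, not the actual site):

```shell
# 5000 requests in total (-n), with up to 500 in flight at any time (-c)
ab -n 5000 -c 500 http://example.com/some-cached-page/
```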
The above command means that I asked the server for 5000 requests in total, with up to 500 of them running simultaneously at any time (i.e. the benchmarking tool would issue new requests even if previous ones had not been served yet). If I did not specify the 500 bit, ab would execute all requests serially (which for 5000 requests in total might take a while).
The results are:
Plenty of pretty numbers come out, but the ones I keep in mind are Requests per second: 11061.60 [#/sec] (mean) and Time per request: 45.201 [ms] (mean). So my dirt-cheap EC2 slice can serve 11k requests in one second, and each one takes around 45 ms (in fact, 90% of them were served within 36 ms!).
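As a sanity check on those two numbers: ab's mean "Time per request" at a given concurrency is simply the concurrency divided by the throughput. A quick check, using nothing beyond the figures quoted above:

```python
# Figures reported by ab for the cached run
requests_per_second = 11061.60  # mean throughput
concurrency = 500               # the -c value

# Mean time per request (ms) = concurrency / throughput * 1000
time_per_request_ms = concurrency / requests_per_second * 1000
print(f"{time_per_request_ms:.3f} ms")  # ~45.201 ms, matching ab's report
```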
Now, if I repeat the same experiment against the non-cached page (same page, same concurrent requests, same total requests), I get a roughly ten times slower response. And if you run the experiment with a slightly more realistic scenario, the drop in performance is dramatic: all the time is consumed by the application putting your view together.
Here are the results:
Notice that some requests failed. Also, the page served is much smaller (just 173 bytes, which means my comparison is not that great; if anything, it is optimistic). Finally, there is a tenfold drop in the number of pages our server managed to serve.
In any case, it shows that even a very light HTML page is served at a much slower rate when it is not cached.
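For completeness, this is roughly the kind of nginx configuration that makes Rails page caching pay off: nginx looks for a pre-rendered file on disk and only hands the request to the application when none exists. Paths and the server name here are assumptions for illustration, not our actual config:

```nginx
server {
    listen 80;
    server_name example.com;      # placeholder
    root /var/www/app/public;     # Rails page caching writes files into public/

    # Serve a cached file straight from disk if one exists;
    # otherwise fall back to the Rails application.
    try_files $uri $uri/index.html $uri.html @app;

    location @app {
        passenger_enabled on;     # Phusion Passenger runs the Rails app
    }
}
```

With this setup, a cache hit never touches Rails at all, which is exactly why the cached numbers above are an order of magnitude better.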