Benchmarking squid and other caching proxy servers

Posted in Categories: Squid caching server | Last updated: July 20, 2006

We run a fairly large Squid caching proxy server; commercial products are also available. Sooner or later you may need to benchmark a caching proxy server: as the business grows every day, you need to evaluate certain aspects of the core infrastructure in advance.

This process of measuring the performance of a product or service is a must in real-life situations. I have already written about benchmarking a web server. The main aim of benchmarking is to reproduce results for a workload that puts realistic stress on the server under test (so that I can be sure it will hold up under heavy load 🙂).

My main criteria were the caching server's throughput, response time under load, cache hit ratio, number of concurrent connections, and other such factors.
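Before (and after) a benchmark run, it helps to record Squid's own view of the cache hit ratio. A minimal sketch, assuming squidclient is installed and Squid is listening on localhost port 3128 (adjust host and port for your setup):

```shell
# Query Squid's cache manager statistics and show the hit-ratio lines.
squidclient -h localhost -p 3128 mgr:info | grep -i 'ratio'

# The same grep works on saved mgr:info output; sample line for illustration:
printf 'Request Hit Ratios:\t5min: 35.3%%, 60min: 38.1%%\n' | grep -i 'ratio'
```

Comparing these numbers before and after a Polygraph run gives a quick sanity check on the hit ratio the benchmark reports.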

I am going to use the Web Polygraph software for this purpose. It is a freely available benchmarking tool for caching proxies, origin server accelerators, L4/7 switches, content filters, and other web intermediaries.

Important Links:

Download web polygraph
Installation and configuration documentation
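A typical Polygraph test uses two machines: one running the simulated origin servers and one running the simulated clients, with the proxy under test in between. A minimal sketch (the proxy address is an example, and the simple.pg workload file ships with the Polygraph source tree):

```shell
# On the box simulating origin servers:
polygraph-server --config workloads/simple.pg --verb_lvl 10

# On the box simulating clients, pointed at the proxy under test:
polygraph-client --config workloads/simple.pg \
  --proxy 192.168.1.10:3128 --verb_lvl 10
```

Both sides must use the same workload file; the client prints running throughput, response-time, and hit-ratio figures as the test progresses.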

I shall not be able to publish benchmarking results here, as the software license puts a restriction on me (from the Web Polygraph license page: ‘you shall not publish benchmarking results based on Web Polygraph’).

Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin and trainer for the Linux operating system and Unix shell scripting. He has worked with global clients in various industries, including IT, education, defense and space research, and the nonprofit sector.

7 comments

  1. Which version of Web Polygraph did you use, and on which *nix did you build it?
    I am having trouble building version 2.8.1 on Ubuntu Edgy with gcc 4.1.2.
    Specifically, I get errors in the ../xstd/Ring.h file in the src directory.

  2. Hi Vivek,
    I have a small doubt. I was using Squid with DansGuardian for content filtering and it was working great. But some days back, when I checked my internet consumption, I found 18 GB of downloads over 2 days; it never happened before. When I unplug my proxy server from the network, there are no issues. I just want to know: is it possible my proxy server downloads something automatically? Or is there any other reason? I run it on Fedora Core 6.
    Thanks in advance

Leave a Comment