Benchmarking squid and other caching proxy servers


We run a fairly large Squid caching proxy server. Commercial caching products are also available. Sooner or later you may need to benchmark your caching proxy server: as your business grows, you may need to evaluate certain aspects of your core infrastructure in advance.

Measuring the performance of a product or service is a must in real-life situations. I have already written about benchmarking a web server. The main aim of benchmarking is to produce reproducible results for a workload that stresses the system under test (so that I can be sure it will hold up under heavy load 🙂 ).

My main criteria were the caching server's throughput, response time under load, cache hit ratio, number of concurrent connections to the caching server, and other factors.
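As a quick aside, the cache hit ratio can be estimated directly from Squid's own access log, independent of any benchmarking tool. The sketch below assumes Squid's native log format, where the fourth field holds the result code (TCP_HIT, TCP_MISS, and so on); the sample log lines and path are made-up placeholders, not real data:

```shell
# Create a tiny made-up sample log (a real one lives at /var/log/squid/access.log)
cat > /tmp/sample_access.log <<'EOF'
1172131200.001     23 10.0.0.5 TCP_HIT/200 4512 GET http://example.com/a.gif - NONE/- image/gif
1172131200.105    310 10.0.0.5 TCP_MISS/200 9214 GET http://example.com/b.html - DIRECT/93.184.216.34 text/html
1172131201.220     19 10.0.0.7 TCP_HIT/200 1024 GET http://example.com/c.css - NONE/- text/css
EOF

# Hit ratio = hits / total requests, as a percentage (field 4 is the result code)
awk '{ total++; if ($4 ~ /HIT/) hits++ }
     END { printf "hit ratio: %.1f%%\n", 100 * hits / total }' /tmp/sample_access.log
```

With two hits out of three requests, this prints a hit ratio of 66.7%.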

I am going to use the Web Polygraph software for this purpose. It is a freely available benchmarking tool for caching proxies, origin server accelerators, L4/L7 switches, content filters, and other Web intermediaries.

Important Links:

Download web polygraph
Installation and configuration documentation
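To give a flavour of how a test is driven: Polygraph workloads are described in its PGL configuration language, and a run pairs a simulated origin server with a simulated client ("robot") pointed at the proxy. The fragment below is a rough sketch adapted from the sample workloads in the documentation, not a tested configuration; addresses and rates are made-up placeholders, and the exact syntax may differ between Polygraph versions:

```
// Minimal PGL workload sketch (verify against your version's simple.pg sample)
Server S = {
    kind = "S101";
    addresses = ['127.0.0.1:9090'];   // where the simulated server listens
};

Robot R = {
    kind = "R101";
    req_rate = 10/sec;                // offered load per robot
    origins = S.addresses;            // request content served by S
    addresses = ['127.0.0.1'];
};

use(S, R);
```

The two sides are then started with something like `polygraph-server --config workload.pg` on one box and `polygraph-client --config workload.pg --proxy 192.168.1.10:3128` on another; check the installed documentation for the option names your version supports.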

I shall not be able to publish benchmarking results here, as the software license places a restriction on me (from the Web Polygraph license page: "you shall not publish benchmarking results based on Web Polygraph").


Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin, DevOps engineer, and a trainer for the Linux operating system/Unix shell scripting. Get the latest tutorials on SysAdmin, Linux/Unix and open source topics via RSS/XML feed or weekly email newsletter.

7 comments

  1. Which version of Web Polygraph did you use, and on which *nix did you build it?
    I am having trouble building version 2.8.1 on Ubuntu Edgy, gcc version 4.1.2.
    Specifically, there are errors in the ../xstd/Ring.h file in the src directory.

  2. Jasbir,

    One was a Debian Linux server and the other was a FreeBSD server; it compiled and worked w/o a problem on both.


  3. Hi Vivek,
    I have a small doubt… I was using Squid with DansGuardian for content filtering and it was working great. But some days back, when I checked my Internet consumption, I found 18 GB of downloads over 2 days; that never happened before. When I unplugged the proxy server from the network, there were no issues. I just want to know: is it possible that my proxy server downloads something automatically? Or could there be some other reason? I run it on Fedora Core 6.
    Thanks in advance

  4. Hi Vivek,

    Would you please publish some websites or howtos with examples of configuring Polygraph on Debian and running a load test?

