Research shows that if your web pages take longer than five seconds to load, you can lose half of your viewers and sales. As a UNIX admin, I often hear end users and web developers complain about website loading speed. Usually there is nothing wrong with my servers or server farm; fancy JavaScript, images, and Flash are what make a site slow. The following tools are useful for debugging such performance problems, whether you are a sysadmin, a developer, or an end user. Here are six tools that analyze web pages and tell you why they are slow. Use them to:
- Make your site faster.
- Debug site problems, especially client-side and server-side issues.
- Provide a better user experience.
- Improve the web.
#1: Yahoo! YSlow
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high performance web pages. YSlow is a Firefox add-on integrated with the Firebug web development tool. YSlow grades a web page based on one of three predefined rulesets or a user-defined ruleset. It offers suggestions for improving the page's performance, summarizes the page's components, displays statistics about the page, and provides tools for performance analysis, including Smush.it and JSLint.
If you apply the tips provided by YSlow, your corporate web site or personal blog can load considerably faster than before.
YSlow is also useful for finding out whether Apache or Lighttpd is actually compressing (gzipping) files.
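You can also verify compression from the command line without any browser add-on: request a page with an Accept-Encoding header and see whether the server answers with a Content-Encoding header. The following is a small sketch using curl; the helper name check_gzip and the example URL are my own, so adapt them to your site:

```shell
# check_gzip: print the Content-Encoding header (if any) a server returns
# when the client advertises gzip support. No header usually means mod_deflate
# (Apache) or mod_compress (Lighttpd) is not compressing that resource.
check_gzip() {
    curl -s -I -H 'Accept-Encoding: gzip,deflate' "$1" |
        tr -d '\r' |
        grep -i '^content-encoding' ||
        echo "no Content-Encoding header - compression appears to be off"
}

# Example (replace with your own site):
# check_gzip http://www.cyberciti.biz/
```

Note that compression is typically enabled per content type, so check an HTML page and a CSS/JS file separately.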
#2: Google Page Speed
Page Speed is an open-source Firefox/Firebug add-on. You can use Page Speed to evaluate the performance of your web pages and to get suggestions on how to improve them.
#3: Pagetest (IE specific tool)
This tool only works with MS Internet Explorer. From the project web page:
Pagetest is an open source tool for measuring and analyzing web page performance right from your web browser. AOL developed Pagetest internally to automate load time measurement of its many websites, and it has evolved into a powerful tool for web developers and software engineers in testing their web pages and getting instant feedback. We decided to release it to the grander web development community to further help evolve it into an even more useful - and free - web performance tool.
#4: HTTP Server Benchmarking Tool
ab is a tool for benchmarking your Apache Hypertext Transfer Protocol (HTTP) server. It is designed to give you an impression of how your current Apache installation performs, in particular how many requests per second it is capable of serving. See how to use the ab command.
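As a rough sketch of typical ab usage, the wrapper below fires a fixed number of requests at a URL with a chosen concurrency level. The function name bench and the localhost URL are my own examples; point it at the server you want to measure (ideally from a nearby host, so you benchmark the server and not your network):

```shell
# bench: run Apache's ab with N total requests and C concurrent clients.
# Assumes the ab utility (shipped with Apache, package apache2-utils on
# Debian/Ubuntu) is installed.
bench() {
    # $1 = total requests, $2 = concurrency, $3 = URL (must include a path)
    ab -n "$1" -c "$2" "$3"
}

# Example: 100 requests, 10 at a time, against your own server:
# bench 100 10 http://localhost/
```

Watch the "Requests per second" and "Time per request" lines in the output, and rerun with increasing concurrency to see where throughput levels off.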
httperf is a tool for measuring web server performance. It speaks both the HTTP/1.0 and HTTP/1.1 flavors of the protocol and offers a variety of workload generators. The following command causes httperf to create a connection to the host www.cyberciti.biz, send a request, receive the reply, close the connection, and then print some performance statistics:
$ httperf --hog --server www.cyberciti.biz
httperf --hog --client=0/1 --server=www.cyberciti.biz --port=80 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=1 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
Maximum connect burst length: 0
Total: connections 1 requests 1 replies 1 test-duration 0.236 s
Connection rate: 4.2 conn/s (236.0 ms/conn, <=1 concurrent connections)
Connection time [ms]: min 236.0 avg 236.0 max 236.0 median 235.5 stddev 0.0
Connection time [ms]: connect 47.0
Connection length [replies/conn]: 1.000
Request rate: 4.2 req/s (236.0 ms/req)
Request size [B]: 70.0
Reply rate [replies/s]: min 0.0 avg 0.0 max 0.0 stddev 0.0 (0 samples)
Reply time [ms]: response 38.0 transfer 151.0
Reply size [B]: header 242.0 content 26976.0 footer 2.0 (total 27220.0)
Reply status: 1xx=0 2xx=1 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.01 system 0.22 (user 6.3% system 93.6% total 99.9%)
Net I/O: 112.9 KB/s (0.9*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
The following is like the above, except that a total of 100 connections are created, at a fixed rate of 10 connections per second:
# httperf --hog --server www.cyberciti.biz --num-conns 100 --rate 10 --timeout 5
httperf --hog --timeout=5 --client=0/1 --server=www.cyberciti.biz --port=80 --uri=/ --rate=10 --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
Maximum connect burst length: 1
Total: connections 100 requests 100 replies 100 test-duration 10.089 s
Connection rate: 9.9 conn/s (100.9 ms/conn, <=4 concurrent connections)
Connection time [ms]: min 186.7 avg 193.6 max 302.3 median 187.5 stddev 20.8
Connection time [ms]: connect 36.4
Connection length [replies/conn]: 1.000
Request rate: 9.9 req/s (100.9 ms/req)
Request size [B]: 70.0
Reply rate [replies/s]: min 9.8 avg 9.9 max 10.0 stddev 0.1 (2 samples)
Reply time [ms]: response 39.5 transfer 117.7
Reply size [B]: header 242.0 content 26976.0 footer 2.0 (total 27220.0)
Reply status: 1xx=0 2xx=100 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.34 system 9.75 (user 3.4% system 96.6% total 99.9%)
Net I/O: 264.1 KB/s (2.2*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
- Download the httperf utility for UNIX-like operating systems.
- See the ab and httperf man pages for more details.
#5: Full Page Test
#6: UNIX wget or fetch Utility
wget retrieves the file(s) pointed to by the URL(s) given on the command line, and reports exactly how long each download took:
$ wget http://www.cyberciti.biz/files/test.pdf
$ wget http://www.cyberciti.biz/
--2009-07-15 22:09:05--  http://www.cyberciti.biz/
Resolving www.cyberciti.biz... 18.104.22.168
Connecting to www.cyberciti.biz|22.214.171.124|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html'

    [ <=>                                 ] 26,976      38.0K/s   in 0.7s

2009-07-15 22:09:07 (38.0 KB/s) - `index.html' saved
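If you want the timings broken down by phase (DNS lookup, TCP connect, total transfer) rather than wget's single overall figure, curl's write-out format can print them without keeping the downloaded page. This is a sketch; the helper name timing is my own, and the commented URL is just an example:

```shell
# timing: print per-phase timings for a URL, discarding the page body.
# Uses curl's -w write-out variables (time_namelookup, time_connect,
# time_total) to show where the seconds actually go.
timing() {
    curl -s -o /dev/null \
         -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n' \
         "$1"
}

# Example (replace with your own site):
# timing http://www.cyberciti.biz/
```

A large gap between connect and total points at a slow backend or a heavy page, while a large dns figure points at your resolver rather than the web server.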
Update#1: Tools For Apple Safari 4 Browser
Apple offers various tools to test your web site:
The Resources pane graphs the order and speed at which website components load over the network. It’s also the first tool that lets you sort data based on loading parameters such as latency, response time, and duration. You can graph page resources by either size or load time. Clicking a resource in the left column brings up detailed data on the right. For text resources, such as documents and scripts, you see the text source of the file. For image and font resources, you view a graphical preview of the file.