Linux: Jpeg Image Optimization / Compress Command


I know how to optimize and compress PNG images using the optipng command-line tool. I have lots of images in JPEG format. How do I compress and optimize JPEG images on my Amazon cloud account so that I can save bandwidth on my CloudFront CDN account? How do I use an image compressor to apply lossless compression to JPEG files in bulk on Linux, with no effect on image quality?

The JPEG file format is recommended for high-resolution, photographic-style images. You need to use the jpegoptim command, which optimizes/compresses JPEG files. The program supports lossless optimization, based on optimizing the Huffman tables, as well as so-called “lossy” optimization where, in addition to optimizing the Huffman tables, the user can specify an upper limit for image quality.


Type the following command (root privileges are needed to install packages):
$ sudo apt-get install jpegoptim
Sample outputs:

[sudo] password for vivek: 
Reading package lists... Done
Building dependency tree       
Reading state information... Done
The following packages were automatically installed and are no longer required:
  libavutil-extra-51 libggiwmh0-target-x libggi2 libgii1 libvo-aacenc0
  libgii1-target-x mplayer-skin-blue libggiwmh0 libggi-target-x
Use 'apt-get autoremove' to remove them.
The following NEW packages will be installed:
  jpegoptim
0 upgraded, 1 newly installed, 0 to remove and 5 not upgraded.
Need to get 14.0 kB of archives.
After this operation, 77.8 kB of additional disk space will be used.
Get:1 squeeze/main jpegoptim amd64 1.2.3-2+b1 [14.0 kB]
Fetched 14.0 kB in 1s (11.2 kB/s)    
Selecting previously deselected package jpegoptim.
(Reading database ... 333683 files and directories currently installed.)
Unpacking jpegoptim (from .../jpegoptim_1.2.3-2+b1_amd64.deb) ...
Processing triggers for man-db ...
Setting up jpegoptim (1.2.3-2+b1) ...


The syntax is:

jpegoptim file.jpeg
jpegoptim [options] file.jpeg

To optimize photo.jpeg, type the following command:
$ jpegoptim photo.jpeg
Sample outputs:

photo.jpeg 1312x948 24bit JFIF  [OK] 25226 --> 10744 bytes (57.41%), optimized.
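Before committing to changes, the -n (no-action) option listed further below can preview the savings; a quick sketch, assuming jpegoptim is installed and photo.jpeg exists:

```shell
# Dry run: report what jpegoptim would do to photo.jpeg
# without actually modifying the file
jpegoptim -n photo.jpeg
```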

How do I process files in batch?

Use the bash for loop as follows:

for i in one.jpeg two.jpeg foo.jpeg; do jpegoptim "$i"; done


# Process all *.jpeg files in the current directory
for i in *.jpeg; do jpegoptim "$i"; done
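For deeper directory trees, a find pipeline handles subdirectories and filenames with spaces; a sketch assuming GNU find and xargs:

```shell
# Recursively optimize every .jpeg under the current directory.
# -print0 / -0 keep filenames with spaces intact; -r skips the
# jpegoptim run entirely when find matches nothing (GNU xargs).
find . -type f -name '*.jpeg' -print0 | xargs -0 -r jpegoptim
```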


From the man page:

       -d<path>, --dest=<path>
             Sets an alternative destination directory where optimized files are saved (default is to overwrite the originals). Please note that unchanged files won't be added to the destination directory. This means if the source file can't be compressed, no file will be created in the destination path.

       -f, --force
             Force optimization, even if the result would be larger than the original file.

       -h, --help
             Displays short usage information and exits.

       -m[0..100], --max=[0..100]
             Sets  the maximum image quality factor (disables lossless optimization mode, which is by default enabled). This option will reduce quality of
             those source files that were saved using higher quality setting.  While files that already have lower  quality  setting  will  be  compressed
             using the lossless optimization method.

       -n, --noaction
             Don't really optimize files, just print results.

       -o, --overwrite
             Overwrite target file even if it exists (when using -d option).

       -p, --preserve
             Preserve file modification times.

       -q, --quiet
             Quiet mode.

       -t, --totals
             Print totals after processing all files.

       -v, --verbose
             Enables verbose mode (positively chatty).

       -s, --strip-all
             Strip all (Comment & Exif) markers from the output file. (NOTE: by default only Comment & Exif markers are kept; everything else is discarded.)

       --strip-com
             Strip Comment (COM) markers from the output file.

       --strip-exif
             Strip EXIF markers from the output file.

       --strip-iptc
             Strip IPTC markers from the output file.

       --strip-icc
             Strip ICC profiles from the output file.
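Putting several of these options together: the following sketch caps quality at 85 (an arbitrary example value), strips metadata, preserves timestamps, and writes results to a separate directory so the originals survive. Test on copies of your images first:

```shell
# Write optimized copies into ./optimized/ rather than overwriting;
# --max=85 enables lossy mode with the quality factor capped at 85
mkdir -p optimized
jpegoptim --max=85 --strip-all --preserve --dest=optimized --totals *.jpeg
```

Note that, as the man page excerpt above says, files that cannot be shrunk will not appear in the destination directory at all.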

  1. jpegoptim home page.
  2. See the jpegoptim(1) man page.

Posted by: Vivek Gite

The author is the creator of nixCraft and a seasoned sysadmin, DevOps engineer, and a trainer for the Linux operating system/Unix shell scripting. Get the latest tutorials on SysAdmin, Linux/Unix and open source topics via RSS/XML feed or weekly email newsletter.

16 comments

  1. Here’s a neat trick for everything in an entire directory (recursive)

    find . -type f -name "*.jpg" -exec jpegoptim {} \;
  2. Hi,

    Thank you for the tip. I am looking for an effective way to optimize all the images (more than 3000 JPG images) on my server, powered by WordPress.

    Of course I plan to run some tests and see how the optimization works (before I run it in production), but I have a question. How effective is it to use jpegoptim to optimize more than 3000 images in bulk? Can I rely on this optimization tool to reduce the overall file size of the images without compromising the on-screen quality too much?


  3. Hi, Julian,

    If I were you, I would not rely on just an external point of view of what “without compromising too much the quality” might be.

    Go and fire some tests, find parameters that will fit your needs. Anyway, this is harmless, as long as you keep a backup of originals.

    And afterwards, we would be delighted to know what parameters fit your needs, so feel free to give us feedback and share your experience, as a reward for Vivek’s hard work!

    1. Hi Philippe.

      Thank you for your response. I definitely won’t rely on just an external point of view. I will do my own tests with a small set of pictures, make sure I keep backups of the original files, and take these kinds of precautions before applying the changes on the production server. My question is more about what other people’s experiences were using jpegoptim to reduce overall file size, beyond the theory, for a real case of a website with thousands of images.

      My goals with this optimization are:
      a. Reduce the transfer bandwidth (and cost) using CloudFront
      b. Make the pages load faster
      c. Improve the user experience with a faster website
      d. From (b), see if this helps to increase the SEO visibility even more.

      After checking the documentation and other resources, I liked the idea of experimenting with the --max parameter and using --strip-all to remove all metadata from the images. Plus, if everything goes well, I like the idea of adding a cron job to optimize recent files from time to time.

      I will keep you posted once I get any conclusion. Thanks.
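That cron idea might look something like this; the path, schedule, and the -mtime window are all hypothetical:

```shell
# Hypothetical crontab entry: every Sunday at 03:00, losslessly
# optimize .jpg files modified during the last 7 days
0 3 * * 0 find /var/www/uploads -type f -name '*.jpg' -mtime -7 -print0 | xargs -0 -r jpegoptim --preserve
```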

      1. @Julian,

        a. Reduce the transfer bandwidth (and cost) using CloudFront
        b. Make the pages load faster

        I am currently using both jpegoptim and optipng to optimize images. Both worked nicely without reducing quality. You can see 7-30% reduced bandwidth usage. And a faster site means happy users as well as increased visibility in search engines.

        c. Improve the user experience with a faster website

        See above.

        d. From (b), see if this helps to increase the SEO visibility even more.

        Other factors also affect SEO (such as back links, social media, speed, and more); check out this guide for more info. If I were you, I would start with Google PageSpeed suggestions. Also, try the Apache/Nginx PageSpeed module.

        @Philippe, thanks for keeping conversation alive :)


  4. Hi,
    Thanks a lot for this post. On my EC2 install (a recent 64-bit Amazon Linux), I do not have apt-get (and yum install jpegoptim does not work either).
    Any idea how I can install jpegoptim on Amazon Linux / CentOS?

  5. Hi Christian,

    I was able to install it from epel repository.

    Did you try:

    rpm -ivh epel-release-6-8.noarch.rpm
    yum --enablerepo=epel install jpegoptim


  6. How could I use this in conjunction with find to recursively convert all images in a folder, while keeping a backup of the original?
