Optimizing Images in Terminal

Optimizing images is really important, not only for the ones on your website, but also for the images you’re about to send somewhere, for example.

I wrote a post about “2 Small Script/helpers for Images”, and I still use those on a daily basis. But now I’ve added a few more, which means I could also ditch the image optimizing program I had in OS X: ImageOptim. It’s a really great program, and on Linux there’s a similar one called Trimage. I’ve used that one too, but on the last install I felt it took a lot of resources.

Anyway, now with all my small scripts, which work independently or together, I could ditch the programs and can now use the same files on all computers/installations. +1

So, the scripts I have and use now are:

mkjpg

This is used just to convert an image to a jpg real quick. Perfect for screenshots etc., where the settings usually are set to png or tiff. They can get quite large in file size, so just typing mkjpg and dragging and dropping the image into the terminal window is a real time saver, instead of opening up an image editor, fiddling through menus, export/save as, etc etc… Helps me out (almost) every day.

Usage

# mkjpg <image> [<percent>]
mkjpg Image.png
mkjpg Image.png 60          # Quality: 60%
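
The full script isn’t included in this post, but a minimal version could look something like this, using ImageMagick’s convert (the default quality of 80 here is just an example, not necessarily what the real script uses):

#!/usr/bin/env bash
# mkjpg sketch (not necessarily the original script): convert an image to a jpg.
# Assumes ImageMagick's `convert` is installed; default quality of 80 is a guess.

file="$1"
quality="${2:-80}"

[ -f "$file" ] || { echo "Usage: mkjpg <image> [<percent>]"; exit 1; }

convert "$file" -quality "$quality" "${file%.*}.jpg"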

mkthumb

This one takes an image and creates a thumbnail/miniature of it. If no argument is added after the image path, it will default to 250px (which is the desired size on the Arch forum, for example). Max size is set to 500px, since anything larger really doesn’t qualify as a “thumb”. :)

Usage

# mkthumb <image> [size]
mkthumb Image.jpg
mkthumb ~/Desktop/Another_Image.jpg 400

That will create:

Image_250px.jpg
~/Desktop/Another_Image_400px.jpg
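
A minimal version of this one could look roughly like below, again assuming ImageMagick’s convert and treating the size as the maximum dimension (that part is an assumption):

#!/usr/bin/env bash
# mkthumb sketch (not necessarily the original script): create a thumbnail.
# Assumes ImageMagick's `convert`; the size is treated as the max dimension.

file="$1"
size="${2:-250}"                    # default: 250px

[ -f "$file" ] || { echo "Usage: mkthumb <image> [size]"; exit 1; }
[ "$size" -gt 500 ] && size=500     # cap at 500px

# The `>` flag only shrinks images, it never enlarges them.
convert "$file" -resize "${size}x${size}>" "${file%.*}_${size}px.${file##*.}"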

jpgopt

It’s a wrapper for jpegoptim, and it also uses mkjpg if needed. That means I can run it on a big png image, like a screendump, and get it converted to a jpeg and optimized in one go. The for loop in the script also makes it easy to use a wildcard and run it on a whole directory.

If the input is a png, it will not delete that file. Sometimes you want to keep the lossless original and just have a smaller, nicer formatted image for the web, or to send/post somewhere.

Usage

jpgopt /path/to/file.jpg            1 file
jpgopt /path/to/{more,files}.jpg    More files
jpgopt 60 /path/to/file.jpg         60% quality
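
As a sketch of how the pieces could fit together (the argument handling and the default quality of 80 are assumptions, not the original script):

#!/usr/bin/env bash
# jpgopt sketch: wrap jpegoptim and convert pngs via mkjpg first.
# Assumes mkjpg is in PATH; the default quality of 80 is a guess.

quality=80
case "$1" in
    [0-9]|[0-9][0-9]|100) quality="$1"; shift ;;    # optional quality first
esac

for file in "$@"; do
    if [[ "$file" == *.png ]]; then
        mkjpg "$file" "$quality"                    # the original png is kept
        file="${file%.png}.jpg"
    fi
    jpegoptim --strip-all --max="$quality" "$file"
done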

pngopt

I was looking at a few different optimizers - pngcrush, optipng, etc. - but decided to go with optipng. It does a really nice job.

Here’s the line doing the optimization:

optipng -strip all -quiet -o7 "${file}" -out "${file}.opt";

After that it compares the file sizes and deletes one of them, the optimized or the original, depending on the outcome. If the image didn’t get any smaller it keeps the original; otherwise it renames the optimized image to the input name.

The verbose mode is just to see the file names and/or which one it kept (e.g. mv -v … and rm -v …).
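
The compare-and-keep part isn’t quoted here, but the logic described above could look roughly like this (a sketch, not the original; note that stat differs between OS X and Linux, hence the fallback):

# Keep whichever file is smaller (sketch of the logic, not the original script).
# `stat -f%z` is the BSD/OS X form; `stat -c%s` is the GNU/Linux fallback.
orig=$(stat -f%z "${file}" 2>/dev/null || stat -c%s "${file}")
opt=$(stat -f%z "${file}.opt" 2>/dev/null || stat -c%s "${file}.opt")

if [ "$opt" -lt "$orig" ]; then
    pct=$(awk "BEGIN { printf \"%.1f\", 100 - 100 * $opt / $orig }")
    mv "${file}.opt" "${file}"      # optimized won (mv -v in verbose mode)
    echo ":: Optimized! (${pct}%)"
else
    rm "${file}.opt"                # no gain, keep original (rm -v in verbose)
fi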

Usage

pngopt <file(s)>        1 or more files
pngopt -v               Verbose, show filenames

Example:

$ pngopt Image.png
:: Optimized! (42.1%)

So, that is all for now… 4 scripts I can use on both OS X and Linux. I could have continued to use the GUIs, but on OS X the last version forced some of the settings, and Trimage isn’t packaged for all distributions - I had to build it myself once. With a few scripts I only need to keep their dependencies up to date.

And the big win here is… Since I (mostly) use a custom directory, /usr/local/xbin, for my own scripts, instead of mixing them into some other **/bin folder, it’s really easy to move them between computers and/or when installing a new one. All in one place.
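
The only extra step for that is to have the directory on the PATH, e.g. in ~/.bashrc or ~/.zshrc:

# Add the custom script directory to PATH
export PATH="/usr/local/xbin:$PATH"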


Happy hacking…

/Eric
