Automagically GZip a file remotely before downloading it with SCP!

I don’t know about you, but since I don’t live in South Korea, I do not have near-infinite bandwidth. If you have a file on a Unix server that you want to pull down with SCP, it sure would be awesome if you could automatically compress the file on the remote server before downloading it locally, saving you space on your local disk and time on the transfer. Rsync’s delta transfer won’t help you here if you don’t already have an older copy of the file on your local drive and are just downloading a “fresher” version.
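The core idea can be sketched as a plain one-liner, no tooling required: run gzip on the remote host and redirect its stdout to a local file (the host name “myserver” below is a hypothetical stand-in). The same pipeline is then demonstrated locally so it runs anywhere:

```shell
# The idea in one line (hypothetical host "myserver"):
#   ssh myserver "gzip --best -c /var/logs/bigfile.log" > bigfile.log.gz
# The same gzip-to-stdout pipeline, run locally on a throwaway test file:
yes 'repetitive log line' | head -n 1000 > bigfile.log   # make a compressible file
gzip --best -c bigfile.log > bigfile.log.gz              # compress to stdout, redirect
wc -c bigfile.log bigfile.log.gz                         # compare the two sizes
```

Highly repetitive data like logs compresses extremely well, which is exactly why paying a little remote CPU time saves a lot of transfer time.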

But wait, there’s more! What about wrapping it up into a Bash function so you can call it easily?

This is another use of GNU Parallel to do the heavy lifting: it compresses the file remotely, downloads the newly compressed file (using rsync over passwordless SSH), and then deletes the compressed file on the remote server (so no junk is left behind).

function remote_gzip {
    parallel -S "$1" --cleanup --return {/}.gz "gzip --best {} -c > {/}.gz" ::: "$2"
}

You can put this Bash function into your .bashrc or something similar, where it will always be with you. Make sure to source the file to pull the function into your current shell: “. ~/.bashrc”.
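To check that the function actually landed in your shell, you can source the file it lives in and ask Bash about it. Here the function is written to a standalone file rather than ~/.bashrc, so the sketch doesn’t touch your real dotfiles:

```shell
# Write the function to a file (stand-in for ~/.bashrc), source it,
# and confirm Bash now knows about it.
cat > remote_gzip.sh <<'EOF'
function remote_gzip {
    parallel -S "$1" --cleanup --return {/}.gz "gzip --best {} -c > {/}.gz" ::: "$2"
}
EOF
. ./remote_gzip.sh
type remote_gzip   # prints the function definition if sourcing worked
```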

So if you have a server (say, myserver) and a file at location /var/logs/bigfile.log, call it like this:


remote_gzip myserver /var/logs/bigfile.log

In your current working directory you will have bigfile.log.gz.
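From there, gunzip restores the original file and removes the .gz copy. A quick local round-trip with a stand-in file (the real bigfile.log.gz would come from the transfer):

```shell
echo "sample log contents" > bigfile.log
gzip --best -f bigfile.log     # leaves bigfile.log.gz, as the transfer would
gunzip bigfile.log.gz          # restores bigfile.log and removes the .gz
cat bigfile.log
```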


