Run Commands Only If the Previous Ones Succeed


&&

In the previous section, you saw that ; separates commands, as in this example:


$ unzip /home/scott/music/JohnColtrane.zip ; mkdir -p /home/scott/music/coltrane ; mv /home/scott/music/JohnColtrane*.mp3 /home/scott/music/coltrane/ ; rm /home/scott/music/JohnColtrane.zip


What if you fat-finger your command and instead type this:


$ unzip /home/scott/JohnColtrane.zip ; mkdir -p /home/scott/music/coltrane ; mv /home/scott/music/JohnColtrane*.mp3 /home/scott/music/coltrane/ ; rm /home/scott/music/JohnColtrane.zip


Instead of unzip /home/scott/music/JohnColtrane.zip, you accidentally enter unzip /home/scott/JohnColtrane.zip. You fail to notice this, so you press Enter, get up, and walk away. Your computer can't unzip /home/scott/JohnColtrane.zip because that file doesn't exist, so it blithely continues on to the next command (mkdir), which it performs without a problem. The third command (mv) fails, however, because there aren't any MP3 files to move, since unzip never did its job. Finally, the fourth command runs, deleting the zip file (notice that you provided the correct path this time) and leaving you with no way to recover and start over. Oops!

Note

Don't believe that this chain of events can happen? I did something very similar to it just a few days ago. Yes, I felt like an idiot.


That's the problem with using ;: commands run in sequence, regardless of whether they complete successfully. A better method is to separate the commands with &&, which also runs each command one after the other, but only if the previous one completes successfully (technically, each command must return an exit status of 0 for the next one to run). If a command fails, the entire chain of commands stops.
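
If you want to see this behavior for yourself without risking any files, the true and false commands make a harmless test: true does nothing but succeed (exit status 0), and false does nothing but fail (exit status 1). The echo messages here are just placeholders.

$ false ; echo $?
1
$ true ; echo $?
0
$ false && echo "You will never see this"
$ true && echo "You will see this"
You will see this
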

If you'd used && instead of ; in the sequence of previous commands, it would have looked like this:


$ unzip /home/scott/JohnColtrane.zip && mkdir -p /home/scott/music/coltrane && mv /home/scott/music/JohnColtrane*.mp3 /home/scott/music/coltrane/ && rm /home/scott/music/JohnColtrane.zip


Because the first unzip command couldn't complete successfully, the entire process stops. You walk back later to find that your series of commands failed, but JohnColtrane.zip still exists, so you can try once again. Much better!

Here are two more examples that show you just how useful && can be. In Chapter 13, "Installing Software," you're going to learn about apt, a fantastic way to upgrade your Debian-based Linux box. When you use apt, you first update the list of available software, and then find out if there are any upgrades available. If the list of software can't be updated, you obviously don't want to bother looking for upgrades. To make sure the second process doesn't (uselessly) occur, separate the commands with &&:

# apt-get update && apt-get upgrade 


Example two: You want to convert a PostScript file to a PDF using the ps2pdf command, print the PDF, and then delete the PostScript file. The best way to set up these commands is with &&:

$ ps2pdf foobar.ps && lpr foobar.pdf && rm foobar.ps 


If you had instead used ; and ps2pdf failed, rm would still delete the PostScript file, leaving you with no PDF and no way to start over.
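
If you want to prove this to yourself safely, you can simulate the failed conversion with a throwaway file; here dummy.ps is just a made-up name, and false stands in for a ps2pdf command that fails:

$ touch dummy.ps
$ false ; rm dummy.ps      # rm runs anyway, and dummy.ps is gone
$ touch dummy.ps
$ false && rm dummy.ps     # the chain stops, and dummy.ps survives
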

Now are you convinced that && is often the better way to go? If there's no danger that you might delete a file, ; might be just fine, but if one of your commands involves rm or something else from which there is no recovery, you'd better use && and be safe.


