Sounds like a fine plan. Some suggestions:
Learn to automate everything you can. Make it a habit. If you do something more than a couple times, put it in a script. It's not just to avoid typing but to document the process. Improve your scripts as you notice problems. Share your scripts when appropriate.
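To make that concrete, here's the flavor of script I mean (the path and the seven-day cutoff are invented for illustration, not from any real system):
#!/bin/sh
# clean-builds.sh: delete stale object files from the scratch area.
# /data/builds and the 7-day cutoff are hypothetical; adjust to taste.
set -e
find /data/builds -name '*.o' -mtime +7 -print -exec rm -f {} +
The point isn't the five lines themselves; it's that the next person (or you in six months) can read exactly what "clean out the build area" means.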
Learn the power of pipelining. Discover the purpose of the xargs command. Re-write a standard command-line utility like grep or sort in the language of your choice. (I'm partial to Perl, but that's almost cheating. ;-)
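To give one small illustration of what I mean by pipelining (the file names are invented): list every C file that mentions TODO, count their lines, and sort by size:
$ grep -l 'TODO' *.c | xargs wc -l | sort -n
grep -l prints the matching file names, xargs hands them to wc as arguments, and sort -n orders the result numerically. Once pipes like that feel natural, a lot of throwaway "tools" never need to be written.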
Customize your .bashrc file. Know what settings you like and which ones don't work for you.
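The specifics are a matter of taste; these are just the sort of lines I mean (the values are illustrative, not recommendations):
# ~/.bashrc (excerpt)
HISTSIZE=10000                  # keep a long command history
HISTCONTROL=ignoredups          # skip consecutive duplicate entries
alias ll='ls -lF'               # the usual listing shortcut
alias grep='grep --color=auto'  # GNU grep option; drop it elsewhere
PS1='\u@\h:\w\$ '               # user@host:cwd prompt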
Use ksh rather than bash for scripts. There aren't many differences, but ksh has a few extra features that are very nice to have. I prefer bash for interactive shells, however.
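One example of the kind of extra I have in mind: ksh93 does floating-point arithmetic natively, while bash arithmetic is integer-only (in bash you end up shelling out to awk or bc). The exact output formatting may vary by ksh version:
$ ksh -c 'echo $((3.5 * 2))'
7
$ bash -c 'echo $((3.5 * 2))'    # fails with an arithmetic syntax error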
Seems like the other answers suggest focusing on "real programming languages". I won't say that's bad advice, but in my experience too few programmers make good use of the command line. Over a career, good use of shell scripting saves countless hours and lots of tedium.
Let me give you an example. This weekend I began putting new code into our production system. We had spent the previous week testing it and everything looked good. Ideally, you'd want to have a perfect clone of the operational system so that you're testing apples to apples. But we can't afford two copies of the hardware, so we borrow production machines to run tests on and swap them into production when we perform the upgrade.
Now to distinguish between our operations and testing, we use two different accounts. So before putting a system into operations, we clean out certain files generated by the testing account. Basically it's a two-step process:
1. Find all the files created by the testing user.
2. Blow them away.
I imagine it would take me a minute or two to write the code to do that in Perl and another couple of minutes to test it. It's a simple job. I'm not even sure how to go about it in C/C++. I think you'd start with a stat of the root directory.
But everyone who has mastered shell scripting is jumping up and down, waving their hands and shouting out the answer, because you can write the code in the time it takes to type it:
$ find /data -user test | xargs rm -rf
Testing consists of running the command and watching for errors. This particular problem is a softball pitch right in the wheelhouse for bash. Perl gets the job done, but it's a bit less natural. (I'd use find2perl, which just adds a step.) Attempting this in C or C++ would be a quixotic quest. Those languages are designed to solve different problems.
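One caveat worth hedging on: if the test account could have created names containing spaces or newlines, the null-delimited form is safer, assuming your find and xargs support -print0 and -0 (GNU and the BSDs do):
$ find /data -user test -print0 | xargs -0 rm -rf
A plain find /data -user test -exec rm -rf {} + does the same job with POSIX-only features. Either way it's still a one-liner, which is the point.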
Now if you don't work in a UNIXy environment, there's probably a toolset designed for doing this sort of thing. (I'm no expert, but in Windows I'd probably run a search to get all the files in one window, select all and delete. Very nearly as easy. I don't know how to automate it, however.) But if you plan on finding a job in the UNIX/Linux world, you must be familiar with the command line so that you don't take 5 minutes to do a 30-second job.