I have a file that looks like this:

ftp://url1/files1.tar.gz dir1
ftp://url2/files2.txt dir2
.... many more...

What I want to do are these steps:

  1. Create directory based on column 2
  2. Unix 'cd' to that directory
  3. Download file with 'wget' based on column1

But why doesn't this approach of mine work?

while(<>) {
  chomp;
  my ($url,$dir) = split(/\t/,$_);
  system("mkdir $dir");
  system("cd $dir");   
  system("wget $url"); # This doesn't get executed
}

What's the right way to do it?

+4  A: 

I'll tell you one thing that's wrong. The `system("cd $dir");` will create a sub-shell, change into the directory within that sub-shell, and then exit.

The process running Perl will still be in its original directory.
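You can verify this with a short sketch (it assumes a `/tmp` directory exists, as on any Unix box):

```perl
use strict;
use warnings;
use Cwd qw(getcwd);

my $before = getcwd();
system("cd /tmp");      # the chdir happens inside a child shell, which then exits
my $after = getcwd();

# The parent Perl process is unaffected:
print "unchanged\n" if $before eq $after;   # prints "unchanged"
```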

I'm not sure if that's your specific problem since `# This doesn't get executed` is a little light on detail :-)

One possible fix is:

system("mkdir $dir && cd $dir && wget $url");

That will do the whole lot in one sub-shell, so it shouldn't suffer from the problem mentioned above.
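A middle ground (a sketch only; the sample values are hypothetical, and it assumes `wget` is on your PATH) is to let Perl handle the directory work with `chdir` and shell out only for the download:

```perl
use strict;
use warnings;

my ($url, $dir) = ("ftp://url1/files1.tar.gz", "dir1");  # hypothetical sample line

mkdir $dir or die "mkdir $dir: $!" unless -d $dir;
chdir $dir or die "chdir $dir: $!";
system("wget", $url) == 0
    or warn "wget failed for $url\n";   # list form: no shell, no quoting problems
chdir ".." or die "chdir ..: $!";
```

Because `chdir` changes the Perl process's own directory, it sticks for the `system` call that follows it, unlike a `cd` run in a child shell.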


In fact, this script works fine:

use strict;
use warnings;
system ("mkdir qwert && cd qwert && pwd && cd .. && rmdir qwert");

outputting:

/home/pax/qwert
paxdiablo
You should probably use `&&`, just in case the `mkdir` or `cd` fail for some reason.
cjm
Good point and fixed. We once had an install script on HPUX, run as root, which did a cd to the install directory and then chown'ed and chmod'ed everything under there to the specific user and permissions. Unfortunately we misspelt the install directory, the script stayed in `/`, and every damn file on the box got changed. Oops, back to single-user mode we go :-)
paxdiablo
neversaint
Then you could replace the `&&` (and sacrifice the error checks). The answer I gave worked fine on my system, so I'm not sure what shell you're using. You may have to select a specific shell, something like `system ("bash -c 'mkdir ... '");`.
paxdiablo
If you aren't going to use Perl, don't bother using Perl. Just use a shell script. :)
brian d foy
I noticed the smiley @brian, but you raise a valid point. Doing it in 'Perl proper' is definitely the _right_ thing to do (hence my +1 to rjh) but sometimes all you need is a quick'n'dirty fix. I wouldn't use my answer long-term or in important code but I'm nothing if not pragmatic :-)
paxdiablo
+11  A: 

Use native Perl solutions where possible:

  • cd can be done with chdir
  • mkdir can be done with mkdir
  • mkdir -p (don't die if dir exists, recursive creation) can be done with File::Path which comes with Perl
  • wget can be done with LWP::Simple

How I would implement this:

use File::Spec::Functions qw(catfile);  # adds a '/' between things (or '\' on Windows)
use LWP::Simple qw(mirror);
use File::Path qw(mkpath);
use File::Basename;
use URI;

while (<>) {
    chomp;
    my ($url, $dir) = split /\t/;
    mkpath($dir);

    # Use the 'filename' of the $url to save 
    my $file = basename(URI->new($url)->path);
    mirror($url, catfile($dir, $file));
}
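To see what the filename logic produces, here is the same extraction run on one of the sample lines from the question (the `/` separator in the result assumes Unix; `catfile` would use `\` on Windows):

```perl
use strict;
use warnings;
use URI;
use File::Basename;
use File::Spec::Functions qw(catfile);

my $url  = "ftp://url1/files1.tar.gz";
my $file = basename(URI->new($url)->path);   # "files1.tar.gz"
print catfile("dir1", $file), "\n";          # prints "dir1/files1.tar.gz"
```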

If you do this, you get:

  • Portability between platforms
  • Portability between shells
  • Perl exception handling (via return values or die)
  • Perl input/output (no need to escape anything)
  • Flexibility in the future (if you change the way you calculate filenames, or how you store the web content, or if you want to run web requests in parallel)
rjh