views: 42
answers: 5

I am trying to have my script take an arbitrary number of file names as command-line arguments. In Unix, it is possible to use the '*' wildcard to stand for any sequence of characters. For example,

ls blah*.txt 

will list every file whose name begins with blah and ends with .txt.

I need something like this for my python script.

python myscript.py blah*.txt

Is this possible? And if so, how can it be done?

A: 

I think you want to use glob.
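
A minimal sketch of that approach, assuming the pattern is quoted on the command line (e.g. python myscript.py 'blah*.txt') so the shell passes it through unexpanded:

```python
import glob
import sys

# Expand each shell-style pattern ourselves with glob;
# each match is an actual filename on disk.
for pattern in sys.argv[1:]:
    for name in glob.glob(pattern):
        print(name)
```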

Skilldrick
A: 

There is also fnmatch for when you don't want to have to worry about the path.
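
A small illustration (the filenames are made up): fnmatch tests a bare name against a shell-style pattern without touching the filesystem at all.

```python
import fnmatch

# Match plain strings against a glob pattern; no files need to exist.
names = ["blah1.txt", "blah2.txt", "notes.md"]
matches = [n for n in names if fnmatch.fnmatch(n, "blah*.txt")]
print(matches)  # -> ['blah1.txt', 'blah2.txt']
```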

jathanism
+1  A: 

That's what sys.argv is for (and don't forget to import sys). On Unix the shell expands blah*.txt before your script starts, so sys.argv will already contain all the matching filenames as a list.

domino
Hmmmm I did not know it could handle it. Thanks!
Donovan
A: 
import sys
import glob

# Expand every command-line pattern and flatten the results into one list
files = [f for arg in sys.argv[1:] for f in glob.glob(arg)]
carl
+2  A: 
import sys

for arg in sys.argv[1:]:
  print arg

In Unix-land, the shell does the job of glob-expanding the command-line arguments, so you don't need to do it yourself. If you're processing a bunch of files in sequence, you might also look at the fileinput module, which works like Perl's "magic ARGV" handle with the -n and -i flags. It lets you loop over every line of every file named on the command line, optionally renaming each file to a backup and reopening stdout on the file's original name, which lets you do something as simple as:

import fileinput

for line in fileinput.input(inplace=True, backup='.bak'):
  # filelineno() returns an int, so convert it before concatenating;
  # the trailing comma avoids doubling the newline already in `line`
  print str(fileinput.filelineno()) + ": " + line,

to add the line number to the beginning of every line of every file on the command line, while saving the originals as filename.bak.

hobbs