views: 138
answers: 3
I need to write some scripts to carry out some tasks on my server (running Ubuntu Server 8.04 LTS). The tasks are to be run periodically, so I will be running the scripts as cron jobs.

I have divided the tasks into "group A" and "group B" - because (in my mind at least), they are a bit different.

Task Group A

  1. Import data from a file and possibly reformat it - by reformatting, I mean doing things like sanitizing the data, possibly normalizing it, and/or running calculations on 'columns' of the data

  2. Import the munged data into a database. For now, I am using MySQL for the vast majority of imports, although some files will be imported into an SQLite database.

Note: The files will be mostly text files, although some of the files are in a binary format (my own proprietary format, written by a C++ application I developed).

Task Group B

  1. Extract data from the database
  2. Perform calculations on the data and either insert or update tables in the database.

My coding experience is primarily as a C/C++ developer, although I have been using PHP as well for the last 2 years or so (plus a few other languages which are not relevant for the purpose of this question). I am from a Windows background, so I am still finding my feet in the Linux environment.

My question is this - I need to write scripts to perform the tasks I described above. Although I suppose I could write a few C++ applications to be called from shell scripts, I think it may be better to write them in a scripting language (maybe this is a flawed assumption?). My thinking is that it would be easier to modify things in a script - no need to rebuild etc. for changes to functionality. Additionally, data munging in C++ tends to involve more lines of code than in "natural" scripting languages such as Perl, Python etc.

Assuming that the majority of people here agree that scripting is the way to go, herein lies my dilemma: which scripting language should I use to perform the tasks above (given my background)?

My gut instinct tells me that Perl (shudder) would be the most obvious choice for performing all of the above tasks. BUT (and that is a big BUT), the mere mention of Perl makes my toes curl, as I had a very, very bad experience with it a while back (I bought the Perl Camel book and 'Data Munging with Perl' many years ago, but still could not 'grok' it - it just felt too alien). The syntax seems quite unnatural to me, despite how many times I have tried to learn it, so if possible I would really like to give it a miss. As for PHP (which I already know), I am not sure it is a good candidate for scripting on the CLI (I have not seen many examples of how to do this, so I may be wrong).

The last thing I must mention is that IF I have to learn a new language in order to do this, I cannot afford (time constraint) to spend more than a day learning the key commands/features required to do this (I can always learn the details of the language later, once I have actually deployed the scripts).

So, which scripting language would you recommend (PHP, Python, Perl, [insert your favorite here]) - and most importantly WHY? Or, should I just stick to writing little C++ applications that I call in a shell script?

Lastly, if you have suggested a scripting language, can you please show with a FEW lines (Perl mongers - I'm looking in your direction [nothing too cryptic!] ;) ) how I can use the language you suggested to do what I am trying to do i.e.

  • load a CSV file into some kind of data structure where you can access data columns easily for data manipulation
  • dump the columnar data into a MySQL table
  • load data from a MySQL table into a data structure that allows columns/rows to be accessed in the scripting language

Hopefully, the snippets will allow me to quickly spot the languages that will pose the steepest learning curve for me - as well as those that are simple, elegant and efficient (hopefully those two criteria [elegance and a shallow learning curve] are not orthogonal - though I suspect they might be).

+1  A: 

I'd go with Python or Ruby. You will most likely find them much faster/easier to pick up than Perl, and they are still very powerful/efficient languages in their own right for "data munging". You should be able to pick up either of them in a day or less, not counting looking up random library functions every so often.

To pick up Python quickly: http://diveintopython.org/toc/index.html

I personally can't recommend a Ruby tutorial myself, but I'm sure others can chime in with good options.

If you want to play around with either, http://www.trypython.org and http://www.tryruby.org each host online interactive-shell versions of the interpreters for their respective languages.

Amber
@Amber: I suspected Python would feature in the answers. I am not too averse to it, as I have played with it in the past. Early days yet - I'll wait and see if a consensus opinion emerges. Thanks for your valued input.
morpheous
Could those who downvoted this please have the courtesy to comment on why?
Amber
@Amber: Probably for unsupported opinions. s/Go with/I would go with/, s/You will most likely/I found Python/, and delete "coming from a C/C++ background" completely (although if you come/came from a C/C++ background, and really feel that this somehow was a factor in the time that it took to pick up Python vs. Perl, then feel free to edit as appropriate).
runrig
@runrig: I don't really see my answer as having much more in the way of "unsupported opinions" than other answers which were not downvoted, and I've done my best to give links to resources that would help the OP decide for themselves. I'd personally assume that any answer on this site for this kind of question is going to be an opinion, given that there's almost never a single answer to the question of "what language should I write this in?"
Amber
Also, I think the "you will most likely" comment is appropriate *given that* the OP already stated they had trouble wrapping their head around Perl, but also is experienced with C/C++ and thus doesn't have trouble with programming in general, just Perl specifically.
Amber
+2  A: 

import data from a file and possibly reformat it

Python excels at this. Be sure to read up on the csv module so you don't waste time reinventing it yourself.
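
For example, something along these lines reads a CSV file into a list of dictionaries keyed by the header row (the 'price' column is just a placeholder for whatever your columns are actually called):

import csv

# Read data.csv into a list of dicts keyed by the header row,
# so each column can be addressed by name.
csv_file = open('data.csv', 'rb')   # binary mode, as the Python 2 csv module expects
rows = list(csv.DictReader(csv_file))
csv_file.close()

# Example of munging one column: strip whitespace and convert to a number.
for row in rows:
    row['price'] = float(row['price'].strip())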

For binary data, you may have to use the struct module. [If you wrote the C++ program that produces the binary data, consider rewriting that program to stop using binary data. Your life will be simpler in the long run. Disk storage is cheaper than your time; highly compressed binary formats cost more than they are worth.]
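
If you do stay with the binary format, struct unpacks fixed-size records. The layout below (a little-endian int32 followed by two doubles) is only an assumed example - substitute the format string that matches your own file format:

import struct

RECORD = struct.Struct('<idd')   # assumed layout: int32 + two doubles, little-endian

records = []
bin_file = open('data.bin', 'rb')
chunk = bin_file.read(RECORD.size)
while len(chunk) == RECORD.size:
    records.append(RECORD.unpack(chunk))   # each record becomes a tuple of numbers
    chunk = bin_file.read(RECORD.size)
bin_file.close()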

Import the munged data into a database. Extract data from the database. Perform calculations on the data and either insert or update tables in the database.

Use the MySQLdb module for MySQL. SQLite support is built into Python via the sqlite3 module.
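
Roughly like this - the connection details and the table/column names here are placeholders for your own schema:

import MySQLdb

conn = MySQLdb.connect(host='localhost', user='me', passwd='secret', db='mydb')
cur = conn.cursor()

# Dump the munged CSV rows (the list of dicts from csv.DictReader above) into a table.
cur.executemany(
    "INSERT INTO prices (symbol, price) VALUES (%s, %s)",
    [(row['symbol'], row['price']) for row in rows])
conn.commit()

# Task group B: pull the data back out as a sequence of (symbol, price) row tuples.
cur.execute("SELECT symbol, price FROM prices")
data = cur.fetchall()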

Often, you'll want to use an Object-Relational Mapper rather than write your own SQL. Look at SQLObject and SQLAlchemy for this.
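
As a taste of that route, here is a rough SQLAlchemy sketch of the same table without hand-written SQL. The table and column names are made up, and the exact calls vary a little between SQLAlchemy versions:

from sqlalchemy import create_engine, MetaData, Table, Column, String, Float

engine = create_engine('sqlite:///prices.db')   # or a mysql:// URL
metadata = MetaData()
prices = Table('prices', metadata,
               Column('symbol', String(10)),
               Column('price', Float))
metadata.create_all(engine)

with engine.begin() as conn:                    # commits (or rolls back) for you
    conn.execute(prices.insert(), [{'symbol': 'ABC', 'price': 1.23}])
    result = conn.execute(prices.select()).fetchall()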

Also, before doing too much of this, buy a good book on data warehousing. Your two "task groups" sound like you're starting down the data warehousing road. It's easy to get this all fouled up through poor database design. Learn what a "Star Schema" is before you do anything else.

S.Lott
@S.Lott: "sounds like you're starting down the data warehousing road" - I like it when people are able to "reverse engineer" scenarios like this. Indeed you are right, it is a kind of "poor man's data warehouse" I am building.
morpheous
Perl also excels at this. If, however, you don't care for Perl, then go ahead and use Python. Or Ruby.
runrig
+3  A: 

Well, I was you a few years back. I didn't like Perl at all and would rewrite any scripts my peers wrote in Perl into Python - because I could not stand Perl. Long story short - let's just say I am fairly conversant with Perl now. I would recommend a book called "Impatient Perl", which explains the really important stuff quite nicely and which converted me to Perl. :) Another thing is to install the Perl documentation on your computer - this was really important for me - easy and quick access to sample code, etc.

Teaser Script for Task A - to read a file, format it and then write to the database.

use strict;
use warnings;
use autodie qw(:all);
use Text::CSV_XS ();
use DBI ();

my $csv = Text::CSV_XS->new({binary => 1})
  or die 'Cannot use CSV: ' . Text::CSV_XS->error_diag;

{
    my $database_handle = DBI->connect(
        'dbi:SQLite:dbname=some_database_file.sqlite', undef, undef, {
            RaiseError => 1,
            AutoCommit => 1,
        },
    );
    $database_handle->do(
        q{CREATE TABLE something_table_or_other ('foo' CHAR(10), 'bar' CHAR(10), 'baz' CHAR(10), 'quux' CHAR(10), 'blah' CHAR(10))}
    );

    my $statement_handle = $database_handle->prepare(
        q{INSERT INTO something_table_or_other ('foo', 'bar', 'baz', 'quux', 'blah') VALUES (?, ?, ?, ?, ?)}
    );

    {
        open my $file_handle, '<:encoding(utf8)', 'data.csv';
        while (my $columns_aref = $csv->getline($file_handle)) {
            my @columns = @{ $columns_aref };

            # sanitize the columns - maybe substitute commas, numbers, etc.
            for (@columns) {
                s{,}{};  # substitutes commas with nothing
            }

            # insert columns into database now, using placeholders
            $statement_handle->execute(@columns);
        }
    }
}

Note: Given your current distaste for Perl, I would recommend you do the above "tasks" in whatever programming language you are comfortable in. The above is only an attempt to show you that Perl might not be so cryptic after all. You only get cryptic when you don't want to repeat yourself! :)

Bart J
And similar to what the Python poster recommends, if the data is CSV, binary, or fixed-length, there are libraries (Text::CSV_XS, Parse::Binary, Parse::FixedLength) and functions (pack, unpack) to deal with those easily as well. Oh, and also similar to what that poster recommends, let me just add, "Perl excels at this" (as does almost any scripting language).
runrig
Replaced it with exemplary modern code that actually works.
daxim
Thanks, daxim! I was planning to do that tonight. :)
tsee