I'm looking into replacing a bunch of Python ETL scripts that perform nightly / hourly data summarization and statistics gathering on a massive amount of data.

What I'd like to achieve is:

  • Robustness - a failing job / step should be automatically restarted. In some cases I'd like to execute a recovery step instead.
  • Recoverability - the framework must be able to recover from crashes. I guess some persistence would be needed here.
  • Monitoring - I need to be able to monitor the progress of jobs / steps, and preferably see history and statistics with regard to performance.
  • Traceability - I must be able to understand the state of the executions.
  • Manual intervention - nice to have... being able to start / stop / pause a job from an API / UI / command line.
  • Simplicity - I prefer not to get angry looks from my colleagues when I introduce the replacement... Having a simple, easy-to-understand API is a requirement.

The current scripts do the following:

  • Collect text logs from many machines, and push them into Hadoop DFS. We may use Flume for this step in the future (see http://www.cloudera.com/blog/2010/07/whats-new-in-cdh3b2-flume/).
  • Perform Hive summary queries on the data, and insert (overwrite) the results into new Hive tables / partitions (a sketch of this step follows the list).
  • Extract the new summary data into files, and load (merge) it into MySQL tables. This data is needed later for online reports.
  • Perform additional joins between the newly added MySQL data and existing MySQL tables, and update the data.
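
To make the shape of the work concrete, here is a minimal sketch of the Hive summary step (step 2) as a shell-out - essentially what the Python scripts do today, written in Java since that's the direction I'm considering. The table, partition, and column names are made up for illustration:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    // Sketch of the Hive summary step: run a summary query via the hive CLI,
    // the same way the current scripts do. All names below are hypothetical.
    public class HiveSummaryStep {
        public static void main(String[] args) throws IOException, InterruptedException {
            String query =
                "INSERT OVERWRITE TABLE daily_summary PARTITION (dt='2010-08-01') "
              + "SELECT host, COUNT(*) AS events FROM raw_logs "
              + "WHERE dt='2010-08-01' GROUP BY host";

            ProcessBuilder pb = new ProcessBuilder("hive", "-e", query);
            pb.redirectErrorStream(true); // merge stderr into stdout: one log stream
            Process hive = pb.start();

            // Echo hive's output so the step leaves a trace in our own logs.
            BufferedReader out = new BufferedReader(new InputStreamReader(hive.getInputStream()));
            for (String line; (line = out.readLine()) != null; ) {
                System.out.println(line);
            }

            int exitCode = hive.waitFor();
            if (exitCode != 0) {
                // Today, ad-hoc retry / alerting logic lives at points like this
                // in each script - exactly the part I'd want a framework to own.
                throw new IOException("hive exited with code " + exitCode);
            }
        }
    }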

My idea is to replace the scripts with Spring Batch. I also looked into Scriptella, but I believe it is too 'simple' for this case.

Since I've seen some bad vibes about Spring Batch (mostly in old posts), I'm hoping to get some input here. I also haven't seen much about Spring Batch and Hive integration, which is a concern.
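
For reference, what I have in mind with Spring Batch is roughly one step per bullet above, each implemented as a Tasklet. A minimal sketch of the Hive step in that shape (the job wiring would live in the usual Spring XML configuration, and the query would be injected):

    import org.springframework.batch.core.StepContribution;
    import org.springframework.batch.core.scope.context.ChunkContext;
    import org.springframework.batch.core.step.tasklet.Tasklet;
    import org.springframework.batch.repeat.RepeatStatus;

    // Sketch of one pipeline step as a Spring Batch Tasklet. A thrown exception
    // marks the step FAILED in the JobRepository, so the job can be restarted
    // and resume from the failed step - the robustness/persistence I'm after.
    public class HiveSummaryTasklet implements Tasklet {

        private final String query; // injected via Spring configuration

        public HiveSummaryTasklet(String query) {
            this.query = query;
        }

        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
                throws Exception {
            // NB: a real version must also drain hive's stdout (see the sketch
            // in the question above) to avoid blocking on a full pipe buffer.
            Process hive = new ProcessBuilder("hive", "-e", query)
                    .redirectErrorStream(true)
                    .start();
            int exitCode = hive.waitFor();
            if (exitCode != 0) {
                throw new IllegalStateException("hive exited with code " + exitCode);
            }
            return RepeatStatus.FINISHED;
        }
    }

Spring Batch also ships a SystemCommandTasklet that could probably replace most of this hand-rolled shell-out, though I haven't verified how it behaves with the Hive CLI.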

+1  A: 

Why not use JasperETL or Talend? They seem like the right tools for the job.

dukethrash
These tools are valid, but I'm looking for a more programmatic solution. We were actually using Talend before, and ran into some scaling issues. Another problem is that graphical tools are not very popular with the developers - the icons hide too much of the complexity, and understanding the ETL flow has proven to be a big effort for most developers.
Eran Harel
+1  A: 

If you want to stay within the Hadoop ecosystem, I'd highly recommend checking out Oozie to automate your workflow. We (Cloudera) provide a packaged version of Oozie that you can use to get started. See our recent blog post for more details.

Jeff Hammerbacher
Thanks Jeff. Does Oozie integrate with Hive and MySQL?
Eran Harel
Yes. There are Hive and Sqoop actions in the version of Oozie packaged with CDH. Sqoop will allow you to perform imports and exports with MySQL. To run queries on data stored in MySQL, you could use the Java action (http://yahoo.github.com/oozie/releases/2.2.0/WorkflowFunctionalSpec.html#a3.2.7_Java_Action) and a JDBC driver to submit queries.
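
For example, the main class the Java action runs could be as simple as this sketch (the connection details, tables, and SQL are placeholders; in a real workflow they would come from the action's configuration):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch of a main class for an Oozie Java action that runs the follow-up
    // MySQL joins. A non-zero exit or uncaught exception marks the action as
    // failed, which sends the workflow down its error transition.
    public class MySqlUpdateAction {
        public static void main(String[] args) throws Exception {
            Class.forName("com.mysql.jdbc.Driver"); // driver jar on the action's classpath
            Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://db-host/reports", "etl_user", "secret");
            try {
                Statement stmt = conn.createStatement();
                // Hypothetical post-load join/update on the freshly merged rows.
                stmt.executeUpdate(
                    "UPDATE report_totals t JOIN daily_summary s ON t.host = s.host "
                  + "SET t.events = t.events + s.events WHERE s.dt = '2010-08-01'");
            } finally {
                conn.close();
            }
        }
    }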
Jeff Hammerbacher
Thanks again, I will look into it.
Eran Harel