views: 42

answers: 2

We have a box that takes in 10-20 TB of data each day, where each file on the drive is anywhere from megabytes to gigabytes in size.

We want to send all these files to a set of 'pizza boxes', where they will consume and process the files.

I can't seem to find anything built to handle this amount of data besides distcp (Hadoop). Robocopy and the like won't do.

Does anyone know of a solution that can handle this kind of delegation (sharing the work amongst the pizza boxes) and provides reliable file transfer?

+1  A: 

Take a look at Flume http://archive.cloudera.com/cdh/3/flume/UserGuide.html

Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data. It has a simple and flexible architecture based on streaming data flows. It is robust and fault tolerant with tunable reliability mechanisms and many failover and recovery mechanisms. The system is centrally managed and allows for intelligent dynamic management. It uses a simple extensible data model that allows for online analytic applications.

To install it, see https://wiki.cloudera.com/display/DOC/Flume+Installation
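
For a rough idea of what feeding records into Flume looks like, here is a minimal sketch using the Apache Flume NG client SDK (RpcClientFactory / RpcClient). Note that the user guide linked above covers the older Cloudera Flume (OG) release, whose API and configuration differ, and the hostname, port, and payload below are placeholders.

import java.nio.charset.StandardCharsets;

import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

public class FlumeFeeder {
    public static void main(String[] args) {
        // Placeholder host/port of a Flume agent running an Avro source.
        RpcClient client = RpcClientFactory.getDefaultInstance("flume-agent.example.com", 41414);
        try {
            // In practice this would be a record read from one of the large files.
            Event event = EventBuilder.withBody("one record from the source box",
                    StandardCharsets.UTF_8);
            client.append(event); // blocks until the agent acknowledges delivery
        } catch (EventDeliveryException e) {
            // The event was not delivered; a real feeder would retry or rebuild the client.
            e.printStackTrace();
        } finally {
            client.close();
        }
    }
}

In such a setup each pizza box could run its own Flume agent, and Flume's failover and load-balancing mechanisms could spread the incoming data across the boxes.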

Joe Stein
A: 

As already mentioned, Hadoop is the answer, because it is built exactly for this kind of large-scale data. You can create a Hadoop cluster, store the data there, and use the cores of the boxes to analyze it with MapReduce.
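
To make that concrete, here is a minimal sketch of a MapReduce job against the org.apache.hadoop.mapreduce API (Hadoop 2+ style). The class names and the trivial line-counting logic are placeholders for whatever analysis you actually need, and the input path would be an HDFS directory the files have been copied into (for example with distcp).

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class LineCount {

    public static class CountMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final Text KEY = new Text("lines");
        private static final LongWritable ONE = new LongWritable(1);

        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            // Real analysis would parse the record here instead of just counting it.
            ctx.write(KEY, ONE);
        }
    }

    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> counts, Context ctx)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable c : counts) {
                sum += c.get();
            }
            ctx.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "line count");
        job.setJarByClass(LineCount.class);
        job.setMapperClass(CountMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The framework then splits the input and runs the map tasks on whichever nodes hold the data, which is exactly the "share the work amongst the boxes" part of the question.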

khmarbaise