I need to write data into Hadoop (HDFS) from external sources like a Windows box. Right now I have been copying the data onto the namenode and using HDFS's put command to ingest it into the cluster. In my browsing of the code I didn't see an API for doing this. I am hoping someone can show me that I am wrong and that there is an easy way to code external clients against HDFS.

+2  A: 

Install Cygwin, install Hadoop locally (you just need the binary and the configs that point at your NameNode -- no need to actually run the services), then run hadoop fs -copyFromLocal /path/to/localfile /hdfs/path/
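For reference, pointing a local Hadoop client at a remote NameNode is mostly a matter of setting fs.default.name in the client's configuration (hadoop-site.xml in older releases, core-site.xml in newer ones). The host and port below are placeholders for your own NameNode:

    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://namenode.example.com:9000</value>
      </property>
    </configuration>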

You can also use the new Cloudera Desktop to upload files via its web UI, though that might not be a good option for giant files.

There's also a WebDAV overlay for HDFS, but I don't know how stable or reliable it is.

SquareCog
+3  A: 

There is an API in Java. You can use it by including the Hadoop code in your project. The JavaDoc is quite helpful in general, but of course you have to know what you are looking for: hadoop.apache.org/common/docs/

For your particular problem, have a look at: http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileSystem.html (this applies to the latest release, consult other JavaDocs for different versions!)

A typical call would be: FileSystem.get(new JobConf()).create(new Path("however.file")); which returns a stream you can handle with regular Java I/O.
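A minimal, self-contained sketch of that call (untested; the NameNode address is a placeholder, and it uses a plain Configuration rather than a JobConf, which works because JobConf extends Configuration):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsWriteExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder: point this at your own NameNode.
            conf.set("fs.default.name", "hdfs://namenode.example.com:9000");

            FileSystem fs = FileSystem.get(conf);
            FSDataOutputStream out = fs.create(new Path("however.file"));
            try {
                // Write through the stream with regular Java I/O.
                out.writeUTF("hello from an external client");
            } finally {
                out.close();
            }
            fs.close();
        }
    }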

Peter Wippermann
+1  A: 

For the problem of loading the data I needed to put into HDFS, I chose to turn the problem around.

Instead of uploading the files to HDFS from the server where they resided, I wrote a Java Map/Reduce job in which the mapper reads each file from the file server (in this case over HTTPS) and writes it directly to HDFS (via the Java API).

The list of files to fetch is the job's input: an external script populates a file with that list, uploads it into HDFS (using hadoop dfs -put), and then starts the map/reduce job with a decent number of mappers.

This gives me excellent transfer performance, since multiple files are read/written at the same time.
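For the curious, here is a rough sketch of what such a mapper could look like against the old org.apache.hadoop.mapred API. The class name, target directory, and missing error handling are all illustrative assumptions, not the actual job:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Each input record is a URL; the mapper fetches it and streams the
    // bytes straight into HDFS, so N mappers give N parallel transfers.
    public class FetchMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, NullWritable> {

        private JobConf conf;

        public void configure(JobConf conf) {
            this.conf = conf;
        }

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, NullWritable> output,
                        Reporter reporter) throws IOException {
            String url = value.toString();
            String name = url.substring(url.lastIndexOf('/') + 1);

            FileSystem fs = FileSystem.get(conf);
            // Hypothetical target directory on HDFS.
            FSDataOutputStream out = fs.create(new Path("/ingest/" + name));
            InputStream in = new URL(url).openStream();
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
                reporter.progress(); // keep the task alive on big files
            }
            in.close();
            out.close();
            output.collect(value, NullWritable.get());
        }
    }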

Maybe not the answer you were looking for, but hopefully helpful anyway :-).

Erik Forsberg