Hi
Is there a way to convert a valid, existing Hadoop Path object into a useful Java File object? Is there a nice way of doing this, or do I need to bludgeon the code into submission? The more obvious approaches don't work, and it seems like it would be a common bit of code.
void func(Path p) {
    if (p.isAbsolute()) {
        // Throws IllegalArgumentException: "URI scheme is not \"file\"" for hdfs paths
        File f = new File(p.toUri());
    }
}
This doesn't work because Path#toUri() returns a URI with the "hdfs" scheme, and Java's File(URI uri) constructor only accepts URIs with the "file" scheme.
Is there a way to get Path and File to work together?
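The closest thing to a workaround I can come up with is something like the sketch below (the class and method names are my own, not part of Hadoop): if the Path already lives on the local file system, wrap its raw path; otherwise pull a copy down to a local temp file first. Is this really necessary?

import java.io.File;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;

class PathToFile {
    // Sketch only: wrap a local Path directly, copy a remote one to a temp file.
    static File toLocalFile(Path p, Configuration conf) throws IOException {
        FileSystem fs = p.getFileSystem(conf);
        if (fs instanceof LocalFileSystem) {
            // Already local: drop the "file:" scheme and wrap the plain path.
            return new File(p.toUri().getPath());
        }
        // Remote (e.g. hdfs): pull a copy down to the local disk.
        File tmp = File.createTempFile("hdfs-copy-", "-" + p.getName());
        tmp.delete(); // let copyToLocalFile create the destination itself
        fs.copyToLocalFile(p, new Path(tmp.getAbsolutePath()));
        return tmp;
    }
}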
OK, how about a specific, limited example.
Path[] paths = DistributedCache.getLocalCacheFiles(job);
DistributedCache is supposed to provide a localized copy of a file, but it returns a Path. I assume that DistributedCache makes a local copy of the file, so the Path and my code are on the same disk. Given this limited example, where HDFS is hopefully not in the equation, is there a way for me to reliably convert a Path into a File?
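If the localized copy really is on the local disk (which is what I'm assuming getLocalCacheFiles guarantees), then something like the following ought to work, but I'd like confirmation:

Path[] paths = DistributedCache.getLocalCacheFiles(job);
for (Path p : paths) {
    // The localized path has no "hdfs" scheme, so toUri().getPath() yields a
    // plain absolute path that File can take; new File(p.toString()) should
    // be equivalent when there is no scheme prefix.
    File f = new File(p.toUri().getPath());
}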