Q:

`virtual' files

Sometimes I'm downloading big files which have been split into chunks, say, a 1 GByte file split into ten chunks of 100 MBytes each.  Currently, I have to concatenate all chunks into a new file to be able to access it as a whole.  I now wonder whether it is possible to group these chunks into a virtual file (similar to a virtual file system), avoiding this copying.  BTW, I'm using a GNU/Linux box.

A: 

You could use FUSE to make something like this (see the sketch after the list below). But it probably isn't worth it, because:

  1. It would be fairly complex to handle the various split schemes programmatically without false positives.
  2. It's easy to do manually, and only has to be done once per file.
  3. With a good file transfer protocol (e.g. HTTP or BitTorrent), there's no need to split the files to begin with.
Matthew Flaschen
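
For illustration, here is a minimal sketch of such a FUSE filesystem in Python. It assumes the third-party fusepy binding (pip install fusepy) and a Linux kernel with FUSE support; the ConcatFS class, the concatfs.py file name, and the command-line layout are all illustrative, not an existing tool.

    #!/usr/bin/env python3
    # concatfs.py -- expose a list of chunk files as one virtual file.
    # A sketch only: read-only, a single virtual file, no error handling.
    import errno
    import os
    import stat
    import sys

    from fuse import FUSE, FuseOSError, Operations   # pip install fusepy


    class ConcatFS(Operations):
        def __init__(self, name, chunks):
            self.name = name                          # virtual file name
            self.chunks = chunks                      # chunk paths, in order
            self.sizes = [os.path.getsize(c) for c in chunks]

        def getattr(self, path, fh=None):
            if path == '/':
                return {'st_mode': stat.S_IFDIR | 0o755, 'st_nlink': 2}
            if path == '/' + self.name:
                return {'st_mode': stat.S_IFREG | 0o444, 'st_nlink': 1,
                        'st_size': sum(self.sizes)}
            raise FuseOSError(errno.ENOENT)

        def readdir(self, path, fh):
            return ['.', '..', self.name]

        def read(self, path, size, offset, fh):
            # Translate the (offset, size) request into reads on the chunks.
            data = b''
            for chunk, chunk_size in zip(self.chunks, self.sizes):
                if offset >= chunk_size:      # request starts past this chunk
                    offset -= chunk_size
                    continue
                with open(chunk, 'rb') as f:
                    f.seek(offset)
                    data += f.read(size - len(data))
                offset = 0
                if len(data) == size:
                    break
            return data


    if __name__ == '__main__':
        # usage: concatfs.py <mountpoint> <virtual-name> <chunk> [<chunk> ...]
        FUSE(ConcatFS(sys.argv[2], sys.argv[3:]), sys.argv[1], foreground=True)

Mounted with, e.g., python3 concatfs.py /mnt/virtual foobar foo bar, the chunks foo and bar appear as the single read-only file /mnt/virtual/foobar, which is the virtualfile-style interface asked for in the comments below.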
Just for the sake of argument, what if the limitation is a FAT32 filesystem that can't handle files larger than 4GB?
jleedev
Why would anyone use FAT32 on a modern system? Even if you're dual-booting Linux and Windows, you can now reliably use NTFS to share data.
Matthew Flaschen
Well, if you download from, say, rapidshare, you have no choice: you get a bunch of chunks whether you like it or not...
Werner Lemberg
Another comment: what do you mean by `various split schemes'? I just want a `replacement' for `cat'. For example, instead of saying `cat foo bar > foobar', I would like to write `virtualfile foobar foo bar'.
Werner Lemberg
The split schemes come into play if the filesystem is to create the virtually merged file automatically. If you have to execute the virtualfile command manually every time, what's the point? Why not cat the chunks once and be done with it?
Matthew Flaschen
The point of doing it manually is that such a virtually concatenated file would exist for just a few minutes! Often, large files are compressed with e.g. the `rar' program, which can produce chunks automatically. However, sometimes the final rar archive itself gets split into chunks, which means that you have to concatenate them, then call rar, then delete the chunks. Some compression formats, such as DEFLATE (used in gzip), don't need seeking, so piping is possible. The RAR format, however, doesn't work with piping.
Werner Lemberg
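
Werner's piping point, concretely: with a streamable format, the chunks can be fed straight into the decompressor, so the merged copy never touches the disk. A small Python sketch, with hypothetical chunk names, assuming the pieces form a single gzip stream:

    import zlib

    # wbits=31 selects gzip framing; chunks are streamed in 32 KiB blocks.
    d = zlib.decompressobj(wbits=31)
    with open('foobar', 'wb') as out:
        for name in ('foobar.gz.001', 'foobar.gz.002'):  # hypothetical names
            with open(name, 'rb') as f:
                for block in iter(lambda: f.read(1 << 15), b''):
                    out.write(d.decompress(block))
        out.write(d.flush())

As the comment notes, RAR cannot be handled this way, since its tools do not accept the archive as a pipe.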
I see your point with regard to RAR files that you want to decompress and then immediately get rid of. Note that bzip2 also supports piping, through bzcat.
Matthew Flaschen
The problem is that I have no control over the archive format others use, and RAR seems to be extremely popular among Windows users...
Werner Lemberg