A client has a system which reads large files (up to 1 GB) containing multiple video images. Access is via an indexing file which "points" into the larger file. This works well on a LAN. Does anyone have any suggestions as to how I can access these files over the internet if they are held on a remote server? The key constraint is that we cannot afford the time needed to download the whole file before accessing individual images within it.

+1  A: 

You could put your big file behind an HTTP server like Apache, then have your client use HTTP Range requests to fetch just the chunk it needs.
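
A rough sketch of the client side in Python (the URL and offsets are hypothetical; in practice the index file would supply the real offset and length for each image, and the server must support Range requests):

    # Fetch one image from a large remote file using an HTTP Range request.
    import urllib.request

    def fetch_chunk(url, offset, length):
        req = urllib.request.Request(url)
        # Ask for only bytes offset .. offset+length-1 of the file
        req.add_header("Range", "bytes=%d-%d" % (offset, offset + length - 1))
        with urllib.request.urlopen(req) as resp:
            # 206 Partial Content means the server honoured the Range header
            if resp.status != 206:
                raise IOError("server ignored the Range header")
            return resp.read()

    image_bytes = fetch_chunk("http://example.com/big_video_file.dat",
                              offset=1048576, length=65536)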

An alternative would be to write a simple script in PHP, Perl or server-language-of-your-choice which takes the required offset and length as input and returns the chunk of data you need, again over HTTP.
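
A minimal sketch of that idea in Python (the file path and the offset/length parameter names are assumptions; a real deployment would need input validation and access control):

    # Tiny HTTP service returning an arbitrary byte range of a large file,
    # given ?offset=...&length=... query parameters (hypothetical names).
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    BIG_FILE = "/data/big_video_file.dat"  # hypothetical path

    class ChunkHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            offset = int(query["offset"][0])
            length = int(query["length"][0])
            with open(BIG_FILE, "rb") as f:
                f.seek(offset)          # jump straight to the requested image
                chunk = f.read(length)  # read only the bytes we need
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.send_header("Content-Length", str(len(chunk)))
            self.end_headers()
            self.wfile.write(chunk)

    HTTPServer(("", 8000), ChunkHandler).serve_forever()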

Paul Dixon
A: 

If I understand the question correctly, it depends entirely on the container format chosen for the images. If the container stores the information about each image just before or just after the image itself, rather than at the end of the file, you can extract images and their metadata from whatever portion has been downloaded so far and start working on them immediately. You will need to know the binary format used.
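
For illustration only, here is a sketch of that incremental approach in Python, assuming a hypothetical container where each image is stored as a 4-byte big-endian length prefix followed by the image data (the real format will differ):

    # Pull complete images out of a partially downloaded container.
    import struct

    def extract_images(buffer):
        """Return (images, leftover) for the bytes downloaded so far."""
        images, pos = [], 0
        while pos + 4 <= len(buffer):
            (size,) = struct.unpack_from(">I", buffer, pos)
            if pos + 4 + size > len(buffer):
                break  # record only partially downloaded; wait for more data
            images.append(buffer[pos + 4 : pos + 4 + size])
            pos += 4 + size
        return images, buffer[pos:]

Each time more bytes arrive from the network, append them to the leftover and call extract_images again.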

Alan Haggai Alavi
I guess I was hoping for some middleware solution that could intercept the fseek/fread calls (or whatever) generated locally and replicate them on the server, without the need to change the code running on the client, to which I do not have access.
A: 

FTP does let you use 'paged files', where sections of the file can be transferred independently:

To transmit files that are discontinuous, FTP defines a page structure. Files of this type are sometimes known as "random access files" or even as "holey files". In FTP, the sections of the file are called pages -- RFC 959

I've never used it myself though.
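
Page structure (STRU P) is rarely implemented by modern servers; a more widely supported way to get a partial transfer over FTP is the REST (restart) command, which sets the starting offset for the next retrieval. A sketch using Python's ftplib (host, credentials and filename are hypothetical):

    # Read `length` bytes starting at `offset` from a remote file over FTP
    # using REST (not FTP's page structure, which is rarely supported).
    from ftplib import FTP

    def ftp_fetch_chunk(host, user, password, filename, offset, length):
        ftp = FTP(host)
        ftp.login(user, password)
        # REST sets the starting offset for the following RETR
        conn = ftp.transfercmd("RETR " + filename, rest=offset)
        data = b""
        while len(data) < length:
            block = conn.recv(min(8192, length - len(data)))
            if not block:
                break
            data += block
        conn.close()  # closing the data connection early abandons the transfer
        ftp.close()   # drop the control connection without awaiting a reply
        return data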

Pete Kirkham