views: 464

answers: 2

Using VC++ in Visual Studio 2003.

I'm trying to copy several image files (30 KB or so each) from another computer's shared folder to a local folder.

The problem is that a single transfer can involve 2000 or more files, and that takes its toll: the copy takes substantially longer to complete.

Is there any alternate method of copying files from another computer that could possibly speed up the copy?

Thanks in advance.

EDIT: Due to client request, it is not possible to change the code base dramatically. I hate to deviate from best practice because of non-technical issues, but is there a more subtle approach, such as another function call?

I know I'm asking for some magical voodoo; asking just in case somebody knows of such.

+1  A: 

Well, for a start, 2000 is not "several". If it's taking most of the time because you're sending lots of small files, then you need a solution that packages them at the source into a single file and unpackages them at the destination. This will require some code running at the source; you'll have to design your solution to allow for that, since I assume at the moment you're just copying from a network share.

If it's the network speed (unlikely), compressing them will help as well.

My own belief is that it's the number of files, basically all the repeated per-file startup costs of a copy. That's because 2000 × 30 KB is only about 60 MB, and on a 10 Mbit/s link the theoretical minimum time is roughly 48 seconds (60 MB × 8 ÷ 10 Mbit/s), call it a minute.

If your times are substantially above that, then I'd say I'm right.

A solution that uses 7-Zip or similar to compress them all into a single .7z file, transmit it, then unzip it at the other end sounds like what you're looking for.
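For instance, something along these lines (a rough sketch only: it assumes 7z.exe is on the PATH of both machines, and every path here is a made-up placeholder):

    // Source-machine side (run once, on the source, so the 2000
    // small reads stay local):
    //     7z a \\server\share\batch.7z C:\images\*
    //
    // Destination side: one big copy, then unpack locally.
    #include <windows.h>
    #include <cstdlib>

    int main()
    {
        // One large network copy instead of ~2000 small ones.
        ::CopyFileA("\\\\server\\share\\batch.7z", "C:\\local\\batch.7z", FALSE);

        // 7-Zip's "x" extracts with full paths; -o sets the output folder.
        std::system("7z x C:\\local\\batch.7z -oC:\\local\\images");
        return 0;
    }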

But measure, don't guess! Test it out to see if it improves performance. Then make a decision.
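A minimal timing harness for that comparison, assuming you wrap your current loop and the archive approach in functions (the two stubs here are stand-ins for your own code):

    #include <windows.h>
    #include <cstdio>

    void CopyPerFile()   { /* your existing per-file copy loop */ }
    void CopyAsArchive() { /* pack, copy, unpack as sketched above */ }

    // GetTickCount has millisecond resolution, which is plenty
    // for runs that take seconds or minutes.
    DWORD TimeIt(void (*fn)())
    {
        DWORD start = ::GetTickCount();
        fn();
        return ::GetTickCount() - start;
    }

    int main()
    {
        std::printf("per-file copy: %lu ms\n", TimeIt(CopyPerFile));
        std::printf("archive copy : %lu ms\n", TimeIt(CopyAsArchive));
        return 0;
    }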

paxdiablo
Thanks for the advice. It's not file size; the archiving method seems ideal.
Saifis
+2  A: 

A few things to try:

  • is copying files using the OS any faster?

  • if not, then there may be some inherent limitation in your network or the way it's set up (authentication troubles, hardware issues on the distant server, the server being too busy, a network card losing too many packets to collisions, a faulty switch, bad wiring...)

  • make some tests transferring files of various sizes.
    Small files are always slower to transfer because of per-file overhead: fetching the file's details, transferring the data, creating the directory entry, and so on.

  • if large files are fast, then your network is OK and you probably won't be able to improve the system much (the bottleneck is elsewhere).

  • Finally, from code, you could try to open each file, read it into a large buffer in one go, then write the buffer out to the local drive (see the sketch after this list). This may be faster because you bypass a lot of the checks the OS does internally.

  • You could even do this over a few threads, opening, loading, and writing files concurrently to speed things up a bit.
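Here is a hedged sketch of that single-buffer idea using raw Win32 calls (error handling trimmed to keep the shape visible; src would be the UNC path on the share, dst the local path):

    #include <windows.h>
    #include <vector>

    // Read the whole file in one go, then write it out in one go.
    // Fine for ~30 KB images; you'd chunk for very large files.
    bool BufferedCopy(const char* src, const char* dst)
    {
        HANDLE in = ::CreateFileA(src, GENERIC_READ, FILE_SHARE_READ, 0,
                                  OPEN_EXISTING, FILE_FLAG_SEQUENTIAL_SCAN, 0);
        if (in == INVALID_HANDLE_VALUE) return false;

        DWORD size = ::GetFileSize(in, 0);
        std::vector<char> buf(size ? size : 1);

        DWORD read = 0;
        ::ReadFile(in, &buf[0], size, &read, 0);      // one read per file
        ::CloseHandle(in);
        if (read != size) return false;

        HANDLE out = ::CreateFileA(dst, GENERIC_WRITE, 0, 0,
                                   CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, 0);
        if (out == INVALID_HANDLE_VALUE) return false;

        DWORD written = 0;
        ::WriteFile(out, &buf[0], size, &written, 0); // one write per file
        ::CloseHandle(out);
        return written == size;
    }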

There are a couple of references you can check on multi-threaded file copy; the core idea is to split the file list across a few worker threads.
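Roughly like this (a sketch only; VC++ 2003 predates std::thread, so it uses _beginthreadex, and the share and local paths are made-up placeholders):

    #include <windows.h>
    #include <process.h>
    #include <string>
    #include <vector>

    // Each worker copies its own slice of the file list.
    struct Slice { const std::vector<std::string>* files; size_t begin, end; };

    unsigned __stdcall Worker(void* p)
    {
        const Slice* s = static_cast<const Slice*>(p);
        for (size_t i = s->begin; i < s->end; ++i)
        {
            const std::string& name = (*s->files)[i];
            ::CopyFileA(("\\\\server\\share\\" + name).c_str(),
                        ("C:\\local\\"         + name).c_str(), FALSE);
        }
        return 0;
    }

    // Keep the thread count small; WaitForMultipleObjects caps at 64
    // handles, and a handful of threads is usually enough anyway.
    void ParallelCopy(const std::vector<std::string>& files, size_t threads)
    {
        if (files.empty() || threads == 0) return;
        std::vector<HANDLE> handles(threads);
        std::vector<Slice>  slices(threads);
        size_t chunk = (files.size() + threads - 1) / threads;

        for (size_t t = 0; t < threads; ++t)
        {
            slices[t].files = &files;
            slices[t].begin = t * chunk < files.size() ? t * chunk : files.size();
            slices[t].end   = (t + 1) * chunk < files.size() ? (t + 1) * chunk
                                                             : files.size();
            handles[t] = (HANDLE)_beginthreadex(0, 0, Worker, &slices[t], 0, 0);
        }
        ::WaitForMultipleObjects((DWORD)threads, &handles[0], TRUE, INFINITE);
        for (size_t t = 0; t < threads; ++t) ::CloseHandle(handles[t]);
    }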

If implementing this yourself in code is too much trouble, you could always simply execute a utility like McTool in the background of your application and let it do the work for you.

Renaud Bompuis
Yes, copying using the OS is faster. Thanks for the advice, will look into the multi-thread approach.
Saifis