I'm not a web developer, but a senior developer has set me the task of developing an internal web site that hosts licensed software for download by branch offices.
The parameters for the project are:
1) It must have a web interface to display meta-information about the software, use AD authentication, and log downloads. For these reasons, FTP is not a solution.
2) A software package can contain between 5 and 5,000 files and range from a few MB to 4 GB. Zipping the directories, whether in advance or on the fly, is not an option.
With this in mind, the solution I have designed (and mostly coded) works as follows: when a user visits the site and requests a software package, the website (written in classic ASP) passes parameters to a VBScript that uses Robocopy to copy the package to the branch office's file server (rough sketch below). The problem with this method is that, apart from being very dirty, the user gets no progress bar or any other feedback that the Robocopy is actually running, and if the package being copied is 4 GB, the script could be running for quite some time.
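For concreteness, here's a stripped-down sketch of what the page does now. The package names, paths, share names, and query-string parameters are invented for illustration; the real code takes them as parameters:

    <%
    ' Minimal sketch of the current approach. Paths and the branch
    ' share ("Software$") are made up for illustration.
    Dim packageName, branchServer, source, dest, shell

    packageName  = Request.QueryString("package")   ' e.g. "AcmeCAD"
    branchServer = Request.QueryString("branch")    ' e.g. "\\BRANCH01"

    source = "D:\Packages\" & packageName
    dest   = branchServer & "\Software$\" & packageName

    Set shell = Server.CreateObject("WScript.Shell")

    ' /E copies subdirectories (including empty ones); /R and /W cap retries.
    ' The third argument (True) blocks until Robocopy exits, so for a 4 GB
    ' package the user just watches the browser spin until it finishes.
    shell.Run "robocopy """ & source & """ """ & dest & """ /E /R:2 /W:5", 0, True

    Set shell = Nothing
    Response.Write "Copy complete."
    %>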
What I'm wondering is: does anybody have a better conceptual solution for the project, or should I fold and insist we simply use an FTP server?
I know a question like this isn't quite what Stack was designed for, and if I'm out of line asking it here, please tell me to go away and close it, but I'm in a rut, and the collective knowledge of Stack seems to be my only light in the darkness.
Cheers.