views: 82

answers: 4

I'm not a web developer, but a senior developer has tasked me with developing an internal web site that hosts licensed software for download by branch offices.

The parameters for the project are:

1) It must have a web interface to display meta information about the software, use AD authentication, and log downloads. For these reasons, FTP is not a solution.

2) The software to download can contain between 5 and 5,000 files and range from a few MB to 4 GB. Zipping the directories, whether in advance or on the fly, is not an option.

With this in mind, the solution I have designed (and mostly coded) does the following. When a user visits the site and requests a software package, the website (written in ASP) passes parameters to a VBScript that uses Robocopy to copy the software package to the branch office's file server. The problem with this method is that, apart from being very dirty, the user doesn't get a progress bar or any feedback that the Robocopy is working, and if the software package being copied is 4 GB, the script could be running for quite some time.
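For concreteness, the hand-off to Robocopy might look something like the sketch below (Python is used purely for illustration of the command being built; the UNC paths are made up, but /E, /R, /W, /NP, and /LOG+ are standard Robocopy switches, and the log file is what would let the site audit downloads):

```python
# Hypothetical sketch of the Robocopy invocation the VBScript layer would run.
# /E copies subdirectories (including empty ones), /R and /W limit retry churn,
# /NP suppresses per-file percentages for a cleaner log, and /LOG+ appends to
# a log file so each download can be audited.

def build_robocopy_cmd(source_dir, dest_dir, log_path):
    """Return the argument list for a logged, non-interactive Robocopy run."""
    return [
        "robocopy",
        source_dir,
        dest_dir,
        "/E",              # include subdirectories
        "/R:2",            # retry each failed file twice
        "/W:5",            # wait 5 seconds between retries
        "/NP",             # no per-file progress output
        f"/LOG+:{log_path}",
    ]

cmd = build_robocopy_cmd(r"\\depot\packages\AppX",
                         r"\\branch01\software\AppX",
                         r"\\depot\logs\AppX.log")
print(" ".join(cmd))
```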

What I'm wondering is, does anybody have a better conceptual solution for the project, or should I fold on it and insist we simply use an FTP server?

I know a question like this isn't quite what Stack was designed for, and if I'm out of line asking it here, please tell me to go away and close it, but I'm in a rut, and the collective knowledge of Stack seems to be my only light in the darkness.

Cheers.

A: 

One thing you should look into is WebDAV. I haven't used it personally, but it is a widely used technology for sharing files using a web server.

I found this introduction to WebDAV with a simple Google search.

The second half of your question concerns using Active Directory for user authentication. To integrate the two, you should look for an LDAP module for Apache.

Finally, if you're looking to add version control, you might want to consider integrating a source control repository like Subversion.

*Warning: although this is a common problem with a common solution, from what I've read it doesn't look like a simple configuration.

Add all three to Google and you get 99k pages that could help you.

Kevin Williams
Thanks Kevin, although perhaps I haven't explained myself well enough. The users of the site will be non-technical, and the main purpose of the site is to have a user-friendly interface for them to search for and retrieve the software. So I don't think WebDAV is quite the solution I'm looking for.
The KZA
Also, AD authentication isn't really a problem; I already have it in place. And I'm using IIS, not Apache (should've mentioned that, sorry).
The KZA
+1  A: 

Well, since you're already underway, I'll make suggestions on that basis:

I'm unsure what you mean by dirty. If you mean unstable, then it's something you need to address; otherwise, if it works 99.9% of the time, it sounds like you're three-quarters of the way home and your only problem is the user experience. I doubt whether in this instance the user will care what's under the hood so long as it's simple and reliable.

Robocopy does have a continuous progress indicator, albeit on the command line. Couldn't you redirect the command-line output from Robocopy, then parse and format it as necessary to update the user on the status of their download?

VB Script Redirect Command Line Output
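As a rough sketch of that idea (Python purely for illustration; the regex and the `run_with_progress` helper are assumptions, not a tested recipe, and note that Robocopy emits its percentages with carriage returns rather than newlines, so a real implementation may need unbuffered reads), the output could be parsed for the per-file percentages:

```python
import re
import subprocess

# Robocopy's per-file progress lines look like "  0%", " 12.5%", "100%".
PERCENT_RE = re.compile(r"(\d+(?:\.\d+)?)%")

def parse_percent(line):
    """Extract a Robocopy progress percentage from one output line, or None."""
    m = PERCENT_RE.search(line)
    return float(m.group(1)) if m else None

def run_with_progress(cmd, report):
    """Run a copy command, invoking report(pct) for each progress line seen."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        pct = parse_percent(line)
        if pct is not None:
            report(pct)
    return proc.wait()

# Demonstration on sample lines rather than a live Robocopy run:
print(parse_percent(" 12.5%"))
print(parse_percent("New File    4096    setup.exe"))
```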

Otherwise, I think your best bet is to re-examine constraint #2. Why can't the files be compressed? I'd chunk them as they are added to the download repository. Make each chunk, say, 50 MB in size and use the chunks as your progress indicators.

rism
I only say dirty because it's using a web site that pulls values from SQL to launch a VBScript that starts Robocopy. A lot of room for error, but it does seem to work, and I agree, I haven't really cared since the user can't see under the hood.
The KZA
I'll look at your link, as that sounds like it could solve my problems! Also, the files can't be compressed because they get updated often and on the fly. Having to build a new zip adds a lot of complexity for the guys updating the software. I had suggested scripting them to be updated once a day...
The KZA
... but was told that wasn't good enough as it almost doubled our storage requirement.
The KZA
Oh, OK. If you're almost doubling even with compression, then it sounds like it's already heavily compressed. Although I would do the math on the cost of the storage vs. the cost of transmitting "un"compressed data, bearing in mind the files will be stored once but transmitted multiple times...
rism
... if you can make a 15-20% saving on data transferred via compression, then how many downloads does it take before that saving exceeds the cost of the additional storage? Just a thought; either way, all the best.
rism
A: 

There are entire technology stacks associated with deploying software to remote machines. One that is scalable, robust, and feature-rich is Microsoft Systems Management Server (SMS). That is likely way beyond the scope of what you are intending, but consider making the client machine participate more actively in the interaction by creating a download manager that supports your web front end. CodeProject has some interesting projects you might gain some insight from. In particular, MyDownloader looks as though it could offer you some alternatives to your VBScript.

ojblass
We're making source files available, not actually installing the software. I'll have a look at MyDownloader, thanks very much.
The KZA
A: 

Since you are just making files available, please consider exposing a source control system in a read-only fashion to your users. Git and Subversion have numerous customizable web front ends.

ojblass