I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.

Should I code the sync logic as part of a file upload page protected in the admin section of the site, or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files into through FTP?

I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code, it can take longer than it would if I just had to modify an ASPX page.

Are there security benefits one way or the other? Performance benefits? The ASPX page wins the "ease of maintenance" category.

+1  A: 

I would create a Windows Service that uses a directory watcher to monitor a secure folder for new files. Since the files come from another system, the work is asynchronous by nature, and it performs better to have a separate Windows Service watching for updates as they arrive. The same service can parse the files and update the database for you.
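A minimal sketch of what that service could look like, assuming a .NET FileSystemWatcher on a hypothetical drop folder (D:\Drops) and a placeholder ProcessFile method standing in for your existing parse-and-update code:

    // Minimal sketch: a Windows Service that watches a drop folder for CSV files.
    // Requires a reference to System.ServiceProcess. The folder path and
    // ProcessFile are placeholders for your own parsing/database code.
    using System;
    using System.IO;
    using System.ServiceProcess;
    using System.Threading;

    public class FileSyncService : ServiceBase
    {
        private FileSystemWatcher _watcher;

        protected override void OnStart(string[] args)
        {
            _watcher = new FileSystemWatcher(@"D:\Drops", "*.csv");
            _watcher.Created += OnFileCreated;
            _watcher.EnableRaisingEvents = true;
        }

        protected override void OnStop()
        {
            if (_watcher != null)
            {
                _watcher.Dispose();
            }
        }

        private void OnFileCreated(object sender, FileSystemEventArgs e)
        {
            // The FTP transfer may still be writing when Created fires,
            // so retry until the file can be opened exclusively.
            for (int attempt = 0; attempt < 10; attempt++)
            {
                try
                {
                    using (File.Open(e.FullPath, FileMode.Open, FileAccess.Read, FileShare.None))
                    {
                    }
                    ProcessFile(e.FullPath); // your existing parse-and-update code goes here
                    return;
                }
                catch (IOException)
                {
                    Thread.Sleep(3000);
                }
            }
        }

        private void ProcessFile(string path)
        {
            // Placeholder: parse the comma-delimited file and update the database.
        }

        public static void Main()
        {
            ServiceBase.Run(new FileSyncService());
        }
    }

Note that Created only fires for files that appear while the service is running, so on startup you may also want to scan the folder once for anything that arrived while the service was stopped.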

Depending on who maintains the remote system, the easiest approach is to grant the service permission to access the files on a secure shared folder. Then you won't need to do anything manually.

IrishChieftain