I have a web service that resides on serverA. The web service will be responsible for finding files of a certain type in a virtual directory on serverB and then returning the full URLs to those files.

I have the service working when the files are located on the same machine - that part is straightforward enough. My question is: what is the best way to find all files of a certain type (say *.xml) in all directories below a known virtual directory on a remote server?

So, for example, the web service is at http://ServerA/service.asmx and the virtual directory is located at http://serverB/virtualdirectory

In the code below, DirectoryInfo obviously will not take a path to the remote server - how do I access that directory so I can find the files it contains? And how do I then get the full URL to a file found on that remote server?

        DirectoryInfo updateDirectory = new DirectoryInfo(path);

        FileInfo[] files = 
             updateDirectory.GetFiles("*.xml", SearchOption.AllDirectories);

        foreach (FileInfo fileInfo in files)
        {
            // Get URL to the file
        }

I cannot have the files and the service on the same server - IT decision that is out of my hands.

Thanks!

+1  A: 

You need to make sure that directory browsing is enabled for the location on serverB - meaning that when you access the URL http://serverB/virtualdirectory you can see a listing of the files.

Once that is done, you can use System.Net.WebClient to fetch the listing from that URL (the DownloadData method), parse the response, and use WebClient again to get each file.
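
As an illustration only, here is a rough sketch of that idea. It assumes directory browsing is enabled on serverB and that the listing page exposes plain href="..." links (the exact markup varies by IIS version), and it only collects the .xml URLs rather than downloading the files:

    // A sketch only: read the directory-browsing page from serverB and pull out
    // the .xml links, assuming the listing exposes plain href="..." anchors.
    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Text;
    using System.Text.RegularExpressions;

    public class ListingReader
    {
        public static List<string> GetXmlUrls(string listingUrl)
        {
            // Make sure relative links resolve under the virtual directory itself.
            if (!listingUrl.EndsWith("/"))
                listingUrl += "/";

            List<string> urls = new List<string>();
            using (WebClient client = new WebClient())
            {
                // Download the listing HTML, not the files themselves.
                byte[] raw = client.DownloadData(listingUrl);
                string html = Encoding.UTF8.GetString(raw);

                // Collect anything that looks like a link to an .xml file.
                foreach (Match m in Regex.Matches(html, "href=\"([^\"]+\\.xml)\"", RegexOptions.IgnoreCase))
                {
                    urls.Add(new Uri(new Uri(listingUrl), m.Groups[1].Value).ToString());
                }
            }
            return urls;
        }
    }

Note that this only reads the top-level listing; subdirectories would have to be followed recursively, and each file could then be fetched with WebClient.DownloadFile if it were actually needed.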

kimtiede
Thanks for the suggestion - however I definitely do not want to download any data between the two servers as these files can be quite large. I only want to get a list of the URLs to the files of a certain type under the virtual directory and then return this to the caller of the service.
Peter Kelly
A: 

I would set up a virtual directory in IIS on serverA that points to the UNC path on serverB, e.g. http://serverA/virtualdirectory/ mapped to \\serverB\files\. Then, since you already know the URL root and can read the directory using GetFiles, you can build a valid URL for each file in your .asmx response.
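
A rough sketch of that approach, assuming the virtual directory on serverA is mapped to a share such as \\serverB\files\ (both names are placeholders) and the service account has read access to the share:

    // A sketch only: http://serverA/virtualdirectory/ is assumed to be mapped to the
    // share \\serverB\files\ and the service account is assumed to have read access.
    using System.IO;

    public class UncFileLister
    {
        const string UncRoot = @"\\serverB\files\";
        const string UrlRoot = "http://serverA/virtualdirectory/";

        public static string[] GetXmlUrls()
        {
            DirectoryInfo root = new DirectoryInfo(UncRoot);
            FileInfo[] files = root.GetFiles("*.xml", SearchOption.AllDirectories);

            string[] urls = new string[files.Length];
            for (int i = 0; i < files.Length; i++)
            {
                // Swap the UNC prefix for the URL prefix and normalise the separators.
                string relative = files[i].FullName.Substring(UncRoot.Length).Replace('\\', '/');
                urls[i] = UrlRoot + relative;
            }
            return urls;
        }
    }

The convenient part is that GetFiles works over a UNC path just like a local one, so only the prefix swap is needed to turn each physical path into a URL.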

Edit: Since you are using WinForms, I would suggest not exposing your files via a virtual directory URL at all. You could instead stream the files to your WinForms client through your .asmx service. There are a number of ways to do this. This project is a very complete starting point for an MTOM file-transfer application: http://www.codeproject.com/KB/XML/MTOMWebServices.aspx You can also just transfer files as byte streams: http://support.microsoft.com/kb/318425
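
As a very small sketch of the byte-stream variant (the method name and share path are assumptions, and returning a byte[] buffers the whole file in memory, so MTOM or chunked transfer is a better fit for large files):

    // A sketch only: stream a file back through the .asmx service itself so the client
    // never needs a URL on serverB. The share path and method name are assumptions, and
    // a real version should validate relativePath so callers cannot escape the share.
    using System.IO;
    using System.Web.Services;

    public class FileService : WebService
    {
        [WebMethod]
        public byte[] GetFile(string relativePath)
        {
            string fullPath = Path.Combine(@"\\serverB\files\", relativePath);
            return File.ReadAllBytes(fullPath);
        }
    }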

Hope that helps.

pcpimpster
Also, what is consuming the web service? A browser, or a client app like WinForms or WPF?
pcpimpster
A WinForms client app is consuming the service. I'll look into mapping a virtual directory to the UNC path, then.
Peter Kelly
Appreciate the extra Edit, good link and interesting approach. However, the component I'm using needs a full URI in order to create the downloading part.
Peter Kelly
+1  A: 

Is serverB used only as a repository? Another approach is to set up an application on serverB (a simple web service would do) that reads the directory contents and returns a list of URLs.

You could expand it later by adding web methods for file manipulation to that service.
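
A minimal sketch of what such a service hosted on serverB might look like; the physical root, URL root, and method name are all assumptions:

    // A sketch only: a small service hosted on serverB itself, so the search runs where
    // the files live. The physical root, URL root and method name are all assumptions.
    using System.IO;
    using System.Web.Services;

    public class RepositoryService : WebService
    {
        const string PhysicalRoot = @"D:\files\virtualdirectory\";
        const string UrlRoot = "http://serverB/virtualdirectory/";

        [WebMethod]
        public string[] FindFiles(string pattern)
        {
            string[] paths = Directory.GetFiles(PhysicalRoot, pattern, SearchOption.AllDirectories);

            string[] urls = new string[paths.Length];
            for (int i = 0; i < paths.Length; i++)
            {
                // Map each physical path to its URL under the virtual directory.
                urls[i] = UrlRoot + paths[i].Substring(PhysicalRoot.Length).Replace('\\', '/');
            }
            return urls;
        }
    }

The existing service on serverA (or the WinForms client directly) would then call FindFiles("*.xml") and simply pass the returned URLs along.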

Erup
This is my accepted answer because I think a) the responsibility for searching the directories should probably sit on the server where the files are, and b) there doesn't seem to be an easy way to remotely search a virtual directory. If I add this second service, then what's the point of the first service? That's the downside, but at least there is a single point of contact for logging etc., and the actual file server could change... Thanks
Peter Kelly