I'm building an application which (currently) consists of one web application (ASP.NET MVC) and two console applications.
The web application is just the user interface. The first console application is a service that runs at a specified interval and scrapes several web pages. The second console application is responsible for mailing out the information gathered by my "Downloader". The two console applications run on different computers. The UI just displays the results from the downloader.
The procedure is like this:
When a user adds a URI to scrape in the UI, the URI is saved into a SQL Server table. My "Downloader" then selects all the URIs, scrapes them, and inserts the results into a results table and a mailqueue table. My "Mail sender" then selects all rows from the mailqueue table and sends the information to the user.
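To make the question concrete, here is a minimal sketch of the table-based queue flow described above. It uses Python with an in-memory SQLite database purely for illustration (the real setup is SQL Server with C# apps), and the table/column names (`uris`, `results`, `mailqueue`) are assumptions that mirror the description:

```python
import sqlite3

# Illustrative sketch only: SQLite stands in for SQL Server, and the
# schema below is an assumption based on the workflow described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE uris (id INTEGER PRIMARY KEY, uri TEXT);
CREATE TABLE results (id INTEGER PRIMARY KEY, uri TEXT, content TEXT);
CREATE TABLE mailqueue (id INTEGER PRIMARY KEY, result_id INTEGER, sent INTEGER DEFAULT 0);
""")

# Step 1 -- UI: the user adds a URI to scrape.
conn.execute("INSERT INTO uris (uri) VALUES (?)", ("http://example.com",))

# Step 2 -- Downloader: select all URIs, scrape each, and insert into
# both the results table and the mailqueue table.
for (uri_id, uri) in conn.execute("SELECT id, uri FROM uris").fetchall():
    content = "<html>...</html>"  # placeholder for the actual scrape
    cur = conn.execute(
        "INSERT INTO results (uri, content) VALUES (?, ?)", (uri, content))
    conn.execute(
        "INSERT INTO mailqueue (result_id) VALUES (?)", (cur.lastrowid,))

# Step 3 -- Mail sender: pick up unsent queue rows and mark them sent.
for (queue_id, result_id) in conn.execute(
        "SELECT id, result_id FROM mailqueue WHERE sent = 0").fetchall():
    # the actual mail-sending call would go here
    conn.execute("UPDATE mailqueue SET sent = 1 WHERE id = ?", (queue_id,))

unsent = conn.execute(
    "SELECT COUNT(*) FROM mailqueue WHERE sent = 0").fetchone()[0]
print(unsent)  # 0 after the mail sender pass
```

The database is effectively acting as both the data store and the message queue between the three applications, which is the part I am asking about below.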
Is this the "optimal" solution, or can I improve it in some way? I find it pretty hard to maintain right now. Maybe I could use WCF to communicate directly between my applications?
The reason the console applications run on different computers is that the "Downloader" needs to be connected to a VPN, from which I cannot send mail.