I would like to build a client-server application in .NET that would do the following:
- the server runs Linux
- the server hosts an SQL database (MySQL) containing document URLs

What we want:
- the server side would regularly crawl all the URLs and build a full-text index of them (roughly as in the sketch below)
- the client side would be able to query this index through a GUI
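For illustration, this is roughly the kind of crawl-and-index pass I imagine on the server. It is only a sketch: the tables `documents` (id, url) and `document_text` (doc_id with a unique key, content with a MySQL FULLTEXT index), the column names, and the connection string are all placeholders I made up, not an existing schema.

```csharp
// Sketch only: table/column names and the connection string are placeholders.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using MySql.Data.MySqlClient;

class Crawler
{
    static async Task Main()
    {
        using var http = new HttpClient();
        using var conn = new MySqlConnection("Server=localhost;Database=docs;Uid=crawler;Pwd=secret");
        await conn.OpenAsync();

        // Read the document URLs registered in the database.
        var docs = new List<(long Id, string Url)>();
        using (var cmd = new MySqlCommand("SELECT id, url FROM documents", conn))
        using (var reader = await cmd.ExecuteReaderAsync())
        {
            while (await reader.ReadAsync())
                docs.Add((reader.GetInt64(0), reader.GetString(1)));
        }

        foreach (var (id, url) in docs)
        {
            // Fetch each document and store its text; a MySQL FULLTEXT index on
            // document_text.content then provides the actual full-text search.
            string content = await http.GetStringAsync(url);
            using var insert = new MySqlCommand(
                "REPLACE INTO document_text (doc_id, content) VALUES (@id, @content)", conn);
            insert.Parameters.AddWithValue("@id", id);
            insert.Parameters.AddWithValue("@content", content);
            await insert.ExecuteNonQueryAsync();
        }
    }
}
```

Presumably something like this would run on a schedule (cron, a systemd timer, or a long-running worker), but that is exactly the part I have no experience with.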
The client application is written in .NET using C#. Besides searching the documents, it will do many other things that are not described here and that are well suited to the client side.
We would like to use C# for the server side as well, but we have no experience in this area. How are things like this usually done?
A clarifying question, based on some of the answers:
The part that is most unclear to me is how client-server communication is usually handled. Do the client and server typically use sockets directly, dealing with details like IP addresses, ports, or NAT traversal? Or are there common frameworks and patterns that make this transparent and make client-server messaging or remote procedure calls easy? Any examples or good starting points for this? Are there common techniques for handling the fact that a single server has to serve multiple clients at the same time?
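To make the question more concrete, below is roughly what I hope the "transparent" version looks like, using ASP.NET Core purely as an example of such a framework. The `/search` endpoint, the `document_text` table, and the connection string are my own placeholders; the point is that the framework would own the sockets, ports, and concurrent clients, and I would only write the handler and the call.

```csharp
// Server side (separate project): one HTTP endpoint over the full-text index.
// ASP.NET Core / Kestrel handles listening, connection management, and serving
// many clients concurrently; nothing here touches raw sockets.
using System.Collections.Generic;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using MySql.Data.MySqlClient;

var app = WebApplication.CreateBuilder(args).Build();

app.MapGet("/search", async (string q) =>
{
    using var conn = new MySqlConnection("Server=localhost;Database=docs;Uid=search;Pwd=secret");
    await conn.OpenAsync();

    // MySQL full-text query against the hypothetical document_text table.
    using var cmd = new MySqlCommand(
        "SELECT doc_id FROM document_text WHERE MATCH(content) AGAINST(@q)", conn);
    cmd.Parameters.AddWithValue("@q", q);

    var ids = new List<long>();
    using var reader = await cmd.ExecuteReaderAsync();
    while (await reader.ReadAsync())
        ids.Add(reader.GetInt64(0));

    return Results.Json(ids);   // returned as a JSON array of document ids
});

app.Run();   // listens on the configured port until stopped
```

On the client side, the existing C# GUI would then just call the endpoint without knowing anything about sockets or connection handling (the server address is again a placeholder):

```csharp
// Client side: plain HTTP call from the GUI.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

class SearchClient
{
    static readonly HttpClient Http =
        new HttpClient { BaseAddress = new Uri("http://myserver:5000/") };

    // Returns the ids of documents matching the query string.
    public static async Task<long[]> SearchAsync(string query) =>
        await Http.GetFromJsonAsync<long[]>("search?q=" + Uri.EscapeDataString(query))
            ?? Array.Empty<long>();
}
```

Is an HTTP service like this the usual approach, or are other mechanisms (raw sockets, some RPC framework) more common for this kind of application?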