We are writing an ASP.NET/C# application that will potentially be accessed by a number of companies (each having a separate login and separate data). We are thinking of having multiple SQL Server 2008 databases (in the same instance), one per company. The C# program that accesses the databases will be the same for everyone; it will build the appropriate connection string based on the database that customer should be accessing.
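For example, something along these lines (the server name and database naming scheme below are only placeholders for illustration):

using System.Data.SqlClient;

public static class CustomerDb
{
    // Hypothetical helper: builds the connection string for one customer's database.
    // Assumes databases are named "CustomerDb_<customerId>"; the server name is made up.
    public static string BuildConnectionString(int customerId)
    {
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "SQLSERVER01",
            InitialCatalog = "CustomerDb_" + customerId,
            IntegratedSecurity = true
        };
        return builder.ConnectionString;
    }
}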

How many such databases can be created in the single instance of the sql server before seeing any performance degradation due to:

  • Limit on connections, because each connection is created using a different connection string (we are not sure whether connections to different databases will be pooled).

  • Limit on the number of databases: is it limited only by hardware, or will SQL Server 2008 show degradation when the number of databases grows to, say, 100?

Anything else I might be missing?

Thanks for your time

+4  A: 
  • Max databases per SQL Server instance: 32,767
  • Max User connections: 32,767

(From here: Maximum Capacity Specifications for SQL Server)

Both are, in practice, limited by the amount of RAM the SQL Server machine has, long before they reach those maximum values.

Of the two, I suspect user connections are going to be the bigger problem if you have thousands of users (since connections to different databases will not share a single pool).

To find the configured maximum on your SQL Server instance:

-- Returns the maximum number of simultaneous user connections the instance allows
SELECT @@MAX_CONNECTIONS AS 'Max Connections'

Updated in response to poster's comments:
It's not really the number of databases that is the problem, but rather the number of frequently accessed pages in those databases. If all the 'hot' pages fit into memory (so very few physical reads occur), then all is good.

Mitch Wheat
Thanks for your comments. Any ideas how SQL Server performance degrades as the number of databases increases? Say we compare a single database where the data is partitioned using primary/foreign keys against a setup where each company is in a different database. Also, will SQL Server along with ADO.NET not do connection pooling if you are creating connections to different databases from the same ASP.NET process?
Samuel
@Samuel: suggest you add those questions to your original question. (1) It's not really the number of databases, but more the number of frequently accessed pages in those databases. If all the 'hot' pages fit into memory (and very few physical reads occur) then all is good. (2) No. A connection to a different DB is another user connection.
Mitch Wheat
+2  A: 

Having multiple databases for multiple clients could easily become a maintenance nightmare. If the application is the same, I am assuming that the DB design would be the same as well.

We did a similar project a couple of years back, but we decided to go with commingling the data in the same database, and then developed a robust security model to make sure that one customer does not end up seeing or modifying another customer's data.
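A rough sketch of that kind of filtering, assuming every table carries a CustomerId column (the table and column names here are only examples):

using System.Data;
using System.Data.SqlClient;

public static class OrderRepository
{
    // Every query is parameterised on the current customer's id, so one
    // customer can never see or modify another customer's rows.
    public static DataTable GetOrders(string connectionString, int customerId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT OrderId, OrderDate, Total FROM Orders WHERE CustomerId = @CustomerId",
            conn))
        {
            cmd.Parameters.AddWithValue("@CustomerId", customerId);
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(table);   // Fill opens and closes the connection itself
            }
            return table;
        }
    }
}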

I can proudly say that the project was a success, and today, as we speak, it holds data for 100+ different customers and is performing flawlessly.

Raj
That is true, but one thing I was thinking about is the flexibility of moving customers to a different instance if the server gets loaded... Also, I thought that having separate databases may make processing faster because there will not be too many records in the individual databases. We are talking about roughly 100,000 entries per company.
Samuel
+2  A: 

You should also keep in mind that connections will be pooled by connection string -- in your case, you will get separate pools for each Customer DB. That might not be bad if you have high traffic for each customer, but if you have low traffic to lots of different databases you will not get the full benefit of pooling.
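To illustrate (the connection strings below are made up), these two connections will come from two different pools, because pooling is keyed on the exact connection string:

using System.Data.SqlClient;

// Two connection strings that differ only in Initial Catalog still get two
// separate connection pools, even though they point at the same instance.
var companyA = "Data Source=SQLSERVER01;Initial Catalog=CompanyA;Integrated Security=True";
var companyB = "Data Source=SQLSERVER01;Initial Catalog=CompanyB;Integrated Security=True";

using (var connA = new SqlConnection(companyA))
using (var connB = new SqlConnection(companyB))
{
    connA.Open();   // draws from (or creates) the CompanyA pool
    connB.Open();   // draws from (or creates) the CompanyB pool
}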

JP Alioto