Hi all, I have been working on an ASP.NET web project for around six months, and the final product is about to go live. The project uses SQL Server as the database. We have done performance testing with some large volumes of data, and the results show that performance degrades once the data grows large, say 2 million rows (timeouts, delayed responses, etc.). We started with a fully normalized database, but we have since made it partially denormalized to reduce joins because of these performance issues. First of all, was that the right decision? And what are the possible solutions when the data size becomes very large, as the number of clients increases in the future?
I would like to add further:
- The 2 million rows are in the entity tables; the tables resolving the relations between them have many more rows than that.
- Performance degrades as both the data volume and the number of users increase.
- Denormalization was done after identifying the most heavily used queries (the first sketch below shows the kind of change we made).
- We also make heavy use of XML columns and XQuery (second sketch below). Could this be the cause?
- A bit off topic: some folks on my project say that dynamic SQL queries are faster than the stored procedure approach, and they have done some performance testing to prove their point. I think the opposite is true. Some of the heavily used queries are created dynamically, whereas most of the other queries are encapsulated in stored procedures (the third sketch below shows the two approaches being compared).
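To make the denormalization point concrete, here is a minimal sketch of the kind of change we made. The table and column names (`dbo.Orders`, `dbo.Customers`, `CustomerName`) are made up for illustration, not our real schema:

```sql
-- Normalized: the heavily used query needs a join on every call
-- (hypothetical tables, just to show the shape of the change)
SELECT o.OrderId, c.CustomerName
FROM   dbo.Orders o
JOIN   dbo.Customers c ON c.CustomerId = o.CustomerId;

-- Partially denormalized: CustomerName is copied into Orders,
-- so the heavily used query can skip the join entirely
SELECT OrderId, CustomerName
FROM   dbo.Orders;
```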
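For the XML point, this is a simplified example of the kind of XQuery usage I mean; again, the table, column, and element names are invented for illustration:

```sql
-- Pulling a scalar out of an XML column and filtering on its contents
-- (Orders, OrderXml, and the element/attribute names are hypothetical)
SELECT OrderId,
       OrderXml.value('(/Order/Customer/Name)[1]', 'nvarchar(100)') AS CustomerName
FROM   dbo.Orders
WHERE  OrderXml.exist('/Order/Items/Item[@Status="Pending"]') = 1;
```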
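And for the last point, the two approaches being compared look roughly like this (schema and names are again hypothetical). The dynamic SQL here is parameterized via sp_executesql, which as far as I know lets SQL Server cache and reuse the plan much like it does for a stored procedure:

```sql
-- Approach 1: dynamic SQL, parameterized through sp_executesql
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId';
EXEC sp_executesql @sql, N'@CustomerId int', @CustomerId = 42;
GO

-- Approach 2: the same query wrapped in a stored procedure
CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerId int
AS
    SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId;
GO

EXEC dbo.GetOrdersByCustomer @CustomerId = 42;
```

Is there any real performance difference between these two once the dynamic SQL is parameterized like this, or does the answer depend on something else entirely?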