views: 54
answers: 2

I have designed a database. There are no indexed columns and no optimization code. I am positive I should index certain columns, since I search them a lot.

My question is: HOW do I test whether any part of my database will be slow? At the moment I am using SQLite, and I will be switching to either MS SQL or MySQL depending on my host provider. Will creating 100,000 records in each table be enough? Or will that always be fast in SQLite, so I need 1 million? Do I need 10 million before a database becomes slow?

Also, how do I time it? I am using C#, so should I use Stopwatch, or is there an ADO.NET/SQLite function I should use instead?

+2  A: 

It's really a question of monitoring things over time, as the optimal execution plans will change as you add more data. If you want to test, then 10 million rows should be adequate to make most performance issues visible, but you will also need to populate the test data with the same cardinality characteristics that the real data will have. SQL Server has a lot of performance-monitoring functionality built in (Dynamic Management Views, XEvents, SQL Trace/Profiler), but I'm not sure how much of it you would be able to access on a hosted solution. For monitoring performance outside the database, ADO.NET supports tracing, but I've never used it myself.
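
For example, a rough harness that both populates test rows and times a query with Stopwatch might look like the sketch below. It assumes the System.Data.SQLite ADO.NET provider and a made-up Users table; adjust the names to your own schema.

    using System;
    using System.Data;
    using System.Data.SQLite;   // NuGet package: System.Data.SQLite (assumed provider)
    using System.Diagnostics;

    class QueryTimer
    {
        static void Main()
        {
            using (var conn = new SQLiteConnection("Data Source=test.db"))
            {
                conn.Open();

                // Hypothetical table for illustration only.
                new SQLiteCommand(
                    "CREATE TABLE IF NOT EXISTS Users (Id INTEGER PRIMARY KEY, Name TEXT)",
                    conn).ExecuteNonQuery();

                // Bulk-insert test rows inside a single transaction;
                // per-row commits would make this enormously slower.
                using (var tx = conn.BeginTransaction())
                using (var insert = new SQLiteCommand(
                    "INSERT INTO Users (Name) VALUES (@name)", conn, tx))
                {
                    var p = insert.Parameters.Add("@name", DbType.String);
                    for (int i = 0; i < 1000000; i++)
                    {
                        p.Value = "user" + i;
                        insert.ExecuteNonQuery();
                    }
                    tx.Commit();
                }

                // Time a representative query with Stopwatch.
                var sw = Stopwatch.StartNew();
                using (var query = new SQLiteCommand(
                    "SELECT COUNT(*) FROM Users WHERE Name = @name", conn))
                {
                    query.Parameters.AddWithValue("@name", "user500000");
                    query.ExecuteScalar();
                }
                sw.Stop();
                Console.WriteLine("Query took " + sw.ElapsedMilliseconds + " ms");
            }
        }
    }

Run the timing part once with no index on Name and again after CREATE INDEX, and the difference should be obvious at this row count.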

Martin Smith
A: 

About your second question: SQLite is almost always faster than MS SQL, since in SQLite your database is in memory, whereas in MS SQL your database is on a hard disk.

Yassir
I don't think it's true that SQLite is in memory. I created a million-entry table and the RAM usage did not grow.
acidzombie24
It can be set up as an in-memory database: http://www.sqlite.org/inmemorydb.html
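
For instance, with the System.Data.SQLite provider (an assumption; other wrappers differ) the in-memory mode is chosen purely through the connection string, as a minimal sketch:

    using System.Data.SQLite;

    class InMemoryDemo
    {
        static void Main()
        {
            // ":memory:" selects a purely in-memory database; it vanishes
            // when the connection is closed. A file path gives the normal
            // on-disk behaviour the previous comment observed.
            using (var conn = new SQLiteConnection("Data Source=:memory:"))
            {
                conn.Open();
                new SQLiteCommand("CREATE TABLE t (x INTEGER)", conn).ExecuteNonQuery();
                // ... everything here lives in RAM only
            }
        }
    }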
Yassir