Hi,
I'm developing a statistics module for my website that will help me measure conversion rates and other interesting data.
The mechanism I use is to insert a row into a statistics table each time a user enters a specific zone of the site (I avoid duplicate records with the help of cookies).
For example, I have the following zones:
- Website - a general zone used to count unique users, as I've stopped trusting Google Analytics lately.
- Category - self-descriptive.
- Minisite - self-descriptive.
- Product Image - whenever a user views a product and the lead-submission form.
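To make the setup concrete, here is a minimal sketch of that logging mechanism. It uses Python with SQLite as a stand-in for the real ASP.NET/SQL Server code, and the table and column names (`zone_hits`, `visitor_id`, etc.) are assumptions, not the actual schema; the `INSERT OR IGNORE` on a composite key plays the role of the cookie-based duplicate check:

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE zone_hits (
        visitor_id TEXT NOT NULL,   -- value stored in the visitor's cookie
        zone       TEXT NOT NULL,   -- e.g. 'Website', 'Category', 'Minisite'
        hit_date   TEXT NOT NULL,   -- day granularity for deduplication
        PRIMARY KEY (visitor_id, zone, hit_date)
    )
""")

def record_hit(visitor_id, zone, day):
    # One row per visitor/zone/day; duplicates are silently dropped,
    # mimicking the cookie check described above.
    conn.execute(
        "INSERT OR IGNORE INTO zone_hits VALUES (?, ?, ?)",
        (visitor_id, zone, day),
    )

vid = str(uuid.uuid4())
record_hit(vid, "Website", "2010-05-01")
record_hit(vid, "Website", "2010-05-01")   # duplicate visit, ignored
record_hit(vid, "Category", "2010-05-01")

uniques = conn.execute(
    "SELECT COUNT(*) FROM zone_hits WHERE zone = 'Website'"
).fetchone()[0]
print(uniques)  # 1
```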
The problem is that after a month, my statistics table is packed with rows, and the ASP.NET pages I wrote to parse the data load really slowly.
I thought about writing a service that would pre-process the data somehow, but I can't see any way to do that without losing flexibility.
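One common shape for such a service is a nightly rollup: collapse the raw rows for each closed day into a small per-zone summary table, and have the report pages query the summary instead of the raw data. Below is a sketch of that idea, again using Python/SQLite as a stand-in; the `daily_summary` table and the `rollup` function are illustrative names, not part of any existing schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE zone_hits (visitor_id TEXT, zone TEXT, hit_date TEXT);
    CREATE TABLE daily_summary (
        zone     TEXT,
        hit_date TEXT,
        uniques  INTEGER,
        PRIMARY KEY (zone, hit_date)
    );
""")

# Raw rows, as written by the live site.
rows = [
    ("u1", "Website",  "2010-05-01"),
    ("u2", "Website",  "2010-05-01"),
    ("u1", "Category", "2010-05-01"),
    ("u1", "Website",  "2010-05-02"),
]
conn.executemany("INSERT INTO zone_hits VALUES (?, ?, ?)", rows)

def rollup(day):
    # Collapse one day's raw rows into per-zone unique-visitor counts.
    # Once a day is rolled up, its raw rows can be archived or deleted,
    # keeping the hot table small.
    conn.execute("""
        INSERT OR REPLACE INTO daily_summary
        SELECT zone, hit_date, COUNT(DISTINCT visitor_id)
        FROM zone_hits
        WHERE hit_date = ?
        GROUP BY zone
    """, (day,))

rollup("2010-05-01")
website_uniques = conn.execute(
    "SELECT uniques FROM daily_summary"
    " WHERE zone = 'Website' AND hit_date = '2010-05-01'"
).fetchone()[0]
print(website_uniques)  # 2
```

The flexibility trade-off is real: a summary table can only answer the questions you aggregated for, so raw rows are usually kept (archived or partitioned) for ad-hoc queries while the dashboards read from the rollup.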
My questions:
- How do large-scale analytics applications like Google Analytics load their data so fast?
- What is the best way for me to do it?
- Is my DB design wrong? Should I store the data in only one table?
Thanks to anyone who helps,
Eytan.