I have an ASP.NET site I've inherited where all page views are logged to a database. Currently we store sessionguid, page, querystring and timestamp. We generally keep it at about 100M rows before purging. Certain users have access to pageview reports and it often times out. Other than doing an index analysis, any other tips on how to best handle this? Should I normalize out the page and/or querystring fields?
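For reference, the log table currently looks roughly like the first sketch below (simplified; actual names differ), and normalizing the page field out would mean something like the second:

    -- Current shape: the full page string repeats on every one of the ~100M rows.
    CREATE TABLE PageViews (
        SessionGuid UNIQUEIDENTIFIER NOT NULL,
        Page        NVARCHAR(400)    NOT NULL,
        QueryString NVARCHAR(1000)   NULL,
        ViewedAt    DATETIME         NOT NULL
    );

    -- Normalized: a small Pages lookup plus a narrower fact table keyed by INT.
    CREATE TABLE Pages (
        PageId INT IDENTITY PRIMARY KEY,
        Url    NVARCHAR(400) NOT NULL UNIQUE
    );

    CREATE TABLE PageViewsNarrow (
        SessionGuid UNIQUEIDENTIFIER NOT NULL,
        PageId      INT NOT NULL REFERENCES Pages(PageId),
        QueryString NVARCHAR(1000)   NULL,
        ViewedAt    DATETIME         NOT NULL
    );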
Have a separate table for the pages in your system, plus a cross-reference table between the pages table and the users table. Then perform only updates: increment the indicators you need for the reports instead of inserting a new row per view. Are users registered on your site?
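A minimal sketch of that idea, assuming SQL Server and the hypothetical Pages table from the question; the table stays at roughly (pages x users) rows instead of growing with every view:

    CREATE TABLE PageUserCounts (
        PageId     INT      NOT NULL REFERENCES Pages(PageId),
        UserId     INT      NOT NULL,
        ViewCount  INT      NOT NULL,
        LastViewed DATETIME NOT NULL,
        PRIMARY KEY (PageId, UserId)
    );

    -- On each view: update in place; insert only the first time.
    -- (Ignores concurrency; MERGE or a lock hint would make it atomic.)
    UPDATE PageUserCounts
       SET ViewCount = ViewCount + 1, LastViewed = GETDATE()
     WHERE PageId = @PageId AND UserId = @UserId;

    IF @@ROWCOUNT = 0
        INSERT INTO PageUserCounts (PageId, UserId, ViewCount, LastViewed)
        VALUES (@PageId, @UserId, 1, GETDATE());

The trade-off is that you lose the per-view detail (timestamps, query strings), so this only works if the reports are really about aggregate counts.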
Did you try using the server log files (Apache/IIS) for site statistics? You can analyze the access logs to extract total request and page counts, bandwidth, popular pages, user paths, browser usage, etc. There is a lot of information in the server log files.
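For IIS, Microsoft's free Log Parser tool runs SQL-style queries directly against the W3C log files, so this reporting may not need your database at all. A sketch of a top-pages report (the ex*.log file pattern depends on your IIS logging settings):

    LogParser -i:IISW3C "SELECT TOP 20 cs-uri-stem AS Page, COUNT(*) AS Hits FROM ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC"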
Also take a look at Ganglia; maybe you can externalize this whole operation from your core system.
100M rows is not a realistic amount to do real-time reporting on. You're either going to have to narrow down your dataset considerably, or run periodic jobs that roll the raw data up into more manageable reporting tables.
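For example, a nightly job could aggregate the raw log into a per-page, per-day summary that the reports query instead; a sketch, assuming SQL Server 2008+ (for the DATE type) and the hypothetical names used above:

    CREATE TABLE PageViewsDaily (
        ViewDate DATE          NOT NULL,
        Page     NVARCHAR(400) NOT NULL,
        Views    INT           NOT NULL,
        PRIMARY KEY (ViewDate, Page)
    );

    -- Run once per day: roll up yesterday's raw rows into the summary.
    INSERT INTO PageViewsDaily (ViewDate, Page, Views)
    SELECT CAST(ViewedAt AS DATE), Page, COUNT(*)
    FROM PageViews
    WHERE ViewedAt >= @DayStart AND ViewedAt < @DayEnd
    GROUP BY CAST(ViewedAt AS DATE), Page;

Reports against the summary table touch thousands of rows instead of 100M, and you can purge the raw table far more aggressively.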