I need to log all POST and GET requests on a web site in the database. There will be two tables:

  • requests, with a timestamp, user ID and requested URI
  • request parameters, with a name, value and request ID

I will use this data only for analytical reports once per month; there is no regular usage of it.

I get about one million requests a day, so the request parameters table will grow very large. Can MySQL handle such a large table without problems?
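
Roughly, this is the schema I have in mind (the engine choice and exact table/column names are not final):

    -- Sketch only; names and types are placeholders.
    CREATE TABLE requests (
        id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        created_at DATETIME        NOT NULL,
        user_id    INT UNSIGNED    NOT NULL,
        uri        VARCHAR(2048)   NOT NULL
    ) ENGINE=InnoDB;

    CREATE TABLE request_params (
        request_id BIGINT UNSIGNED NOT NULL,
        name       VARCHAR(255)    NOT NULL,
        value      TEXT            NOT NULL,
        FOREIGN KEY (request_id) REFERENCES requests (id)
    ) ENGINE=InnoDB;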

A: 

Yes, MySQL will handle millions of rows without trouble, but depending on what you want to do with the data later, and on the indexes on those tables, performance may not be very high.

PS. In my project we have a huge price list with a few million products in it, and it works without any problems.

nightcoder
+1  A: 

I'd avoid writing to the database on each request, or you'll be vulnerable to the slashdot effect. Parse your web logs during quiet times to update the database instead.
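
For example, a rough sketch of the off-peak load, assuming the logs have already been parsed into a tab-separated file (the file path and column layout here are made up):

    -- Bulk-load a pre-parsed, tab-separated log dump during quiet hours.
    -- Each line: timestamp, user id, requested URI.
    LOAD DATA INFILE '/var/log/app/requests-2009-06-01.tsv'
    INTO TABLE requests
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
    (created_at, user_id, uri);

One bulk load like this is far cheaper than a million single-row INSERTs spread across the day.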

SpliFF
Could you explain what the slashdot effect is?
Bogdan Gusiev
http://en.wikipedia.org/wiki/Slashdotted
Dan F
A: 

The usual solution to this type of problem is to write a program that parses the logs for the whole month. If you don't need sophisticated MySQL capabilities, you should consider this approach.

If you really need the database, then consider parsing the logs offline. Otherwise, if your database goes down, you will lose data. Logs are known to be pretty safe.

Table indexes are not free. The more indexes you have, the faster your queries run, but also the slower inserting data becomes.
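
For example, if the monthly report only ever filters on the timestamp, a single index may be the right trade-off (table and column names are assumed from the question):

    -- One index supports the monthly range scan; every additional
    -- index would slow down the ~1M inserts per day.
    CREATE INDEX idx_requests_created_at ON requests (created_at);

    SELECT uri, COUNT(*) AS hits
    FROM requests
    WHERE created_at >= '2009-06-01' AND created_at < '2009-07-01'
    GROUP BY uri;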

Reef