Hi all. Basically I have an XML feed from an offsite server.
The feed takes one parameter, ?value=N, where N can only be between 1 and 30.
Whatever value I pick, there will always be 4000 rows returned from the XML feed. My script will call the feed once per value, 30 times in total, once a day, so that's 120,000 rows. I will be doing quite complicated queries on these rows, but the main thing is I will always filter by value first (SELECT * ... WHERE value = N). That will ALWAYS be used.
Now, is it better to have one table storing all 120k rows, or 30 tables storing 4k rows each?
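For context, my rough idea for the single-table version looks something like this (table and column names are just placeholders, not my real schema):

```sql
-- One table for all 30 feeds. Every query filters on `value` first,
-- so `value` leads the primary key; MySQL can use the leftmost
-- prefix of (value, position) for WHERE value = N queries.
CREATE TABLE leaderboard (
    value    TINYINT UNSIGNED NOT NULL,  -- feed number, 1-30
    position INT UNSIGNED     NOT NULL,  -- rank within that leaderboard
    name     VARCHAR(255)     NOT NULL,
    PRIMARY KEY (value, position)
);
```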
EDIT: the SQL database in question will be MySQL
EDIT: Just to make it a bit clearer: the data will be UPDATED every day, so the old tables will be overwritten. I do not want any archive solution, just the best way to store the data with as few performance bottlenecks as possible. The database results, once output, will be cached and also refreshed daily.
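If I did go with a full daily overwrite, one pattern I'm considering (assuming the single-table layout; `leaderboard` is a placeholder name) is loading into a staging table and swapping it in with MySQL's atomic RENAME TABLE, so readers never see a half-loaded table:

```sql
-- Build the fresh copy off to the side...
CREATE TABLE leaderboard_new LIKE leaderboard;
-- ...INSERT the 120k freshly fetched rows into leaderboard_new here...

-- ...then swap it in atomically and drop the stale copy.
RENAME TABLE leaderboard     TO leaderboard_old,
             leaderboard_new TO leaderboard;
DROP TABLE leaderboard_old;
```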
EDIT: I guess I was being too vague for my own good :( Basically the feeds are leaderboards; each value is a different leaderboard location.
The rows will only be updated if a leaderboard position changes, and there will always be exactly 120k rows, no more, no less.
Lets say:
- Blue
- Green
- Red
That's the current leaderboard, and on the next update the feed returns:
- Blue
- Red
- Green
Only rows 2 and 3 will change. That is my plan anyway :)
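So the daily update would only touch the changed positions. In MySQL that could be an upsert per changed row, something like this (assuming `(value, position)` is the unique key and the column names are placeholders):

```sql
-- Row 2 of leaderboard 1 changed hands from Green to Red:
-- insert it if the position doesn't exist yet, otherwise
-- update the existing row in place.
INSERT INTO leaderboard (value, position, name)
VALUES (1, 2, 'Red')
ON DUPLICATE KEY UPDATE name = VALUES(name);
```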
ANOTHER EDIT >.<: The rows will contain at most 12 columns and less than 1 KB per row. And the update will only happen ONCE a day, because the server the feeds come from is slow and it takes my server 80 minutes to fetch all the feed values from it.