Hello,
I am creating a database for an application that logs data for several different nodes. The data logged looks like this:
- timestamp
- several integer values
- several floating point values
- maybe a string or two
Each node is polled separately.
I would be creating a log entry somewhere between every 10 seconds and every 10 minutes (the logging interval is variable), so at worst I would be looking at under 10k entries per day per node.
I am wondering how I should structure the database for the best data access and management. I imagine I would want to access at least 30 days of historical data, and I want to be prepared for hundreds of nodes.
Initially I thought of creating a single table with all the log data and linking each log entry to a node via a foreign key (a many-to-one relationship, since each node produces many entries), but I am afraid that the table will grow too big in this scenario.
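Roughly, the single-table layout I have in mind would look like this (a minimal sketch in SQLite just for illustration; the table and column names are placeholders, not my real schema):

```python
import sqlite3

# Sketch of the single-table design: one shared log table,
# each entry linked to its node by a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE nodes (
    node_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);

-- One row per log entry; node_id is the many-to-one link to nodes.
CREATE TABLE log_entries (
    entry_id    INTEGER PRIMARY KEY,
    node_id     INTEGER NOT NULL REFERENCES nodes(node_id),
    ts          TEXT NOT NULL,      -- timestamp
    int_val_1   INTEGER,            -- the integer values
    int_val_2   INTEGER,
    float_val_1 REAL,               -- the floating point values
    float_val_2 REAL,
    note        TEXT                -- the occasional string
);

-- Composite index so per-node, time-ranged queries stay fast
-- even as the shared table grows.
CREATE INDEX idx_entries_node_ts ON log_entries(node_id, ts);
""")

conn.execute("INSERT INTO nodes (node_id, name) VALUES (1, 'node-a')")
conn.execute(
    "INSERT INTO log_entries (node_id, ts, int_val_1, float_val_1) "
    "VALUES (1, datetime('now'), 42, 3.14)"
)

# The typical access pattern: last 30 days of history for one node.
rows = conn.execute(
    "SELECT ts, int_val_1, float_val_1 FROM log_entries "
    "WHERE node_id = ? AND ts >= datetime('now', '-30 days') "
    "ORDER BY ts",
    (1,),
).fetchall()
```

With the `(node_id, ts)` index, reading one node's recent history should not need to scan the whole table, which is my main worry about the single-table approach.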
Is creating a separate table for each node a viable option?
Any comments/suggestions would be helpful,
Thanks!