I'm currently in the design stage of a reporting system (being developed in Ruby on Rails) that will create screen-comparison reports for a fleet of ATMs.

The data collected from each ATM is stored in XML files that contain the file name, date modified, size and checksum of each screen file. These are then compared against a master file to report extra, missing or mismatched files per ATM.
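For illustration, the comparison could look something like the sketch below in Ruby (the manifest structure and names are assumptions for the example, not the actual data format):

    # Hypothetical sketch: compare one ATM's manifest against the master.
    # Each manifest is assumed to be a hash of
    #   file name => { size:, checksum: }
    # parsed from the collected XML.
    def compare_manifests(master, atm)
      missing    = master.keys - atm.keys
      extra      = atm.keys - master.keys
      mismatched = (master.keys & atm.keys).select do |name|
        master[name][:checksum] != atm[name][:checksum] ||
          master[name][:size] != atm[name][:size]
      end
      { missing: missing, extra: extra, mismatched: mismatched }
    end

    master = { 'welcome.bmp' => { size: 1024, checksum: 'abc' } }
    atm    = { 'welcome.bmp' => { size: 1024, checksum: 'def' },
               'old.bmp'     => { size: 10,   checksum: 'xyz' } }
    compare_manifests(master, atm)
    # => { missing: [], extra: ["old.bmp"], mismatched: ["welcome.bmp"] }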

The question is how to store the generated reports. Each report will contain an index of all ATMs with totals, plus a separate detail/file-list report for each ATM (around 1000 of them). A report will be created each day, with a 60-day retention period.

I've considered using a document store like CouchDB or MongoDB and storing all of the information as a single document in the database.
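A rough sketch of what that single document might look like, using the MongoDB Ruby driver (the collection and field names here are assumptions, not a final schema):

    require 'date'
    require 'mongo'

    # Illustrative only: one document per day, holding the index totals
    # plus the per-ATM detail lists.
    client = Mongo::Client.new(['127.0.0.1:27017'], database: 'atm_reports')

    daily_report = {
      report_date: Date.today.to_s,
      totals:      { atms: 1000, missing: 42, extra: 7, mismatched: 3 },
      atms: [
        { atm_id:     'ATM-0001',
          missing:    ['welcome.bmp'],
          extra:      [],
          mismatched: ['menu.xml'] }
        # ... one entry per ATM
      ]
    }

    client[:daily_reports].insert_one(daily_report)

One thing to check with the single-document-per-day approach is MongoDB's 16 MB per-document limit; if the per-ATM file lists are large, one document per ATM per day may be the safer split.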

A rough estimate is that 60 days' worth of data will take up around 30 GB.
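For context, that estimate breaks down roughly as follows (a back-of-the-envelope check using the figures above; the per-ATM size is an inferred average, not a measured value):

    # All numbers assumed from the question.
    days     = 60
    atms     = 1000
    total_mb = 30 * 1024.0

    per_day_mb = total_mb / days            # ~512 MB of report data per day
    per_atm_kb = per_day_mb * 1024 / atms   # ~524 KB of detail per ATM per day
    puts "#{per_day_mb.round} MB/day, #{per_atm_kb.round} KB per ATM"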

How would you tackle this situation?

+1  A: 

"A rough estimate of 60 days worth of data will take up around 30 GB."

So? 30 GB is a size so small that it can be ignored. As an example, 1 TB is a size where time should be spent thinking about it.

30 GB is a size so small that any solution that works is fine. Implement it quickly and move on.

S.Lott