I have found that reading 64 contiguous memory locations (elements) in a single request is the most efficient way of retrieving information over Modbus with the rmodbus library.
My goal is to log the data that is read into a simple database that can be mined to generate graphs and tables on a webpage, and to keep the most current value of each element in an instance variable.
The rmodbus library reads the data into an array in which each element's index represents the element's address. However, I would like to convert the index to octal, since that corresponds to the element addressing scheme the users are already familiar with and would be easier to reference in the interface.
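For example, a minimal sketch of that conversion, assuming slave is an rmodbus slave handle and the 64 registers start at address 0 (adjust for your actual connection and offsets):

values = slave.read_holding_registers(0, 64)   # raw 64-element array from rmodbus

values.each_with_index do |value, index|
  octal_address = index.to_s(8)                # 0..63 becomes "0".."77"
  # octal_address now matches the element addressing scheme the users know
end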
Edit (adding details and refinement): At this point in time, I am working with the following schema:
create_table "elements", :force => true do |t|
t.string "name"
t.integer "modbus_connection_id"
t.string "address"
t.string "eng_unit"
t.integer "base"
t.string "wiring"
t.text "note"
t.boolean "log"
t.datetime "created_at"
t.datetime "updated_at"
end
create_table "events", :force => true do |t|
t.integer "element_id"
t.string "value"
t.datetime "created_at"
t.datetime "updated_at"
end
create_table "modbus_connections", :force => true do |t|
t.string "name"
t.string "ip_address"
t.integer "port"
t.integer "client"
t.text "note"
t.datetime "created_at"
t.datetime "updated_at"
end
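For reference, the associations implied by that schema would look something like this (a minimal sketch; only the relationships, no validations or scaling logic):

class ModbusConnection < ActiveRecord::Base
  has_many :elements
end

class Element < ActiveRecord::Base
  belongs_to :modbus_connection
  has_many   :events
end

class Event < ActiveRecord::Base
  belongs_to :element
end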
The idea is that a background process will poll over Modbus and compare each read against the previous one, logging only Elements that have changed and that have been flagged for logging. Element values should probably be stored, both in the database and in the instance variables, already scaled so the frontend doesn't have to worry about it. The elements that aren't logged are still held in instance variables for semi-realtime, heads-up-display-style monitoring. The logged elements' Events will then be parsed into graphs and tables only when requested by the UI.
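A minimal sketch of that polling idea, assuming the associations above; the poller class, the read_registers call, and the scale helper are placeholders for illustration, not an existing API:

class ModbusPoller
  def initialize
    @current = {}   # octal address string => last scaled value, for the HUD
  end

  def poll(connection)
    raw = read_registers(connection)                          # 64-element array from an rmodbus read
    connection.elements.each do |element|
      value = scale(raw[element.address.to_i(8)], element)    # assumes address is stored as an octal string
      next if @current[element.address] == value               # unchanged, nothing to do

      @current[element.address] = value
      element.events.create!(:value => value.to_s) if element.log?
    end
  end
end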
First Question (finally!): Does it make more sense to leave the data in an array and apply a layer that handles converting the index (and, as it happens, the corresponding element value as well, which I convert with v.collect{|i| i.to_s(16)}), or is it better to transfer everything into a Hash where the index and value can live happily ever after in their most usable form?
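As a sketch of the two shapes being weighed here (octal keys assumed for the hash):

raw = slave.read_holding_registers(0, 64)

# Option 1: keep the array and convert only when presenting.
index = 42
value = raw[index]
label = index.to_s(8)          # "52", the address shown to the user

# Option 2: convert once into a hash keyed by octal address strings,
# so lookups, comparisons, and log entries already use the familiar form.
by_address = {}
raw.each_with_index { |v, i| by_address[i.to_s(8)] = v }
# by_address["52"] == raw[42]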
First Question edit: Given that my question has evolved toward logging only changes in the data into a simple SQLite db, and that I will need to compare elements between Modbus reads to determine which ones changed, does an array or a hash handle that comparison more efficiently? Should I even care?
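For 64 elements the comparison cost should be negligible either way; both diffs are one-liners (sketch):

# Hash form: previous_h and current_h are keyed by octal address.
changed = current_h.reject { |addr, value| previous_h[addr] == value }

# Array form: previous_a and current_a are the raw reads.
changed_indexes = (0...current_a.size).select { |i| current_a[i] != previous_a[i] }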
Second Question: In Rails, assuming a one-minute logging interval, would the approximately one thousand data points be better stored in independent fields, or should I leave them in 64-element chunks and parse the information on its way to the interface?
Second Question edit: Storing copious amounts of unchanged data in a one-minute 'row' of the database seems very flat. Plus, it doesn't allow for easy dynamic selection of which elements to log. It seems far more appropriate to make the "logger" event-based rather than interval-based, which pretty well means the First Question is the more important one here, since its answer will likely become the state-checking mechanism as well.
I'm guessing I'm unnecessarily reinventing a wheel with that revelation, as this is becoming a lot like existing 'loggers.' Reading around SO reveals that logging into a DB vs. the filesystem is an age-old question. Since the log itself is the foundation of the app, I am inclined to log into a DB, most likely SQLite given what I've read.
Second Question, edited again: Now it's a question of normalization; everything I'm reading suggests that "scalability" tends to require denormalization. My logged "Events" table will be relatively simple: timestamps, the value, and the element id. Should it also denormalize the most common attribute(s) from the Elements table, or is a join OK at this relatively small scale?
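At this scale the join itself is cheap; the usual Rails concern is the N+1 query when building graphs and tables, which eager loading avoids. A sketch in the Rails 2-style finder syntax that matches the schema above (Rails 3 would be Event.includes(:element).where(...)):

events = Event.find(:all,
                    :include    => :element,
                    :conditions => ["created_at > ?", 1.day.ago],
                    :order      => "created_at ASC")

events.each do |event|
  # event.element is already loaded, so no extra query per row
  point = [event.created_at, event.element.name, event.value]
end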
Anybody have any favorite Ruby logging frameworks/gems/bundles/plugins/whatever?