I am working on an application in which one module accesses a log (an ActiveRecord model) to increment various values at several points during execution. The problem is that the script is likely to take several seconds, and other instances of the same script are likely to run at the same time.
What I'm seeing is that while the script finishes correctly and all data is accounted for, the values are never correct. This is most likely because, by the time the script gets to the point where it updates the value, the value it read for the column (which it then increments) is already out of date.
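For example, I think the lost update happens roughly like this (using a made-up :counter column and record id 1 purely for illustration):

log_a = Log.find(1)   # instance A reads counter = 10
log_b = Log.find(1)   # instance B also reads counter = 10
log_a.update_attribute(:counter, log_a.counter + 1)   # A writes 11
log_b.update_attribute(:counter, log_b.counter + 1)   # B also writes 11, so A's increment is lost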
The values are correct if I force only one instance of the module to run at a time, but for performance reasons I can't keep doing that.
Currently, I've tried to solve the problem by querying the database for the record before each increment statement, inside a transaction, like this, where column is a symbol for the column and value is 1 or higher:
Log.transaction do
  log = Log.find(@log_id)
  log.update_attribute(column, log.send(column) + value)
end
However, it still won't give me accurate numbers. I'm thinking caching is involved, so I suppose I could try something like:
Log.transaction do
  Log.uncached do
    log = Log.find(@log_id)
    log.update_attribute(column, log.send(column) + value)
  end
end
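From skimming the ActiveRecord docs, it looks like there are at least two other approaches, though I'm not sure either is the right fit: update_counters, which pushes the increment into a single UPDATE statement, and pessimistic locking (via the lock scope, assuming a Rails version that has it), which should hold a row lock across the read and write. Roughly:

# Let the database do the arithmetic in one atomic UPDATE
Log.update_counters(@log_id, column => value)

# Or lock the row (SELECT ... FOR UPDATE) for the whole read-modify-write
Log.transaction do
  log = Log.lock.find(@log_id)
  log.update_attribute(column, log.send(column) + value)
end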
But surely I can't be the first to come across this issue, so is there a better implementation or solution for my problem?