I am working on an application in which one module accesses a log (an ActiveRecord model) to increment various values at several points during execution. The problem is that the script is likely to take several seconds, and other instances of the same script are likely to run at the same time.

What I'm seeing is that while the script finishes correctly and all data is accounted for, the values are never correct. This is likely because, by the time the script gets to the point where it updates a value, the value it originally read for the column (which is being incremented) is already out of date.

The values are correct if I force only one instance of the module to run at a time, but for performance reasons I can't keep doing that.
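For illustration, the lost-update cycle described above can be reproduced outside of ActiveRecord with a plain in-memory counter (the thread counts and variable names below are arbitrary). Each worker reads the value, computes the increment, and writes it back; without the lock, two workers that read the same value before either writes will lose one increment:

```ruby
counter = 0
lock = Mutex.new

threads = 10.times.map do
  Thread.new do
    100.times do
      lock.synchronize do
        current = counter       # read the current value
        counter = current + 1   # write back; safe only because of the lock
      end
    end
  end
end
threads.each(&:join)

puts counter  # => 1000 with the lock; typically fewer without it
```

Removing the `lock.synchronize` block makes the read and the write separable, which is the same race the database rows are experiencing.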

Currently, I've tried to solve the problem by re-querying the database for the record before each increment, inside a transaction, like this (where `column` is the column name as a symbol and `value` is 1 or higher):

Log.transaction do
  log = Log.find(@log_id)
  log.update_attribute(column, log.send(column) + value)
end

However, it still won't give me accurate numbers. I'm thinking caching is involved, and I suppose I could try something like:

Log.transaction do
  uncached do
    log = Log.find(@log_id)
    log.update_attribute(column, log.send(column) + value)
  end
end

But surely I can't be the first to come across this issue, so I am wondering if there is a better implementation/solution for my problem?

+2  A: 

ActiveRecord provides an update_counters method. You can change your code as follows:

Log.update_counters @log_id, column => value
KandadaBoggu
That looks like an excellent option. Can't believe I missed that one. Will test and see how it works.
vonconrad
You still have a race condition if two scripts call `update_counters` in overlapping transactions; one will overwrite the other's update. If you want true sequentiality, you have to lock the row (`Model.find(..., :lock => true)`), lock the table, or use an external lock, e.g. a filesystem or IPC lock.
vladr
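One way to get the filesystem lock mentioned above is Ruby's `File#flock`. A sketch, with an illustrative counter file and helper name (not from the question): the read-increment-write cycle is serialized by holding an exclusive lock on the file for its whole duration, which also works across processes.

```ruby
require "tempfile"

# Illustrative counter file; any path shared by the competing processes works.
file = Tempfile.new("log_counter")
file.write("0")
file.flush

def locked_increment(path)
  File.open(path, File::RDWR) do |f|
    f.flock(File::LOCK_EX)       # blocks until no other holder remains
    value = f.read.to_i          # read the current count
    f.rewind
    f.write((value + 1).to_s)    # write back the incremented count
    f.truncate(f.pos)
  end                            # closing the file releases the lock
end

5.times { locked_increment(file.path) }
puts File.read(file.path)  # => 5
```

This serializes every increment, so it trades throughput for correctness, the same trade-off as row or table locking in the database.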