Hi all,

I'm parsing an XML file with a few thousand elements in it for insertion into a SQL database. Everything works fine with SQLite, but Postgres dies on me with this error:

PGError: ERROR:  out of shared memory

I was monitoring locks, and it looks like there's a lock for each insert statement. Although I'm just looping over ActiveRecord "save" calls (each of which should run in its own transaction), these locks seem to hang around, so I end up with a few thousand open at a time.

As an experiment, I wrapped the save calls in a transaction:

Coupon.connection.transaction do  
  @coupon.save!
end

That appears to have resolved the lock issue. Still, it seems like something is wrong here. Anyone know what it might be? I'm using the "pg" driver.

Thanks, Tom

A: 

Your server may just be configured with a small amount of shared memory; http://www.postgresql.org/docs/8.1/static/kernel-resources.html covers all of this very nicely. Try raising that for starters. A few thousand elements defeating PostgreSQL does sound very odd, though: how large are the objects you're storing? What does your ActiveRecord code look like? There's not a lot of information to go on here.
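When the "out of shared memory" error is triggered by lock accumulation specifically, the setting that usually matters is max_locks_per_transaction: per the PostgreSQL docs, the shared lock table is sized for roughly max_locks_per_transaction * (max_connections + max_prepared_transactions) lock entries. An illustrative postgresql.conf tweak (values are examples, not recommendations):

```ini
# postgresql.conf
# The shared lock table holds roughly
#   max_locks_per_transaction * (max_connections + max_prepared_transactions)
# entries; raising this enlarges it (requires a server restart).
max_locks_per_transaction = 128   # default is 64
```

That said, fixing whatever is causing thousands of locks to stay open is the better cure; raising the limit only buys headroom.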

JamesHarrison
The elements are fairly small: about 20 attributes, only one of which is a text datatype. My AR code is also simple, but I'm not sure what you're looking for. Create a coupon, assign a few attributes, save. I have a few validators and no before/after save callbacks. I agree that it's very odd that a stock install of Postgres would need any scalability tweaking for something SQLite handles no sweat, and my code is super simple. That's why I'm thinking there's something AR/Rails/driver-related going on here.
tlianza