So I have, I think, around 36,000 records (to be safe), a number I wouldn't think is too large for a modern SQL database like MySQL. Each record has just two attributes.

So what I do is collect them into one single INSERT statement:

sql = "INSERT INTO tasks (attrib_a, attrib_b) VALUES (c1,d1),(c2,d2),(c3,d3)...(c36000,d36000);"

ActiveRecord::Base.connection.execute sql

from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract_adapter.rb:219:in `log'
from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:323:in `execute_without_analyzer'
from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
from C:/Ruby/lib/ruby/1.8/benchmark.rb:308:in `realtime'
from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
from (irb):53
from C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/vendor/tzinfo-0.3.12/tzinfo/time_or_datetime.rb:242

I don't know if the above info is enough; please do ask for anything I didn't provide here. Any idea what this is about?

THANK YOU!!!!

A: 

Well, without more information, I would hazard a guess that you exceeded the max_allowed_packet limit on the MySQL server.
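
If that is what's happening, you can check the current limit from the MySQL console and raise it in my.ini (on Windows) under the [mysqld] section, then restart the server. The 64M below is just an example value, not a recommendation:

SHOW VARIABLES LIKE 'max_allowed_packet';

# my.ini, [mysqld] section
max_allowed_packet = 64M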

webdestroya
Well, the thing is, there isn't much more I can think of that's relevant. I am using a Windows machine: Windows 7 64-bit, 4 GB RAM, MySQL 5.0.83, Ruby 1.8.7, Rails 2.3.5. I mean, I actually made a dummy app with one model, Task, with a name and a priority attribute, tried the execute above, and it gave the same error. I haven't tried it on Ruby 1.9.1 yet, but I can.
Nik
A: 

The problem is because of a timeout. I had the same kind of problem while using the Doctrine ORM. In PHP we can solve this issue by changing the script execution time in the php.ini file, but I don't know how to change that setting in Rails. Maybe someone here will help you.

piemesons
Thanks! I thought about something in that ballpark as well. I will give it a try. Thanks again.
Nik
A: 

I found a solution:

Inspired by piemesons's suggestion, I chopped up the values that were to go into one single insert into groups of 10,000, so I end up with (n / 10000.0).ceil inserts.
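
Roughly something like this (a sketch rather than my exact code; it assumes the data sits in a hypothetical values array of [attrib_a, attrib_b] pairs):

conn = ActiveRecord::Base.connection
values.each_slice(10_000) do |batch|
  # build one multi-row INSERT per batch of 10,000 pairs, quoting each value
  rows = batch.map { |a, b| "(#{conn.quote(a)}, #{conn.quote(b)})" }.join(",")
  conn.execute("INSERT INTO tasks (attrib_a, attrib_b) VALUES #{rows}")
end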

And it worked!

It's still speedy, and I am still not sure what configuration limit was (and still is) preventing me from doing a single insert of that size. If anyone knows, please do offer it as a comment so we can all learn from this.

Best,

Nik
@Nik You should accept piemesons's answer and give him credit for this.
We love stackoverflow
Sorry for the delay; done.
Nik