My application stores location data from GPS inputs. When importing a GPX file, a user can have anywhere from 500 to 10,000 GPS datapoints. Right now I have a model for each trackpoint. It works great, but inserts are SLOW: 30+ seconds for 10,000 datapoints. Is there a better way of bulk inserting?
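To make the shape of the problem concrete, here's roughly what my per-point import amounts to at the SQL level (the `trackpoints` table, its columns, and the `points` structure are illustrative, not my exact schema): one INSERT statement per point, so 10,000 points means 10,000 round trips.

```ruby
# Illustrative sketch: each parsed GPX point becomes its own INSERT.
# This is the pattern I'd like to avoid; names are placeholders.
def insert_statements(points)
  points.map do |pt|
    "INSERT INTO trackpoints (lat, lon, recorded_at) " \
    "VALUES (#{pt[:lat]}, #{pt[:lon]}, '#{pt[:time]}')"
  end
end

points = [
  { lat: 47.6062, lon: -122.3321, time: "2012-01-01T00:00:00Z" },
  { lat: 47.6063, lon: -122.3322, time: "2012-01-01T00:00:05Z" },
]

stmts = insert_statements(points)
puts stmts.length  # one statement per point
```

With a real 10,000-point file, that `map` produces 10,000 separate statements, which is where the 30+ seconds goes.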
All the time is spent on the SQL side: each individual insert is quick, but 10,000 of them add up fast. Each user might have 100 files, and with 100 users that becomes a very long total insert time. Not all at once, of course.
I'd be happy to change the application architecture if that would help; I'm just not sure what alternatives I have here. I only ever use the GPS data as a unit. I never search for a single record in the set, so a full ActiveRecord model per trackpoint is overkill.
I'd hate to have to do a whole queue system just to handle this one silly insert.