Ok, let's see if you Wizards can help me with this problem I'm banging my head against.

The scenario is a pretty standard setup with a point of sale (that's where I am) and a backoffice keeping track of warehouse information. The problem is that I need to synchronize the database at the point of sale with the current stock information from the backoffice. Right now this is done by fetching a catalog from the backoffice over the web. The catalog contains information such as currently available units, the current price, and a unique ID for each type. My database holds the exact same information, and I update it after fetching the catalog. It is important for me to insert new unit types as they appear, update already existing types with availability and price information, and finally delete types (or mark them unavailable) once they drop below a certain threshold.

The current implementation first marks all items as unavailable (in case they were dropped from the catalog), then issues a query like this for each item in the catalog:

INSERT INTO store (id, quantity, price) VALUES (x, y, z)
ON DUPLICATE KEY UPDATE quantity = y, price = z;

The id column is declared unique, so an out-of-date row in the database causes a key collision that triggers the update instead of inserting a duplicate. The main problem is that this issues one query per type in the catalog, and the catalog may grow rapidly.
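The per-item upsert loop described above could look roughly like this (a sketch in Python with SQLite, whose `ON CONFLICT ... DO UPDATE` clause plays the role of MySQL's `ON DUPLICATE KEY UPDATE`; the table and column names match the question, but the catalog rows are invented):

```python
import sqlite3

# In-memory stand-in for the point-of-sale database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE store (id INTEGER PRIMARY KEY, quantity INTEGER, price REAL)")
conn.execute("INSERT INTO store VALUES (1, 5, 9.99)")  # pre-existing, out-of-date row

# Invented catalog rows fetched from the backoffice: (id, quantity, price).
catalog = [(1, 7, 8.99), (2, 3, 4.50)]

# Current approach: one upsert statement per catalog item.
for item_id, qty, price in catalog:
    conn.execute(
        "INSERT INTO store (id, quantity, price) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET quantity = excluded.quantity, "
        "price = excluded.price",
        (item_id, qty, price),
    )

print(sorted(conn.execute("SELECT * FROM store")))  # [(1, 7, 8.99), (2, 3, 4.5)]
```

Row 1 collides on the primary key and gets updated; row 2 is new and gets inserted. The cost is one round trip per item, which is exactly the scaling problem the question asks about.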

Is there a faster solution to this problem?

A: 

INSERT INTO store (id, quantity, price) VALUES (x, y, z), (x2, y2, z2);

But I don't know how this jibes with your ON DUPLICATE KEY statement... you could perform some logic to split the data into groups and run the updates and inserts independently.

prodigitalson
The ON DUPLICATE clause will crumble with multi-line inserts. I could try fetching the IDs first, but wouldn't that just increase the load on my DB?
cdecker
Well, it depends on how many items we are talking about, but assuming you're selecting on the primary key and only returning a single column, I wouldn't think it would be that horrible. But I think you're either going to have to do a big read and then sort them out up front in order to do a multi-row insert, or you're going to have to insert them one at a time with automatic update failover for existing records.
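The "big read up front, then sort them out" idea from this comment could be sketched like this (Python; the `existing_ids` set and the catalog rows are invented for illustration, not the poster's code):

```python
# Invented catalog rows from the backoffice: (id, quantity, price).
catalog = [(1, 7, 8.99), (2, 3, 4.50), (5, 1, 2.00)]

# One big read up front -- in practice this would come from
# something like: SELECT id FROM store
existing_ids = {1, 5}

# Split the catalog: unknown ids become INSERTs, known ids become UPDATEs.
to_insert = [row for row in catalog if row[0] not in existing_ids]
to_update = [row for row in catalog if row[0] in existing_ids]

print(to_insert)  # [(2, 3, 4.5)]
print(to_update)  # [(1, 7, 8.99), (5, 1, 2.0)]
```

`to_insert` can then go into a single multi-row INSERT, and `to_update` into a batch of UPDATE statements, at the cost of one extra SELECT over the key column.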
prodigitalson
A: 
REPLACE INTO store (id, quantity, price) VALUES (x, y, z), (x2, y2, z2);

I'd also suggest not sending them to the webapp as SQL (horrible for security); send them as large batches (1,000 rows at a time) in some format you're comfortable with. The webapp does all the REPLACE statements and returns success.
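The batched-REPLACE idea could be sketched like this (Python with SQLite, whose `INSERT OR REPLACE` is the closest analogue to MySQL's `REPLACE INTO`; the batch size comes from the answer, the data is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE store (id INTEGER PRIMARY KEY, quantity INTEGER, price REAL)")
conn.execute("INSERT INTO store VALUES (1, 5, 9.99)")  # pre-existing row

catalog = [(1, 7, 8.99), (2, 3, 4.50)]  # invented (id, quantity, price) rows
BATCH = 1000  # batch size suggested in the answer

# Send the catalog in batches instead of one statement per item.
for start in range(0, len(catalog), BATCH):
    conn.executemany(
        "INSERT OR REPLACE INTO store (id, quantity, price) VALUES (?, ?, ?)",
        catalog[start:start + BATCH],
    )
    conn.commit()

print(sorted(conn.execute("SELECT * FROM store")))  # [(1, 7, 8.99), (2, 3, 4.5)]
```

Note that REPLACE-style statements delete and re-insert on collision rather than updating in place, which matters if other columns or foreign keys hang off the row.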

MindStalker
This is as close as it could get :-) Sadly, I have no say in what the interface between the offices should look like; otherwise I'd have only the updates pushed from the backoffice to me. The data is a strange pipe-delimited file, which wouldn't be all that bad if they didn't concatenate lines using pipes too, making it necessary to count elements to know which record we are in :-(
cdecker
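The format described in the last comment could be handled with a sketch like this (Python; the assumption that each record has exactly three fields, id, quantity, and price, is mine, taken from the catalog columns mentioned in the question):

```python
# Records are joined with pipes too, so the only way to find record
# boundaries is to count fields. Assume 3 fields per record (an assumption
# based on the catalog columns: id, quantity, price).
FIELDS_PER_RECORD = 3

def parse_catalog(raw: str):
    parts = raw.split("|")
    # Chunk the flat field list into records of FIELDS_PER_RECORD each.
    return [
        (int(parts[i]), int(parts[i + 1]), float(parts[i + 2]))
        for i in range(0, len(parts), FIELDS_PER_RECORD)
    ]

# Invented example: two records run together with pipe separators.
print(parse_catalog("1|7|8.99|2|3|4.50"))  # [(1, 7, 8.99), (2, 3, 4.5)]
```

This only works while no field ever contains a pipe and the record width stays fixed, which is exactly why counting elements is fragile.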