views:

336

answers:

1

I'm looking to rewrite code that updates a table on a Sybase IQ database v14. The process does the following:

  1. selects all the records in the table and extracts some data to file
  2. updates the extracted to file flag for each record in the table

Currently, once a record is written to file, its extraction flag is updated. There are 40,000 records in the table and the process uses 40 GB of database temp space. Each record in the table contains 60 fields, and each column holds at most 120 characters.

Is the database server creating a new version of the table's data for each record updated, so that over time we build up a snowball of temp data on the server which accounts for the 40 GB used? Would the best method be to first extract the data, write it to file, and then perform a single bulk update? It is my understanding that Sybase IQ is generally used in the OLAP arena, so I'm thinking the database would be optimised for inserts, deletes, and selects but perform badly on updates. Also, would the Sybase IQ server perform the same on HP-UX as on Windows Server 2003?
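A minimal sketch of the extract-then-bulk-update approach described above (the table name `extract_table` and flag column `extracted_flag` are hypothetical placeholders, not from the original post):

```sql
-- 1. Extract all unextracted rows to file in one pass
--    (e.g. via isql output redirection or a client-side fetch).
SELECT *
FROM   extract_table
WHERE  extracted_flag = 'N';

-- 2. Once the file has been written successfully, flip the flag
--    for every extracted row in a single statement, instead of
--    issuing 40,000 per-row updates that each create new versions
--    of the affected pages.
UPDATE extract_table
SET    extracted_flag = 'Y'
WHERE  extracted_flag = 'N';
```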

+1  A: 

Actually, Sybase IQ is optimized for reads (think OLAP) and not so much for OLTP activity. During my own testing, I found that getting the data into Sybase IQ took the longest.

The fastest way to load data would be to use the LOAD TABLE bulk-load command.
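For illustration, a hedged example of the LOAD TABLE command (the table name, column list, file path, and delimiters are assumptions; check the IQ reference manual for the options your version supports):

```sql
-- Bulk-load a delimited flat file directly into an IQ table.
LOAD TABLE extract_table ( id, field1, field2 )
FROM '/data/extract.txt'
DELIMITED BY ','
ROW DELIMITED BY '\n'
ESCAPES OFF
QUOTES OFF
NOTIFY 10000;      -- emit a progress message every 10,000 rows
```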

SQLMenace