I need to replace more than 20 000 names with new names I created, given the CodeID.

For example: I must update all rows that contain "dog" (which has a CodeID of 1) with "cat", and update all rows that contain "horse" (which has a CodeID of 2) with "bird", etc.

1st SQL statement: UPDATE animalTable SET cDescription = "cat" WHERE CodeID = 1

2nd SQL statement: UPDATE animalTable SET cDescription = "bird" WHERE CodeID = 2

These statements work, but I need a faster way to do this because I have over 20 000 names.

Thank you in advance.

A: 

update animalTable set cDescription = "cat" where someColumnThatKnowsThatThisIsCat = theCatValue

Omu
I don't understand the answer, can you give more details...
RageZ
+1  A: 

That's the fastest way you can do it.

Or do you want to update all records in a single command?

You can do an update with a join (fixed syntax... haven't used this one in a while):

UPDATE animalTable 
INNER JOIN CodeTable ON animalTable.CodeID = CodeTable.ID 
SET animalTable.cDescription = CodeTable.Description_1;

Another option is to split the updates into smaller batches. This reduces the time the table is locked, but the total time of the updates will be longer (it's just an improvement in perceived performance). You can do that by updating only certain ID ranges in each batch.
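
For example, an untested sketch of two batches built on the join above (the range boundaries are just placeholders, not taken from the question):

UPDATE animalTable 
INNER JOIN CodeTable ON animalTable.CodeID = CodeTable.ID 
SET animalTable.cDescription = CodeTable.Description_1 
WHERE animalTable.CodeID BETWEEN 1 AND 5000;

UPDATE animalTable 
INNER JOIN CodeTable ON animalTable.CodeID = CodeTable.ID 
SET animalTable.cDescription = CodeTable.Description_1 
WHERE animalTable.CodeID BETWEEN 5001 AND 10000;

...and so on until every CodeID range has been covered.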

Also, you could move that data into a separate table, since it is not normalized at the moment. If the description lives only in the lookup table, a rename never has to touch every animal row again.
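
An untested sketch of that idea, assuming CodeTable already holds the current names (as in the join above):

-- Drop the duplicated description from animalTable and keep it only in CodeTable
ALTER TABLE animalTable DROP COLUMN cDescription;

-- A rename then only touches a single row in the lookup table:
UPDATE CodeTable SET Description_1 = 'cat' WHERE ID = 1;

-- Reads get the name back through a join:
SELECT a.*, c.Description_1
FROM animalTable a
INNER JOIN CodeTable c ON a.CodeID = c.ID;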

Heiko Hatzfeld
Thanks so much. I just had to edit a few things: UPDATE animalTable INNER JOIN CodeTable ON animalTable.CodeID = CodeTable.ID SET animalTable.cDescription = CodeTable.Description_1; Thanks again, much appreciated.
sami
+1  A: 

You could use a CASE expression to do it in a single UPDATE:

UPDATE animaltable SET cDescription = CASE codeID WHEN 1 THEN 'cat' WHEN 2 THEN 'bird' ... END

Fermin
+2  A: 

You might want to create a temporary table that holds the translation values and update based on that.

For example:

create table #TRANSLATIONS
(
    from_name   varchar(32),
    to_name     varchar(32)
)

Then, insert the translation values:

insert into #TRANSLATIONS (from_name, to_name) values ('dog','cat')

Finally, update based on that:

update m
set    m.myvalue = t.to_name
from   MYTABLE m,
       #TRANSLATIONS t
where  m.myvalue = t.from_name

(Untested, off the top of my head).

Andrew
you need a join clause for those two tables I think...
codeulike
Like I said -- untested, off the top of my head. :-) Doesn't the 'where' clause implement the join?
Andrew
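
For comparison, here is the same update written with an explicit join clause instead of the comma-and-where form (equally untested, same made-up table and column names as above):

update m
set    m.myvalue = t.to_name
from   MYTABLE m
       inner join #TRANSLATIONS t on m.myvalue = t.from_name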
+1  A: 

Supposing you have a file like this:

1,cat
2,bird
...

I would write a script which reads that file and executes an update for each row.

In PHP:

$f = fopen("file.csv", "r");
while ($row = fgetcsv($f, 1024)) {
    // $row[0] is the CodeID, $row[1] is the new name
    $sql = "update animalTable set cDescription = '" . mysql_real_escape_string($row[1]) . "' where CodeID = " . (int)$row[0];
    mysql_query($sql);
}
fclose($f);
Keeper
what is the 1024 in brackets?
sami
Taken from the documentation (http://us.php.net/fgetcsv): it must be greater than the longest line (in characters) to be found in the CSV file (allowing for trailing line-end characters). It became optional in PHP 5; omitting this parameter (or setting it to 0 in PHP 5.0.4 and later) means the maximum line length is not limited, which is slightly slower.
Keeper
How would I execute this script? Sorry, I'm new at MySQL.
sami
It's PHP: you can't execute it *inside* MySQL. I was suggesting using a scripting language to parse a file and execute several queries against your db, so PHP is just a sample: you can use any language that can read files and query your db (almost any language can do it).
Keeper
OK, thanks for your help.
sami