I have a MySQL database that I am trying to migrate into another database. They have different schemas, and I have written a PHP script for each table of the old database to populate its data into the new one. The script runs without errors, but it does not move all the data: for example, all of a table's rows are selected for insertion into the new table, yet only half of them actually arrive. The way I am doing it, I open a connection to the source database, SELECT * and put the result into an associative array, close that connection, connect to the target database, and then go through each element of the array and insert it. Is there a limit to how big an array can be? What is wrong here?

A: 

Your server (as all servers do) will have a memory limit for PHP; if you use more than the assigned limit, the script will fail.

Is it possible to just dump the current MySQL database into text files, perform find-and-replace or regexp-based replacements to change the schema within those text files, and then reload the amended text files into MySQL to complete the change? If this is a one-off migration, that may be the better way to do it.
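A minimal sketch of that dump-and-rewrite workflow. The database, user, and table names (`old_db`, `new_db`, `users`, `members`) are all hypothetical; the real dump and import commands are shown as comments since they need a live server, and a tiny fake dump stands in for the real one so the rewrite step can be demonstrated:

```shell
# Step 1 (on the source server) would be:
#   mysqldump -u USER -p old_db > dump.sql
# Fabricate a tiny dump here so the rewrite step can be shown end to end.
printf 'CREATE TABLE `users` (\n  `id` INT,\n  `name` VARCHAR(64)\n);\n' > dump.sql

# Step 2: rewrite schema names in the dump text; here we rename the
# hypothetical `users` table to `members`.
sed -i.bak 's/`users`/`members`/g' dump.sql
cat dump.sql

# Step 3 (on the target server) would be:
#   mysql -u USER -p new_db < dump.sql
```

The same `sed` pass can rename columns or databases; for anything beyond simple renames, a scripted migration is safer than text substitution.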

Lucanos
That seems like a hassle, but thanks for the info.
jsd911
A: 

You may be running into PHP's execution time or memory limits. Make sure the appropriate settings in php.ini are high enough to allow the script to finish executing.

Borealid
Can you tell me what I need to look for in there?
jsd911
@Shahin Kian: max_execution_time and memory_limit are the two prime suspects.
Borealid
Thanks, I will do that along with what 81403 suggested, and I think it should work.
jsd911
+2  A: 

You may have constraints in the target database that are rejecting some of your attempted inserts.

Brian Hooper
+1 because this is what really happens, without any dramatic error messages. Check your FKs. And remember, there are no FKs on a MyISAM table...
ran2
+1  A: 

Why not do this via SQL scripts?

If you prefer to do it via php then you could open connections to both databases and insert to target as you read from source. That way you can avoid using too much memory.
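A sketch of that two-connection approach using mysqli. The hostnames, credentials, and the `users`/`members` table and column names are all placeholders, not anything from the question:

```php
<?php
// Read from the source and insert into the target one row at a time,
// so no table-sized array is ever held in memory.
$source = new mysqli('localhost', 'user', 'pass', 'old_db');
$target = new mysqli('localhost', 'user', 'pass', 'new_db');

// Prepare the insert once; the bound variables are refreshed per row.
$insert = $target->prepare('INSERT INTO members (name, email) VALUES (?, ?)');
$insert->bind_param('ss', $name, $email);

// MYSQLI_USE_RESULT streams the result set instead of buffering it all.
$result = $source->query('SELECT name, email FROM users', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    $name  = $row['name'];
    $email = $row['email'];
    if (!$insert->execute()) {
        // Constraint violations surface here instead of failing silently.
        echo 'Insert failed: ' . $insert->error . "\n";
    }
}

$result->free();
$insert->close();
$source->close();
$target->close();
```

Logging each failed `execute()` also answers the original question: if half the rows are missing, the errors will say which rows were rejected and why.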

cherouvim
I had a problem connecting to 2 databases at a time! Plus, how would I go about reading one row at a time? (SELECT * from the table, one row at a time?)
jsd911
You add the parameter `LIMIT x,y` to your query, X being the first row to read and Y the number of rows to read. Instead of a static X, you use a PHP variable that increments itself in a `while` or `for` loop.
Squ36
+5  A: 

You should read the rows from the first database in chunks (of 1000 rows for example), write those rows to the second database, clean the array (with unset() or an empty array) and repeat the process until you read all the rows. This overcomes the memory limitations.

Another problem might be that the script runs for too long (if the table is very large), so try using the function set_time_limit(). This function restarts the timeout after which a script is terminated. I suggest calling it after processing each chunk.
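The chunked approach might look like this (a sketch; the connection details and the `users`/`members` table and column names are placeholders):

```php
<?php
// Copy rows in chunks of 1000: fetch a chunk, insert it, free the
// array, and restart the execution timer before the next chunk.
$source = new mysqli('localhost', 'user', 'pass', 'old_db');
$target = new mysqli('localhost', 'user', 'pass', 'new_db');

$chunkSize = 1000;
$offset = 0;

do {
    set_time_limit(30); // reset the 30-second timeout for each chunk

    $result = $source->query(
        "SELECT id, name FROM users LIMIT $offset, $chunkSize"
    );

    $rows = [];
    while ($row = $result->fetch_assoc()) {
        $rows[] = $row;
    }
    $result->free();

    foreach ($rows as $row) {
        $target->query(sprintf(
            "INSERT INTO members (id, name) VALUES (%d, '%s')",
            $row['id'],
            $target->real_escape_string($row['name'])
        ));
    }

    $count = count($rows);
    unset($rows);        // release the chunk before fetching the next one
    $offset += $chunkSize;
} while ($count === $chunkSize); // a short chunk means we reached the end
```

At no point does the script hold more than one chunk in memory, so the memory_limit stops being a factor regardless of table size.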

81403
Thank you, I will let you know how it works.
jsd911
Thanks a lot, it helped out a lot!
jsd911
+3  A: 

First of all, I don't see the point in writing a script to do this. Why don't you just get a SQL dump from phpMyAdmin and edit it so that it fits the other database? Or are they that different?

But to reply to your question: my first thought, like other people already said, is that the problem is the time limit. Before you try to do something about it, check the value of max_execution_time in php.ini (usually around 30 seconds) and how long the script takes to execute. If it terminates after roughly 30 seconds (or whatever max_execution_time is set to), then that is likely the problem, although PHP should throw an error (or at least a warning).

I don't think there's a limit on the size of an array in PHP. However, there is a directive in php.ini, namely memory_limit, that defines the amount of memory a script can use.

If you have access to your php.ini file, I suggest setting both max_execution_time and memory_limit to a higher value. If you don't have access to php.ini, you may not be able to change the memory_limit directive (some hosts also disable runtime overrides). In that case you will have to work your way around it, for example by using LIMIT in your SQL. Be sure to unset your used variables, or you could run into the same problem.
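When the host does allow runtime overrides, both limits can be raised from within the script itself; this is a sketch, and the `512M` value is just an example:

```php
<?php
// Runtime overrides for the two limits discussed above. Some hosts
// disable these calls, so check the return values rather than assuming.
ini_set('memory_limit', '512M'); // returns false if the change is refused
set_time_limit(0);               // 0 = no execution time limit

echo 'memory_limit: ', ini_get('memory_limit'), "\n";
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
```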

Landervdb
+1  A: 

Using PHP to do the transform/convert logic is a possibility. I would do it if you are doing complex transformations and if your PHP skills are much better than your MySQL skill set.

If you need more memory in your php script use:

memory_limit = 2048M
max_execution_time = 3600

This will give you 2 GB of possible space for the array and about an hour of processing time. But if your database is really this big, it would be much (really, a lot) faster to use:

1. mysqldump, to make a dump of your source-server

Check it here: http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html

2. Upload the dump file and import it. There are a bunch of examples on the MySQL documentation page. (Look also in the comments.)

After this you can transform your database through CREATE/SELECT-statements.

CREATE TABLE one SELECT * FROM two;
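Such a statement can also reshape columns on the way through. The database, table, and column names here are hypothetical, purely to illustrate the pattern:

```sql
-- Copy and transform in one statement; all names below are hypothetical.
CREATE TABLE new_db.members
SELECT id,
       CONCAT(first_name, ' ', last_name) AS full_name,
       email
FROM old_db.users;
```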

As an alternative you can use UPDATE statements. Which is best depends heavily on the kind of job you are doing.

Good luck!

Richard
+1  A: 

It would be preferable to do a MySQL dump at the command line:

mysqldump -a -u USER_NAME -p SOURCE_DATABASE_NAME > DATA.sql

You can also gzip the file to make it smaller for transfer to another server:

gzip DATA.sql

After the transfer, unzip the file:

gunzip -f DATA.sql.gz

And import it:

mysql -u USER_NAME -p TARGET_DATABASE_NAME < DATA.sql

Chris Livdahl