I have a PHP file which regularly writes to and reads tables from a MySQL DB.
I have another Ruby file that runs at the same time and reads from and writes to the same tables in the MySQL DB.
Can there be potential issues with this setup?
Yes, there are. You should be using transactions to ensure that only one client modifies a given set of rows at a time, and so that you can roll back if there is an error.
It also depends on what storage engine you are using: transactional engines such as InnoDB support transactions, while MyISAM tables cannot be rolled back.
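As a rough sketch, you can check and, if necessary, change the engine directly in SQL; the table name my_table below is just a placeholder:

    -- Check which storage engine the table uses; only InnoDB (and other
    -- transactional engines) support COMMIT/ROLLBACK, MyISAM does not.
    SHOW TABLE STATUS LIKE 'my_table';

    -- Convert the table to InnoDB if it is still MyISAM.
    ALTER TABLE my_table ENGINE=InnoDB;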
MySQL doesn't care whether the client is PHP or Ruby. Yes, there are potential issues with multiple clients accessing the database simultaneously, though. There's a lot that can go wrong.
Just one random example:
Client 1 intends to a) subtract 10 from integer column_x (currently 10), and then b) add this 10 to integer column_y.
Imagine Client 1 has finished step a), but before it manages to do b), Client 2 comes along, intending to do something only if either column_x or column_y is positive. It erroneously does nothing, because the 10 has not yet arrived in column_y.
Basically, you're getting all the problems of multiple execution threads with a shared state.
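A minimal sketch of that interleaving, using a hypothetical balances table with integer columns column_x and column_y:

    -- Client 1, step a): take 10 out of column_x.
    UPDATE balances SET column_x = column_x - 10 WHERE id = 1;

    -- Client 2 runs now, between Client 1's two steps. Both columns are 0
    -- at this instant, so the check finds nothing and Client 2 skips its work.
    SELECT id FROM balances WHERE id = 1 AND (column_x > 0 OR column_y > 0);

    -- Client 1, step b): the 10 arrives in column_y, too late for Client 2.
    UPDATE balances SET column_y = column_y + 10 WHERE id = 1;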
To address such issues, transactions exist. MySQL provides several transaction isolation levels, and whether transactions are available at all depends on the storage engine of your tables.
Thus, all the modifications (and, for the REPEATABLE READ and SERIALIZABLE isolation levels, the reads as well) must be executed within a database transaction, starting with a START TRANSACTION statement and ending with a COMMIT statement. If there's a problem in the middle of your routine and you can't finish the transaction as a whole, execute ROLLBACK to undo all the changes made since the START TRANSACTION call.
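A minimal sketch of the earlier transfer wrapped in a transaction (the balances table is again hypothetical, and the tables are assumed to be InnoDB):

    START TRANSACTION;

    -- Both steps of the transfer happen inside one transaction, so other
    -- clients either see the state before it or the state after it,
    -- never the half-done state in between.
    UPDATE balances SET column_x = column_x - 10 WHERE id = 1;
    UPDATE balances SET column_y = column_y + 10 WHERE id = 1;

    -- Make both changes visible atomically.
    COMMIT;

    -- If something went wrong before the COMMIT, undo everything instead:
    -- ROLLBACK;

Both clients can issue these statements through their own drivers: in PHP, PDO exposes beginTransaction(), commit() and rollBack(); in Ruby, ActiveRecord wraps the same statements in ActiveRecord::Base.transaction. Either way, the transaction itself is handled by MySQL, not by the language.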