Hi,

I have several hosts against which I want to run the same query. So imagine I have on each server a database db and a table test like:

mysql> desc test;
+-------+------------------+------+-----+---------+-------+
| Field | Type             | Null | Key | Default | Extra |
+-------+------------------+------+-----+---------+-------+
| id    | int(10) unsigned | NO   | PRI |         |       | 
| data  | varchar(255)     | YES  |     | NULL    |       | 
+-------+------------------+------+-----+---------+-------+

Obviously, each test table has different data, but they all have the same structure. Let's say I have two hosts, h1 and h2, which host these tables.

Now I need to write a script which runs the same query on each host and prints the results to standard output. The first idea was:

shell> (mysql -h h1 -u myusername -pmypwd -D db -e "select * from test";
        mysql -h h2 -u myusername -pmypwd -D db -e "select * from test";)
       > out.txt

I want to do it faster, so what I did was:

shell> (mysql -h h1 -u myusername -pmypwd -D db -e "select * from test" &
        mysql -h h2 -u myusername -pmypwd -D db -e "select * from test" &
        wait) > out.txt

The problem is that I get collisions in my output file, like:

458 Karma police
459 876 Paint it black  Everything in its right place
460 street spirit

There are two rows on line 2.

My guess was that mysql buffers the result, so a buffer flush could end in the middle of a row; in this example, the buffer would stop right after 459. But I can't figure out how to solve the problem.

Any ideas?

A: 

Use a loop:

for host in h1 h2 ; do
    mysql -h "$host" -u myusername -pmypwd -D db -e "select * from test" >> out.txt
done

Using >> appends to the file. The hosts are queried in order, not simultaneously, so all results from h1 are written first, then all results from h2. If that is insufficient, there's another option, but it's more complex.

Sorpigal
That was my first idea and it gives valid results, but I want to do it simultaneously, so I'm eager to hear your other option!
GuillaumeThomas
@GuillaumeThomas: The other option I could think of is trickery with named pipes. Write the output from each command to a separate pipe and then read them both into a third combined file when finished. The "tricky" part here is monitoring the commands and figuring out when they're finished. If what you want is to monitor the results in real time then that's still not sufficient.
Sorpigal
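
The named-pipe idea could be sketched like this. It's a minimal sketch, not a definitive implementation: the mysql invocations are simulated with printf so it runs standalone, and the sample rows are illustrative; you would swap in e.g. `mysql -h h1 -u myusername -pmypwd -D db -e "select * from test"`.

```shell
#!/bin/sh
# Sketch of the named-pipe variant: each query writes to its own FIFO
# and a single reader drains them into the combined file.
dir=$(mktemp -d)
mkfifo "$dir/h1" "$dir/h2"

# Producers, one per host.  printf stands in for the real mysql call.
printf '1\tKarma police\n' > "$dir/h1" &
printf '2\tPaint it black\n' > "$dir/h2" &

# Consumer: cat blocks on each FIFO until its writer closes it, so all
# of h1's output is copied before any of h2's -- no interleaving.
cat "$dir/h1" "$dir/h2" > out.txt
wait
rm -r "$dir"
```

As Sorpigal notes, the tricky part in the real version is knowing when each mysql process has finished; here the FIFO's end-of-file does that for free, because cat only moves on once the writer closes its pipe.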
Yes, I can imagine a method that runs several processes, each saving its data to a file, and then concatenates everything into one file. Actually, I was wondering about an option for buffer size or something like that.
GuillaumeThomas
Buffering the output on mysql's side would do no good, since it has no way to know whether the other instance is currently writing. To do what you seem to want, interleaving output lines from both sources, you'd need a more complex program that fetches lines as fast as they come, stores them, and outputs only when a lock can be obtained on the output stream. This is probably not possible in pure bash but could be done in a more sophisticated language (e.g. perl). If you want a pure bash solution, I recommend asking serverfault, or irc.freenode.net#bash
Sorpigal
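
The per-line locking idea described above can be approximated in shell with util-linux's flock(1), so it is not pure bash. This is a sketch under assumptions: the producers are simulated with printf (the real ones would be the two mysql commands), and the lock-file name is arbitrary.

```shell
#!/bin/sh
# Sketch of "lock per line": each producer's output is read line by line,
# and each whole line is appended under an exclusive lock, so lines from
# the two sources interleave but are never split mid-row.
: > out.txt
reader() {
    while IFS= read -r line; do
        # flock(1) creates out.txt.lock if needed and serializes the append
        flock out.txt.lock sh -c 'printf "%s\n" "$1" >> out.txt' _ "$line"
    done
}
# Real producers would be, e.g.:
#   mysql -h h1 -u myusername -pmypwd -D db -e "select * from test" | reader &
printf 'h1 a\nh1 b\n' | reader &
printf 'h2 a\nh2 b\n' | reader &
wait
```

Arrival order between the two hosts still varies from run to run; the lock only guarantees that every line in out.txt is intact.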