What I am trying to do: I have hundreds of servers with very large log files, spread across dozens of different clients. I am writing Python scripts to parse the logs in various ways, and I would like to aggregate the data I collect from all of the different servers. I would also like to keep the (frequently changing) scripts centralized. The idea is to have a harness that connects to each server, scps the script over, runs it with pexpect or something similar, and then either scps the resulting data back as separate files to be aggregated, or (preferably, I think) streams the output and aggregates it on the fly. I do not have SSH keys set up (nor do I want to set them up), but I do have a database with connection information: hosts, logins, passwords, and the like.
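To make it concrete, here is a rough sketch of the kind of harness I have in mind, using paramiko for password-based SSH. The `connection_info()` and `aggregate()` names are just placeholders for my database lookup and aggregation logic:

```python
import paramiko

def run_on_server(host, user, password, local_script, remote_script="/tmp/parse_logs.py"):
    """Copy the parsing script to one server, run it, and yield its output lines."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        # Push the current version of the script so the "master" copy stays centralized here.
        sftp = client.open_sftp()
        sftp.put(local_script, remote_script)
        sftp.close()

        # Run the script remotely and stream its stdout back line by line.
        stdin, stdout, stderr = client.exec_command("python {}".format(remote_script))
        for line in stdout:
            yield line.rstrip("\n")
    finally:
        client.close()

# connection_info() would come from my database of hosts, logins, and passwords.
for host, user, password in connection_info():
    for record in run_on_server(host, user, password, "parse_logs.py"):
        aggregate(record)  # placeholder for my aggregation logic
```

This avoids keys entirely and streams results instead of copying files back, but it feels like I am reinventing something.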
My question: this seems like a problem that has probably already been solved. Does anyone know of an existing tool that does this kind of thing, or of a solid, proven way of doing it?