We maintain a huge set of files on our web servers. Yesterday we were surprised to find that a very important core file of the system had been reverted to an older version (probably by a person, since we do not have any automated scripts). Users were partially affected for half a day because of this.
Should I run some sort of automated script that scans the last-modified date of each file (is that possible?) and sends us an alert when a core file does not match what we expect? Or is there existing software that takes a fingerprint (checksum) of each file and sends alerts on changes? Please suggest what else can be done to prevent this sort of human error.
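To illustrate the fingerprinting idea I have in mind, here is a rough sketch: record a SHA-256 baseline for a list of core files, then re-run the script (e.g. from cron) and flag any file whose hash no longer matches. The file names, the baseline path, and the alert mechanism are all placeholders.

```python
import hashlib
import json
import os
import sys

BASELINE = "core_fingerprints.json"        # placeholder: where the baseline is stored
CORE_FILES = ["index.php", "config.php"]   # placeholder: list of core files to watch

def fingerprint(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def main():
    current = {p: fingerprint(p) for p in CORE_FILES if os.path.exists(p)}

    if not os.path.exists(BASELINE):
        # First run: record the current state as the trusted baseline.
        with open(BASELINE, "w") as f:
            json.dump(current, f, indent=2)
        print("Baseline recorded.")
        return

    with open(BASELINE) as f:
        baseline = json.load(f)

    # A file counts as changed if it is missing or its hash differs.
    changed = [p for p in CORE_FILES if baseline.get(p) != current.get(p)]
    if changed:
        # In practice this would send an email or page instead of printing.
        print("ALERT: changed files:", ", ".join(changed))
        sys.exit(1)
    print("All core files match the baseline.")

main()
```

Would something along these lines be reasonable, or is a dedicated file-integrity tool the better route?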