You could use a commit-log sort of format, similar to how Wikipedia does it.
Use a database, and have every saved change create a new row with an incremented revision number, superseding the previous record; then you only have to worry about taking table locks during the save phase.
That way, if two people happen to edit the same thing concurrently, both changes will appear in the history, and whichever one lost the commit race can be merged into a new revision.
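A minimal sketch of that append-only revision table, assuming SQLite and hypothetical table/column names (revisions, page_id, and so on):

    import sqlite3

    conn = sqlite3.connect("wiki.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS revisions (
            page_id   INTEGER NOT NULL,
            revision  INTEGER NOT NULL,
            author    TEXT,
            comment   TEXT,
            body      TEXT,
            saved_at  TEXT DEFAULT CURRENT_TIMESTAMP,
            PRIMARY KEY (page_id, revision)   -- rejects a duplicate revision if two writers race
        )
    """)

    def save_page(page_id, author, comment, body):
        # Each save INSERTs a new row with the next revision number;
        # the previous row is never updated, so the full history is kept.
        with conn:  # one transaction per save; this is the "lock during the save phase"
            next_rev = conn.execute(
                "SELECT COALESCE(MAX(revision), 0) + 1 FROM revisions WHERE page_id = ?",
                (page_id,),
            ).fetchone()[0]
            conn.execute(
                "INSERT INTO revisions (page_id, revision, author, comment, body) "
                "VALUES (?, ?, ?, ?, ?)",
                (page_id, next_rev, author, comment, body),
            )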
Now, if you don't want to use a database, you have to worry about keeping a revision history behind every visible file.
You could put a revision control system (Git/Mercurial/SVN) on the file system and then automate commits during the save phase.
Pseudocode:

    user->save:
        getWriteLock();
        write( $file );
        write_commitmessage( $commitmessagefile );   # author, comment, etc.
        call "hg commit -l $commitmessagefile $file";
        releaseWriteLock();
    done.
At least this way, when two people commit at the same time, neither change will get lost.
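In Python, the save step above might look roughly like this; the repository path, lock file name, and helper names are illustrative, and it assumes the page file is already tracked by Mercurial (run hg add once for brand-new pages):

    import fcntl
    import subprocess
    from pathlib import Path

    REPO = Path("/srv/wiki")  # assumed location of the Mercurial working copy

    def save_and_commit(relpath, content, author, comment):
        # getWriteLock(): an exclusive lock file serialises concurrent saves
        # (fcntl.flock is Unix-only; use whatever locking primitive fits your stack).
        with open(REPO / ".save.lock", "w") as lock:
            fcntl.flock(lock, fcntl.LOCK_EX)
            try:
                # write( $file )
                target = REPO / relpath
                target.write_text(content)

                # write_commitmessage( $commitmessagefile )
                msgfile = REPO / ".commitmessage"
                msgfile.write_text(comment + "\n")

                # hg commit -l $commitmessagefile $file
                subprocess.run(
                    ["hg", "commit", "-l", str(msgfile), "-u", author, str(target)],
                    cwd=REPO, check=True,
                )
            finally:
                # releaseWriteLock()
                fcntl.flock(lock, fcntl.LOCK_UN)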