I have a central repository with a subset of files that I want to protect from being changed (via push) by other users. If I add these files to .gitignore, they will not be cloned. Is it possible to let everyone clone all the files, but after cloning add some of them to .gitignore on the client side?
I initially thought about a filter driver (see the Pro Git book), which would:
- on the smudge step, save your files' content
- on the clean step, restore your files' content.
But that is not a good solution, since those scripts are meant for stateless file-content transformations (see this SO answer).
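For illustration only, such a driver would be wired up roughly like this (the path and the filter name "protect" are placeholders I made up, not anything from your repo); note how it has to keep state on the side, which is exactly what filters are not designed for:

    # .gitattributes -- "config/protected.conf" and "protect" are placeholder names
    config/protected.conf filter=protect

    # per-clone setup (filter drivers live in local config, so they aren't pushed either):
    # smudge keeps a copy of the checked-out content; clean throws away local edits
    # and emits that saved copy instead
    git config filter.protect.smudge 'tee .git/protected.orig'
    git config filter.protect.clean  'cat > /dev/null; cat .git/protected.orig'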
You can try enforcing a save/restore mechanism in hooks (see the same SO answer), but note that it will be local to your repo: it will protect your files in your repo only, since hooks aren't pushed.
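As a sketch of what such a hook could look like (the protected paths below are placeholders, and it only guards commits made in the clone where the hook is installed):

    #!/bin/sh
    # .git/hooks/pre-commit -- abort any commit that stages changes to protected paths
    PROTECTED="config/protected.conf scripts/deploy.sh"   # placeholder list
    for f in $PROTECTED; do
        if git diff --cached --name-only | grep -qx "$f"; then
            echo "pre-commit: '$f' is protected; unstage it before committing." >&2
            exit 1
        fi
    done
    exit 0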
You can also use:
git update-index --assume-unchanged file
See "With git, temporary exclude a changed tracked file from commit in command line", again a local protection only. That will protect them against external push to your repo ("import"), but if you publish them ("export"), than can be modified on the client side.
Is there a specific reason why Git must itself be the answer to this?
How about making the files read-only, and dictating as policy that these files shouldn't be pushed?
Sometimes a technological solution is not the simplest way.
If somebody does push changes to these files, these changes can always be reverted.
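For example, assuming one of the protected files is config/protected.conf (a placeholder name), whoever maintains the central repository can undo an unwanted change with something like:

    # find the offending commit, then undo it with a new commit
    git log --oneline -- config/protected.conf
    git revert <bad-commit>

    # or restore just that file from a known good commit
    git checkout <good-commit> -- config/protected.conf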