As an SEO, I feel your pain.
Forgive me if I'm wrong, but I'm assuming the problem is that your staging server carries its own robots.txt because you need to block search engines from finding and crawling the whole staging environment.
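Presumably something like this, the standard block-everything robots.txt (an assumption on my part about what's sitting on staging):

    User-agent: *
    Disallow: /

The trouble then starts when that file gets moved over to production along with everything else during a deploy.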
If this is the case, I would suggest placing your staging environment somewhere internal where this isn't an issue (an intranet-type setup or a network configuration that keeps staging off the public web). This can save you a lot of search engine trouble with that content getting crawled; say, for instance, someone deletes that robots.txt file from your staging by accident and you get a duplicate site crawled and indexed.
If that isn't an option, I'd recommend placing staging in a folder on the server, like domain.com/staging/, and using just one robots.txt file in the root folder to block out that /staging/ folder entirely. This way you don't need two files, and you can sleep at night knowing another robots.txt won't be replacing yours.
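That single root file would only need something along these lines (assuming /staging/ is the folder name; adjust to whatever you actually call it):

    User-agent: *
    Disallow: /staging/

Everything outside /staging/ stays crawlable, and there's no second robots.txt anywhere on the server to get moved by mistake.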
If THAT isn't an option, maybe ask them to add an item to their deployment checklist to NOT move that file? You'll just have to check it yourself after each deploy. A little less sleep, but a little more precaution.