Today I saw for the first time the potential of a partial accidental deletion of a colleague's home directory (2 hours lost in a critical phase of a project). It worried me enough to start thinking about the problem and a possible solution.

In his case a file named '~' somehow ended up in a test folder, which he later deleted with rm -rf... when rm reached that file, bash expanded it to his home folder (he managed to Ctrl-C almost in time). A similar problem could happen if one has a file named '*'.

My first thought was to prevent the creation of files with "dangerous names", but that alone would still not solve the problem, as mv or other corner cases could lead to the same risky situation.

My second thought was to create a listener (I don't know if that's even possible) or an alias of rm that checks which files it is about to process, and if it finds a dangerous one, skips it and prints a message. Something similar to this (a rough sketch follows the list):

  1. take all non-option arguments (so as to get the files one wants to delete)
  2. loop over these items
  3. check whether the current item equals a dangerous item (say '~' or '*'); I don't know if this works: at this point, has the item already been expanded or not?
  4. if so, echo a message and don't touch the file
  5. otherwise proceed with the iteration
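
Here is the rough sketch (safe_rm is just a placeholder name, and it assumes GNU readlink -f is available; I'm guessing the shell has already expanded '~' and '*' by the time the function runs, so it checks the resolved arguments instead):

safe_rm() {
    local args=() item resolved
    for item in "$@"; do
        # 1. keep options as-is, only inspect file arguments
        if [[ "$item" == -* ]]; then
            args+=("$item")
            continue
        fi
        # 2./3. '~' and '*' are already expanded here, so resolve the
        # argument and compare it against dangerous targets instead
        resolved=$(readlink -f -- "$item" 2>/dev/null) || resolved="$item"
        if [[ "$resolved" == "$HOME" || "$resolved" == "/" ]]; then
            echo "safe_rm: skipping dangerous item '$item'" >&2   # 4. warn, leave it alone
            continue
        fi
        args+=("$item")   # 5. safe, keep it
    done
    (( ${#args[@]} )) && command rm "${args[@]}"
}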

Third thought: has anyone already done or dealt with this? :]

+7  A: 

There's actually pretty good justification for having critical files in your home directory checked into source control. As well as protecting against the situation you've just encountered, it's nice to be able to version control .bashrc, etc.
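
For example, one way to set this up (just a sketch; git is my choice here, any VCS works) is a repository rooted in $HOME that ignores everything except a whitelist:

cd "$HOME"
git init
cat > .gitignore <<'EOF'
# ignore everything by default...
*
# ...except the files we explicitly track
!.gitignore
!.bashrc
!.profile
EOF
git add .gitignore .bashrc .profile
git commit -m "check in critical dotfiles"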

Andrew
A: 

The version control suggestion gets an upvote from me. I'd recommend that for everything, not just source.

Another thought is a shared drive on a server that's backed up and archived.

A third idea is buying everyone an individual external hard drive for backing up their local drive. This is a good thing to do because there are two kinds of hard drives: those that have failed and those that will fail in the future.

duffymo
A: 

You could also create an alias for rm that runs through a simple script that escapes all characters, effectively stopping you from using wildcards. Then create another alias that runs the real rm without escaping. You would only use the second one if you are really sure. But then again, that's kind of the point of rm -rf.

Another option I personally like is to create an alias that redirects through a script and then passes everything on to rm. If the script finds any dangerous characters, it prompts Y/N to ask whether you want to continue: N cancels the operation, Y continues as normal.
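
A sketch of that second option (careful_rm is a made-up name, and the "dangerous" patterns here are just the ones from the question):

careful_rm() {
    local item reply
    for item in "$@"; do
        # flag arguments that look like expansion accidents
        if [[ "$item" == "$HOME" || "$item" == "~" || "$item" == *"*"* ]]; then
            read -r -p "careful_rm: '$item' looks dangerous, continue? [y/N] " reply
            [[ "$reply" == [Yy]* ]] || { echo "aborted" >&2; return 1; }
        fi
    done
    command rm "$@"
}
alias rm='careful_rm'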

tschaible
+1  A: 

Since the shell probably expands the parameter, you can't really catch 'dangerous' names like that.

You could alias 'rm -rf' to 'rm -rfi' (interactive), but that can be pretty tedious if you actually mean 'rm -rf *'.

You could alias 'rm' to 'mv $@ $HOME/.thrash', and have a separate command to empty the thrash, but that might cause problems if you really mean to remove the files because of disk quotas or similar.
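
(An alias can't use $@ in bash, so in practice this would be a small function; a minimal sketch, keeping the folder name from above:)

thrash() {
    mkdir -p "$HOME/.thrash"
    mv -- "$@" "$HOME/.thrash/"
}
empty_thrash() {
    # really delete; the -- guards against option-like file names
    rm -rf -- "$HOME/.thrash" && mkdir -p "$HOME/.thrash"
}
alias rm='thrash'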

Or, you could just keep proper backups or use a file system that allows "undeletion".

Christoffer
You probably mean "trash" instead of "thrash".
Dennis Williamson
+2  A: 

Accidents do happen. You can only reduce their impact.

Both version control (regular checkins) and backups are of vital importance here.

If I can't check in (because it does not work yet), I back up to a USB stick.

And as the deadline approaches, the backup frequency increases, because Murphy strikes at the most inappropriate moment.

Gamecat
+1  A: 

One thing I do is always have a file called "-i" in my $HOME: if a stray "rm *" is run there, the expansion picks up the "-i" file, rm treats it as an option, and it prompts before deleting.

My other tip is to always use "./*" or find instead of plain "*".
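
Both tips in one quick sketch ('junk*' is just an example pattern; -delete assumes GNU find):

touch -- ./-i                  # sentinel: a stray 'rm *' now picks up '-i' and prompts
rm -rf ./*                     # the './' prefix keeps file names from being parsed as options
find . -name 'junk*' -delete   # or let find select the files, with no shell globbing involved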

David Schmitt
A: 

At one company where I worked, we had a cron job that ran every half hour and copied all the source code files from everyone's home directory to a backup directory structure elsewhere on the system, using nothing but find.

This wouldn't prevent actual deletion but it did minimise the work lost on a number of occasions.
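
Something along these lines (all paths and file patterns here are hypothetical; cpio's pass-through mode preserves the directory structure):

#!/bin/sh
# crontab entry: */30 * * * * /usr/local/bin/backup_sources.sh
DEST="/var/backups/sources/$(date +%Y%m%d%H%M)"
mkdir -p "$DEST"
for dir in /home/*; do
    find "$dir" \( -name '*.c' -o -name '*.h' -o -name '*.java' \) -print |
        cpio -pdm "$DEST"
done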

Dave Webb
A: 

This is pretty odd behaviour really - why is bash expanding twice?

Once * has expanded to

old~
this~
~

then no further substitution should happen!

I bravely tested this on my Mac, and it just deleted the file named ~, not my home directory.

Is it possible your colleague somehow wrote code that expanded it twice?

e.g.

ls | xargs rm -rf
Alex Brown
No, not twice, he just had a file named '~'. I added the '*' case because it is similar.
Alberto Zaccagni
A: 

You may disable file name generation (globbing):

set -f
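
For example (a quick sketch): with globbing off, the '*' reaches rm as a literal argument instead of expanding:

set -f      # disable globbing
rm -rf *    # '*' is passed literally; at worst this removes a file actually named '*'
set +f      # re-enable globbing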

Escaping special chars in file paths could be done with Bash builtins:

filepath='/abc*?~def'                      # a sample path with glob characters and a tilde
filepath="$(printf "%q" "${filepath}")"    # backslash-escape shell-special characters
filepath="${filepath//\~/\\~}"             # also replace any remaining literal '~' with '\~'
printf "%s\n" "${filepath}"                # print the escaped result
A: 

I use this in my ~/.bashrc

alias rm="rm -i"

rm prompts before deleting anything, and the alias can be circumvented either with the -f flag, or by escaping, e.g. \rm file

Degrades the problem, yes; solves it, no.

Benjamin