views: 389 · answers: 6

Does anyone have a solution to remove those pesky `._*` and `.DS_Store` files that one gets after moving files from a Mac to a Linux server?

Ideally I could specify a start directory and let it recurse, e.g. from /var/www/html/ down...

A: 
cd /var/www/html && find . -name '.DS_Store' -print0 | xargs -0 rm
cd /var/www/html && find . -name '._*' -print0 | xargs -0 rm
mopoke
A: 
find . -name "FILE-TO-FIND"-exec rm -rf {} \;
Martin Beckett
Using exec for each file found is not fast. It is faster to have find print them out and then use xargs to invoke rm once.
X-Istence
However, if you have a `find` that supports it, `+` is xargs-like: `find . -name "FILE-TO-FIND" -exec rm -rf {} +` - (also, you're missing a space before `-exec`)
Dennis Williamson
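To illustrate the batched `-exec ... +` form from the comment above, here is a small self-contained sketch; the scratch directory and file names are made up for the demo:

```shell
# Build a throwaway tree containing some Mac metadata files.
dir=$(mktemp -d)
mkdir -p "$dir/a/b"
touch "$dir/.DS_Store" "$dir/a/.DS_Store" "$dir/a/b/._foo" "$dir/a/keep.txt"

# `{} +` packs as many found paths as fit into one rm invocation,
# much like piping through xargs (note the space before -exec):
find "$dir" -name '.DS_Store' -exec rm -f {} +
find "$dir" -name '._*' -exec rm -f {} +

find "$dir" -type f   # only keep.txt should remain
```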
+3  A: 

change to the directory, and use:

find . -name ".DS_Store" -print0 | xargs -0 rm -rf
find . -name "._*" -print0 | xargs -0 rm -rf

Not tested, try them without the xargs first!

You could replace the period after find, with the directory, instead of changing to the directory first.

find /dir/here ...
X-Istence
I know his question didn't ask for it, but I can never remember: does your example handle filenames with spaces?
Grundlefleck
Yes, that is what the `-print0` and the `-0` to xargs are for. Normally it wouldn't handle spaces correctly, but with `-print0` find terminates each filename with a null character, and xargs with `-0` uses that null as the delimiter. The full path is passed intact, with no chance of the whitespace splitting it into a second or third argument to the rm command, which could be really bad!
X-Istence
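A minimal sketch of that behavior, using a hypothetical `._` file whose name contains a space:

```shell
# Scratch directory with a metadata file whose name has a space in it.
dir=$(mktemp -d)
touch "$dir/._my file.txt"   # hypothetical name, chosen to contain a space

# Without -print0/-0, xargs would split this path on the space into
# two bogus arguments; the null delimiter keeps it as one argument.
find "$dir" -name '._*' -print0 | xargs -0 rm -f

ls -A "$dir"   # should print nothing
```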
@X-Istence - This won't recursively traverse sub-directories though, right?
JT
@X-Istence - What do you think about running this from /var/www/html and the directories below it in a cron job, say every hour? This box is a webserver, but files are uploaded frequently.
JT
@X: newer findutils supports a `-delete` action, which could shorten this. @JT: This searches recursively under `.`. Depends on how many files and subdirectories there are... can't you just forbid those from being uploaded?
ephemient
@JT: It will recurse; see `man find`. Using this in a cron job would be possible, but I'd rather see if your FTPD supports upload scripts, which can check what is uploaded and remove files like `._*` and `.DS_Store`. @ephemient: Yes, newer find will, however the last time I used find it was not available. Also, I believe there is a performance penalty to having find run the delete, as searching a directory and removing files at the same time can cause slowdowns (UFS has issues with this, from experience).
X-Istence
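If you do go the cron route, a crontab entry along these lines could work. This is only a sketch: the hourly schedule and the /var/www/html path are assumptions taken from the question, so verify both against your own setup before installing it with `crontab -e`:

```shell
# Hypothetical crontab line: at the top of every hour, prune Mac
# metadata files under /var/www/html (path assumed from the question).
0 * * * * find /var/www/html \( -name '.DS_Store' -o -name '._*' \) -delete
```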
+1  A: 

You could switch to zsh instead of bash. This lets you use ** to match files anywhere in a directory tree:

$ rm /var/www/html/**/._* /var/www/html/**/.DS_Store

You can also combine them like this:

$ rm /var/www/html/**/(._*|.DS_Store)

Zsh has lots of other features that bash lacks, but that one alone is worth making the switch for. It is available in most (probably all) Linux distros, as well as Cygwin and OS X.

You can find more information on the zsh site.

Dave Kirby
A possible problem: since you don't use xargs, you might run into the command-line argument length restriction. Also, Bash 4 supports `**` too :-) (though not by default, `shopt -s globstar` needs to be set)
ephemient
Could you name some more features that Zsh has and Bash does not?
Hai
+3  A: 
find /var/www/html \( -name '.DS_Store' -or -name '._*' \) -delete
OneOfOne
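A sketch of this approach on a throwaway directory. Note that find's implicit `-and` binds tighter than `-or`, so the two `-name` tests need parentheses for `-delete` to apply to both of them rather than only the second:

```shell
# Scratch tree with one of each kind of Mac metadata file.
dir=$(mktemp -d)
touch "$dir/.DS_Store" "$dir/._foo"

# The escaped parentheses group the -or expression so that
# -delete acts on every match, not just the '._*' files.
find "$dir" \( -name '.DS_Store' -o -name '._*' \) -delete

ls -A "$dir"   # should print nothing
```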
A: 

if you have Bash 4.0+

#!/bin/bash
shopt -s globstar
for file in /var/www/html/**/.DS_Store /var/www/html/**/._*
do
 echo rm "$file"   # drop the echo to actually delete
done
ghostdog74
Running a new copy of rm for each file is wasteful.
X-Istence