views: 867
answers: 18

To find files containing a particular string, I often use:

find . -name '*' | xargs grep -iH "string"

+2  A: 

My top commands:

  • vi (my editor of choice for almost all configuration tasks)
  • top (list running tasks and show performance statistics)
  • tar/gzip (create, extract, and compress archives)
  • grep (search for text in files using regular expressions)
  • man (command manual reference)
  • kill (terminate running processes)
  • chmod (change and manage permissions)
CMS
A: 

I often find myself needing to find a file that contains a particular string. For that I use:

grep -ri 'string'

There is such a wide range of things I need to do on a regular basis that it is hard to pick out specific commands. Mostly it is lots of combinations of sed, awk, grep, and find, with some random utilities thrown in for processing. One such combination is sketched below.
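
For instance, a pipeline along these lines (just an illustrative sketch; the '*.c' pattern and the TODO marker are placeholders) counts TODO markers in recently modified C files, busiest files first:

# count TODO markers in C files modified in the last week
find . -name '*.c' -mtime -7 -print0 \
  | xargs -0 grep -Hc 'TODO' \
  | awk -F: '$2 > 0 {print $2, $1}' \
  | sort -rn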

Drakonite
Note: Cygwin's grep doesn't assume you're recursing from ".". For example, a case-insensitive recursive grep looking for "foo", starting in directory dir, is: grep -ri foo dir. To restrict it to files matching *.txt and show filenames: grep --include="*.txt" -rHi foo dir
Markc
+1  A: 
#!/bin/sh
ps axxw | grep $1 | grep -v grep | grep -v boost 
sudo renice -20 `ps axww | grep $1 | grep -v grep | grep -v boost | awk '{print $1}'`

Calling the script boost, I use it to give the highest priority to the named application.
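
If pgrep is available, the same idea can be written more compactly; this is only a sketch of the alternative, not the script above:

#!/bin/sh
# boost-like sketch: raise the scheduling priority of processes whose name
# matches the first argument; pgrep matches on the process name, so the
# grep -v filtering isn't needed. A negative nice value requires root.
sudo renice -n -20 -p $(pgrep "$1")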

boost
The "ps axxw" on the second line was meant to be "ps axww", as in the third line?
PolyThinker
Good question. It's been so long since I was on the Mac OS X box I developed that on that I honestly can't remember. I now work in an exclusively Wintel environment.
boost
A: 

Not exactly a regular command, but rather the shortcut Ctrl+R for reverse incremental search through your bash command history.

Himanshu
+2  A: 

I use find's -exec option fairly often. For instance, I often want to change the permissions for a whole directory tree, giving the directories execute permissions, but not the files. I do this in two steps:

find root_dir -type d -exec chmod 555 {} \;
find root_dir -type f -exec chmod 444 {} \;

The above would make the whole tree read-only for everyone, but still allow anyone to cd into any directory.

Scotty Allen
In zsh: chmod 555 root_dir/**/*(/), chmod 444 root_dir/**/*(.)
orip
+2  A: 

For finding which directories take up the most space (for potential cleanup), start at the desired level, such as /home, and execute:

cd /home
du -s * | sort -r -k1 -n

This gives a list sorted by space used (largest first), such as:

4413700  bob
   6308  alice
   4284  george
     84  daniel
     16  lost+found

You can then run the same command from within /home/bob:

cd /home/bob
du -s * | sort -r -k1 -n

to get:

4413600  p0rn
    100  src

Hence you now know what's using up most of the space on the /home filesystem (and Bob will soon be getting his marching orders :-).

paxdiablo
I like du -sh * for human-readable sizes (not good for sorting, though)
Blorgbeard
A: 

I use who -T | sort to get a sorted list of logged-in users. Also, to get a sorted list of groups (rather than the normal unsorted list), I use groups | tr ' ' '\n' | sort | tr '\n' ' ' && echo.

mipadi
+3  A: 

Mine are:

awk - for filtering and extracting fields
find - for finding files/directories
xargs - build command lists; I often use it with find (see the sketch after this list)
less - for quickly browsing/reading files
man/info - for viewing man pages and info pages
emacs - for editing source code
irssi - to get in touch with other developers
cd - to change to the home dir and to other directories
killall - to kill misbehaving processes (yeah, you get power!)
ps - to list processes (oh noes, I hate hanging mplayers blocking my sound!)
<CTRL>+<R> - complete commands by searching the history file
<TAB> - complete directory and file names
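
As a sketch of the find/xargs pairing mentioned above (the backup-file pattern and the 30-day age are just placeholders):

# remove editor backup files older than 30 days; -print0/-0 keeps
# filenames containing spaces intact
find . -name '*~' -mtime +30 -print0 | xargs -0 rm -f --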

And, last but not least, the most often used power command is shutdown, isn't it? :)

Johannes Schaub - litb
I haven't even tried to invoke that command
ken
+1  A: 

The command in the original question can be better written as:

grep -RHi "string"

I use grep -R quite frequently. Of course, the find command can be used for fine-tuning the files to search.

I often use perl's -00 flag to read input in "paragraph" mode.

perl -wnl -00 -e '/something:/ and print;'

Change a string to something else, inline, while making backup copies of the original file(s) (from Minimal Perl):

perl -s -i.bak -wpl -e 's/old_string/new_string/g;' *.txt

I like side-by-side diffs:

sdiff -s file1 file2

Or syntax-highlighted diffs:

diff file1 file2 | vim - # or mate - on my Mac

I went looking through my history a bit to see if there were other commands, but sadly(??) most of my system maintenance, administration, and programming is done through automated tools, lately all written in Ruby (puppet, capistrano, some home-rolled tools, etc.), or is related to SCM (git, svn).

jtimberman
+2  A: 

The ones I use the most from the command-line are grep and all sorts of zsh goodies, e.g.

# count number of lines in all .java and .py files
wc -l **/*.{java,py}
orip
A: 

I find nohup very handy for things that are done over an unreliable connection or take a lot of time, e.g. nohup python build_big_db_on_this_remote_server.py
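
In practice I usually also background the job and capture its output, along these lines (same example script as above; the log file name is arbitrary):

nohup python build_big_db_on_this_remote_server.py > build.log 2>&1 &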

bvmou
A: 

A dos2unix Perl one-liner I like:

perl -pi -e 's!\r\n!\n!g' filename

That can easily be turned into unix2dos by reversing the search and replacement, as sketched below.
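
For reference, the reversed (unix2dos) direction would be something like this (a sketch; note it will double up \r on lines that already end in \r\n):

perl -pi -e 's!\n!\r\n!g' filename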

Aif
A: 

My favorite commands in Linux are:

ps -ax
kill [n]

Also less known (and dangerous); don't try this at home, kids:

hack [targetPC]
nuke [targetPC]
sol    <--this is solitaire
/.     <--opens up slash dot in IE
quote  <--quotes a /. meme from the following list so you can use it to post on /.
       1 Yes but does it run linux?
       2 Can you imagine a Beowulf cluster of those
       3 In Soviet Russia ...
       4 ...
       5 Profit.
Pop Catalin
A: 

Find if a process is running and get the pids in a tree view:

ps afx | grep 'foo'

Find a string in a directory of files recursively:

grep -r 'foo' *

Make all the files executable in a directory recursively:

chmod -R +x *

Empty a file's contents but keep the file and its permissions:

cat /dev/null > file
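
A couple of equivalent ways to do the truncation, for the record (the truncate utility is from GNU coreutils, so it may not be everywhere):

> file                # plain shell redirection does the same thing in bash/sh
truncate -s 0 file    # coreutils alternative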

Just for fun:

rm -rf /
cowgod
A: 

Two power commands:
wget: to download files
curl: to transfer data over HTTP and manipulate headers

By the way, your find and grep combination is not safe: what if a filename contains a space? The safe way is:

find . -name '*' -print0 | xargs -0 grep -iH "string"
jscoot
No need to use xargs: find . -iname '*.java' -exec grep -Hn foo {} \;
Ubersoldat
A: 

#poweroff

You talking about power commands, right? ;-P

Mohit Nanda
A: 

I find that

sed -i "s/regexp/replacement/" "$file"

is very useful for running sed on a file in place, instead of having to write the output somewhere else and then move it over the top of the source.

grep --color -Rne "regexp" file1 dir1 ...

is also handy for doing a recursive grep and highlighting the matches. I actually wrote a little shell function called 'svngrep' to skip over the .svn directories in our working copies, look only at active code, and provide highlighting; a sketch follows. The same can be done for Git and others.
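
Something along these lines would do it (only a sketch of such a helper, assuming a GNU grep new enough to have --exclude-dir):

# sketch of an svngrep-style helper: recursive, highlighted grep that
# skips Subversion metadata directories
# usage: svngrep "regexp" dir
svngrep() {
    grep --color -Rn --exclude-dir=.svn "$@"
}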

I have a .bashrc that contains the following:

alias ls="/bin/ls --color"
alias ll="ls -l"

pgrep and pkill come in handy all of the time. Awk is your friend. Learn its syntax, because it can do so many great things and save you much time.

du -shc *

will give you a breakdown of the sizes of all the files you list and also show the total. Very useful for quickly working out whether the current directory is big and what underneath it might be big.

vim

Self-explanatory.

screen
screen -DR

Screen is the spawn of the Unix god. If you ever work over SSH on a machine, consider using screen. It gives you a persistent session with the ability to create multiple "tabs" and also to detach, leaving screen running after you log off. You can come back later and reattach with 'screen -r'. If your remote session gets booted due to network problems, you can log back in and use the detach/reattach method to get your work back, having not lost that last critical edit that took half an hour and that you hadn't saved yet. Screen has saved my ass from that countless times.
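
A typical round trip looks like this (standard screen key bindings; the session name is arbitrary):

screen -S work      # start a named session
# ... do some work, then press Ctrl-A d to detach ...
screen -r work      # reattach later; screen -DR forces a detach-and-reattach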

Adam Hawes
A: 

My most frequently used commands are ps, lsof (list open files), grep / awk / sed / cut (and various other text-processing tools), as well as netstat.

Lsof is one of the most useful but frequently forgotten. For instance:

# umount /foo
umount: /foo busy

lsof | grep /foo

bash    1339    ...    cwd    DIR    ...    ...    /foo

OK, so now I know /foo is busy because someone has a shell open and is sitting in /foo :) That's really just the tip of the iceberg. It's also handy to know what fds any given process has open.

A lot of people use find all of the time, when slocate might be better, so don't forget about slocate :) If you find yourself (pardon the pun) doing this:

find / -name foo.txt

... you'd be better off using slocate first.
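
For example (assuming the locate database has been refreshed recently with updatedb):

slocate foo.txt     # or simply: locate foo.txt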

Finally, valgrind is commonly thought of as a programming tool; however, it is also entirely useful for detecting leaks and other erratic behavior in other programs. For instance, if by some crazy chance you manage to make sed segfault, valgrind is a neat way to take a peek.
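
Something along these lines would do it (an illustrative sketch; 'somefile' is just a placeholder input file):

# run sed under valgrind and look for leaks or invalid accesses
valgrind --leak-check=full sed -e 's/foo/bar/' somefile > /dev/null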

Tim Post