I am a novice in Unix, and I am having trouble viewing big log files with vi. Could you please suggest the best tool for fast viewing of big files on Unix? Please also share your own ways of viewing big files. I appreciate your help :)
`less` doesn't need to keep the whole file in memory, so it is good for viewing giant files. But for log files, the line wrapping is a nuisance.
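For example, the wrapping can be turned off at the command line (a minimal sketch; `/tmp/big.log` is a made-up sample file, and `seq` is just used here to generate it):

```shell
# Generate a stand-in "big" file; substitute your real log file
seq 1 1000000 > /tmp/big.log

# -S chops long lines instead of wrapping them; -N shows line numbers
less -S -N /tmp/big.log
```

Inside `less`, use the arrow keys to scroll sideways along chopped lines, and `q` to quit.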
Use `less`, as others have already suggested, or `most`, which is an extended version of `less` with more options and cool stuff. It is usually included in the repositories of any Linux distro.
It depends on what you are looking for in this big log file.

- If you just want to be impressed by its size, `cat` is enough (you can also roughly locate some unexpected visual patterns).
- If you just want to take a look at it, you can use `more` or `less`.
- If you want to monitor it while it is growing, you may be interested in `tail -f`.
- If you are looking for specific patterns, take a look at `grep`.
- If you want to extract some useful information from your big data, `perl` is your friend.
Actually `vi` (at least `vim`) is very performant on large files. I regularly use it to edit files in the dozens-of-MB range without problems.
You just need to be aware that a few operations will be slow on large files: big visual selections, global searching, and syntax highlighting. For large files, always turn off syntax highlighting (if you have it on by default) with `:syn off`. Then you should be fine.
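The highlighting can also be disabled before the file is even loaded (a hedged sketch; `big.log` is a placeholder name, and the interactive commands are commented out so they only run when you paste them into a terminal):

```shell
# big.log is a placeholder; create a stand-in so the commands have a target
printf 'just a sample line\n' > big.log

# Interactive: open with syntax highlighting already disabled
# vim -c 'syntax off' big.log

# Interactive: skip your vimrc and plugins entirely for maximum speed
# vim -u NONE -N big.log

# Non-interactive sanity check that the flags are accepted (batch Ex mode)
command -v vim >/dev/null && vim -e -s -u NONE -N -c 'syntax off' -c 'q!' big.log
```

`-u NONE` starts vim without any configuration at all, which avoids plugin overhead on top of the syntax cost.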
`less` and `tail` are the most efficient for viewing long files. `less` displays a portion of the file at a time and lets you scroll through it, whereas `tail` lets you view the last n lines.