views: 28 · answers: 2

I have 20 GB of (uncompressed) log files. They're stored compressed, though (as one 70 KB .gz file per log file), and total about 700 MB. I need to browse and search them to diagnose some issues. I don't know exactly what I'm searching for, and I'll probably need to skim through a couple thousand hits for any search I try.

I tried doing this by uncompressing all of the files and then searching through them with Notepad++ or Visual Studio. The problem is that the searches are horrendously slow and put a lot of load on the hard disk. I assume that for every search I do, the editor has to read the whole 20 GB from disk.

What might work better is a text editor (or a Notepad++ plugin?) that can search inside .gz files without uncompressing them to disk. The 700 MB of .gz files fit easily in the system cache, and I assume that decompressing each file in memory would be much faster than reading the uncompressed data from disk.

I guess my alternative is to work with only a couple of gigabytes at a time so that all of it has a chance of being cached, but that would be pretty inconvenient. Thanks for any suggestions.

+2  A: 

zless from Cygwin should be able to do this.

liori
[zgrep](http://www.cyberciti.biz/faq/unix-linux-grepping-compressed-files/) should be able to do this as well.
0xA3
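
A typical invocation looks something like this (the pattern and the *.gz glob below are only placeholders for your own search term and log directory). zgrep passes options such as -i and -n straight through to grep, and piping the output into less makes it practical to skim a few thousand hits:

# "connection timed out" is a placeholder pattern; adjust the glob to your logs
zgrep -i -n "connection timed out" *.gz | less

With more than one file on the command line, each match should come back prefixed with the file it was found in.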
A: 

A combination of gunzip and grep can do this:

gunzip -c *.gz | grep "Search for something"

Native Windows versions are available here:

http://unxutils.sourceforge.net/

Sam
It seems like zgrep does the same thing. Is there an advantage of this over zgrep?
jthg
The native Windows toolset doesn't include zgrep. The other apps you mentioned suggest you're on Windows.
Sam
It looks like Cygwin does have zgrep.
jthg
@jthg, personally I prefer native tools over abstraction layers, but either solution would work fine.
Sam
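
One practical difference between the two approaches: gunzip -c *.gz | grep prints matching lines without saying which log file they came from, whereas zgrep, given several files, prefixes each hit with the file name. If that matters, a small loop along the following lines keeps the file name while using only gunzip, grep and sed. This is just a sketch for Cygwin's bash (or any Bourne-style shell); "Search for something" is the same placeholder pattern as in the answer above:

# Tag each match with the log file it came from
for f in *.gz; do
    gunzip -c "$f" | grep -n "Search for something" | sed "s|^|$f:|"
done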