Troubleshooting, analyzing, and filtering log files is by far one of my most laborious daily jobs. My problem is searching through a log file that can easily exceed 4 GB in size: simply loading the file takes up to 15 minutes, even though I'm running a fairly fast processor with 8 GB of memory. Once the file finally loads, all I really have is grep and/or Ctrl+F to scan through it. This gets worse when I'm looking at files from multiple systems, each weighing in at over a gigabyte. I've tried segregating the files based on timestamps to make them smaller, but no joy really.
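For reference, the timestamp split I tried was roughly the following (the filename `app.log` and the ISO-style timestamp at the start of each line are just illustrations of my format):

```sh
# Stream the log once and write each line into an hourly chunk, keyed on
# the first 13 characters of the leading timestamp
# (e.g. "2024-05-01T09:12:44" -> 2024-05-01T09.log).
# awk only ever holds one line in memory, so the 4 GB size isn't an issue here.
awk '{ print > (substr($1, 1, 13) ".log") }' app.log
```

The split itself runs fine; the trouble is that I then end up grepping across dozens of chunks, which isn't much faster overall.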
Is there a tool, or even a process, that I could use to make troubleshooting less time-consuming (apart from the usual "just fix the bug first")?
Your comments are appreciated.