views: 1183
answers: 5

Hey guys.

I'm dealing with huge GlassFish log files (on Windows, eek!) and, well... WordPad isn't cutting it.

Are there any tools out there that can handle these log files in a more intelligent manner? Functionality that would be welcome:

  • View all lines of a certain log level (INFO, WARNING, SEVERE)
  • Show logs between two timestamps
  • Occurrence counter (e.g. this exception was thrown 99 times between time x and time y)
+1  A: 

Try UltraEdit (paid) or Notepad++ (free).

Martin OConnor
A: 

I use Excel for parsing log files. If your log files are tab-delimited this works great: Excel's filtering and sorting features lend themselves well to log file analysis.
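If your logs aren't tab-delimited to begin with, awk can reshape them first. A minimal sketch (the sample record, field positions, and file names are invented for illustration) that pulls the timestamp, level, and message out of GlassFish's pipe-delimited records into a TSV file Excel can open directly:

```shell
# Create one hypothetical GlassFish-style record to demonstrate with.
printf '%s\n' '[#|2009-05-12T11:00:00.000+0200|SEVERE|gf|core|ouch|#]' > server.log

# Split on '|'; field 2 is the timestamp, 3 the level, 6 the message.
# OFS='\t' joins them with tabs so Excel recognizes the columns.
awk -F'|' -v OFS='\t' '$1 == "[#" {print $2, $3, $6}' server.log > server.tsv
cat server.tsv
```

Open the resulting server.tsv in Excel and use AutoFilter on the level column.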

Nick
A: 

Try the MS LogParser tool: http://www.microsoft.com/downloads/details.aspx?FamilyID=890cd06b-abf8-4c25-91b2-f8d975cf8c07&displaylang=en

It basically turns your flat log file into a "database" you can run SQL-like queries against. You can even output grids, charts, and graphs.
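As a sketch, a query like the following should count how often one exception appears (the file name and exception are placeholders, not from the answer; with the TEXTLINE input format each log line is exposed as a single Text field):

```
LogParser -i:TEXTLINE "SELECT COUNT(*) AS Occurrences FROM server.log WHERE Text LIKE '%EJBException%'"
```

Add a WHERE clause on a timestamp substring to restrict the count to a time window.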

Patrick Cuff
Cool! Thanks. :)
Ace
A: 

On Windows I'd still go with Perl or awk. Download and install Cygwin, then use awk or whatever you're familiar with. awk has the time functions needed for timestamp filtering, and features such as getline for navigating the log file.

Ex: exception occurrence count, all time

$ awk '/^java.*:\W/ {print $1}' server.log* | sort | uniq -c | sort -nr
 60 javax.ejb.EJBException:
 45 java.rmi.ServerException:
  2 javax.persistence.PersistenceException:
  2 javax.ejb.ObjectNotFoundException:
  1 java.lang.Error:
fredarin
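The "between two timestamps" wish from the question also falls out of awk's string comparison: GlassFish's default record format starts each entry with [#|<ISO-8601 timestamp>|<LEVEL>|, and ISO-8601 timestamps order correctly as plain strings, so no date parsing is needed. A minimal sketch (the sample records and the time window are invented):

```shell
# Three hypothetical GlassFish-style records to demonstrate with.
printf '%s\n' \
  '[#|2009-05-12T09:00:00.000+0200|INFO|gf|core|starting up|#]' \
  '[#|2009-05-12T11:00:00.000+0200|SEVERE|gf|core|ouch|#]' \
  '[#|2009-05-12T13:00:00.000+0200|INFO|gf|core|done|#]' > server.log

# Keep only records whose timestamp (field 2) falls inside the window.
# ISO-8601 timestamps compare correctly as plain strings.
awk -F'|' -v from=2009-05-12T10:00:00 -v to=2009-05-12T12:00:00 \
    '$1 == "[#" && $2 >= from && $2 <= to' server.log
```

Combine the condition with a level match (e.g. && $3 == "SEVERE") to get the per-level filtering from the question's wish list in the same pass.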