Welcome to part two of Unleashing log file analysis. In the last post I talked about the power of using LogParser to transform your Windows Event Logs. In this post I want to talk about using LogParser to transform your IIS Logs!
So, a quick recap from Part 1 of why I think LogParser is so useful…
- You can use SQL command syntax for filtering the log data on the front end, and you can build commands that generate report-style summaries.
- You can import logs directly into Microsoft SQL Server (the only RDBMS supported is MSSQL, sorry) or export to other flat file formats.
- You can use wizardry in SQL Server, including .NET CLR assemblies, to manipulate the data once it is imported. Think regex, data decoding, data mapping, crazy algorithms, etc…
- It’s a fast command line tool.
- You can still convert to a flat file or send to a syslog server if SQL isn’t in your plan.
You have your standard flat file output formats available as well, like CSV, etc., but what this post addresses is digging through large numbers of logs fast. Use the power of SQL Server!
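To make that concrete, here’s a quick sketch of the kind of query you can run once the rows are in SQL Server. The table name dbo.iislogs and its columns are assumptions for illustration; substitute whatever names your own import created:

```sql
-- Illustration only: assumes logs were imported into a table named dbo.iislogs
-- Top 10 slowest URLs by average response time (time-taken is in milliseconds)
SELECT TOP 10
    [cs-uri-stem],
    COUNT(*)          AS hits,
    AVG([time-taken]) AS avg_ms
FROM [dbo].[iislogs]
GROUP BY [cs-uri-stem]
ORDER BY avg_ms DESC;
```

Once the data is in a real table, anything T-SQL can do is on the menu.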
Importing IIS logs
The syntax can vary when dealing with IIS logs, due to the fact that the log file format can change based on how IIS is configured.
One of the first questions you might ask, if you’re new to this sort of thing, is where are the IIS logs? The default path to the logs is “C:\inetpub\logs\LogFiles\”.
Now, to know what fields are being stored in your log, you should open one of the log files and look at the line labeled “#Fields”. In my case the first log file in my logs directory is “C:\inetpub\logs\LogFiles\W3SVC1\u_ex130826.log”. The relevant line is shown below.
#Software: Microsoft Internet Information Services 8.0
#Version: 1.0
#Date: 2013-08-26 01:26:14
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status sc-substatus sc-win32-status time-taken
Now that we know the fields being stored in the log files, we can adjust our LogParser syntax to match and get to importing logs. One more note on #Fields: this example is from my development workstation running Windows 8.1… A Windows Server will have more fields, and they can vary based on the roles/services running with IIS.
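If you want to sanity-check the #Fields line without opening each log by hand, a tiny script will do it. This is just a sketch, not part of LogParser — the read_w3c_fields helper and the sample lines are my own:

```python
# Pull the column names out of an IIS W3C log's "#Fields:" directive,
# so you know what to put in your LogParser SELECT.

def read_w3c_fields(lines):
    """Return the column names from the first '#Fields:' directive found."""
    for line in lines:
        if line.startswith("#Fields:"):
            return line[len("#Fields:"):].split()
    return []  # no directive found

# Sample header lines (in practice, read them from one of your log files)
sample = [
    "#Software: Microsoft Internet Information Services 8.0",
    "#Version: 1.0",
    "#Fields: date time s-ip cs-method sc-status time-taken",
]
print(read_w3c_fields(sample))
# ['date', 'time', 's-ip', 'cs-method', 'sc-status', 'time-taken']
```

Point it at the first few lines of any file under C:\inetpub\logs\LogFiles\ and you’ll see exactly which columns that site is logging.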
Here’s the LogParser syntax reflecting the #Fields from my log file shown above. It’s important to use [ ] as qualifiers to enclose the fields, or columns as they are known in SQL. This prevents parsing errors caused by spaces, dashes, parentheses, etc…
LogParser.exe -i:W3C "SELECT [LogFilename],[RowNumber],[date],[time],[c-ip],[s-ip],[cs-method],[cs-uri-stem],[cs-uri-query],[sc-status],[time-taken],[cs-username],[cs(User-Agent)],[cs(Referer)],[sc-win32-status] INTO [dbo].[iilogs] FROM W3SVC1\*.log" -o:SQL -createTable:ON -server:sqlservername -username:sqluser -database:iislogsdb -password:databasepassword
What you get here are a few extra columns, [LogFilename] and [RowNumber].
[LogFilename] stores the log file name, pretty self-explanatory. [RowNumber] is the row within [LogFilename], so you get a direct link back to the row in the original log file. Great for debugging and tracing events!
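As a sketch of that trace-back, assuming the import created a table named dbo.iislogs (your table name may differ), you could find every 500 error and exactly which file and row it came from:

```sql
-- Illustration only: assumes a dbo.iislogs table from the import above
SELECT [LogFilename], [RowNumber], [date], [time], [cs-uri-stem], [sc-status]
FROM [dbo].[iislogs]
WHERE [sc-status] = 500
ORDER BY [date], [time];
```

From there you can open the named log file and jump straight to the offending row.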
Getting Fancy
Ok, so maybe we don’t want _all_ the data. Let’s say we just want a unique list of IP addresses, and we don’t care about SQL, we just want to see a list. Or maybe we just want a unique list of usernames.
Here are a few examples:
LogParser.exe -i:W3C "SELECT distinct [s-ip] FROM W3SVC1\*.log"
LogParser.exe -i:W3C "SELECT distinct [cs-username] FROM W3SVC1\*.log"
We can do aggregate group by functions, nom nom.
LogParser.exe -i:W3C "SELECT [cs-username], count([cs-username]) FROM W3SVC1\*.log group by [cs-username]"
The result will be a unique list of usernames with a count of the times each username was found in the log files.
LogParser.exe -i:W3C "SELECT [cs-username], count([cs-username]) FROM W3SVC1\*.log Where [c-ip] not in ('192.168.1.1','192.168.1.180') group by [cs-username]"
Here I’m still doing an aggregate, but I want to exclude requests from certain IP addresses from the username counts.
The output will look like this:
cs-username COUNT(ALL cs-username)
----------- ---------------------
someuser    12
Statistics:
-----------
Elements processed: 12
Elements output: 1
Execution time: 0.00 seconds
Wrap Up
This was a little longer than I expected, but I think you now have a taste for what can be done using LogParser against IIS logs. If you have a lot of log files you need to load somewhere, or just need a quick set of statistics, this is a great way to go! The syntax is flexible, powerful, and familiar, and having aggregate functionality is really nice for quick reviews.
I hope you found this post both informative and useful!