Today I noticed we were getting an increasing amount of spam on one of our form pages. I was curious to see if all of the user IP addresses were the same (in which case I’d just add them to the IIS7 IP Restrictions list). To quickly and easily figure this out I decided to use LogParser. Besides just querying for the page though, I wanted to add an additional condition to exclude rows that came from a certain internal IP address that we use for monitoring.
Here’s a generic version of the query I used:
LogParser.exe -q:on "SELECT * FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/' AND c-ip<>'10.10.1.100'" >c:\temp\PageVisitors.txt
I wanted to see the full logged data for the request, but if I didn’t, I could have very easily just pulled the IP addresses using:
LogParser.exe -q:on "SELECT c-ip FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/' AND c-ip<>'10.10.1.100'" >c:\temp\PageVisitors.txt
You can see that I’m redirecting the results to a text file (the “>c:\temp\PageVisitors.txt” part, which goes outside the quoted query) so that I can easily work with them. You may also want to take note that I’m using the “-q:on” flag, which runs the command in quiet mode. If you don’t set this flag, LogParser shows results one page at a time. When output is redirected to a text file rather than the command prompt window, you obviously can’t hit a key for “next page,” so without this flag the query will hang forever if there is more than one page’s worth of results.
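Since the original question was whether all the spam came from one address, a GROUP BY query would answer it directly. Here’s a sketch using the same hypothetical log path and addresses as above:

```shell
REM Count hits per client IP for the page, most frequent first.
REM Same example log path and addresses as above; adjust for your server.
LogParser.exe -q:on "SELECT c-ip, COUNT(*) AS Hits FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/' AND c-ip<>'10.10.1.100' GROUP BY c-ip ORDER BY Hits DESC" >c:\temp\PageVisitorCounts.txt
```

If one IP dominates the counts, it’s a quick candidate for the IIS7 IP Restrictions list.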
Happy hosting!
Nice post. While LogParser can be a great tool, I’d prefer the ported GnuWin32 tools, like grep.exe (cut.exe, etc), for such easy tasks. My fingers type grep/cut commands faster than SQL :razz:
On the other hand, LogParser has the advantage of selecting multiple pieces of information at once, where you’d need more than one GnuWin32 tool to do the same job.
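For the simple IP-extraction case, the grep/cut approach the commenter mentions might look like this sketch. The field position of c-ip is an assumption — it depends on the #Fields: directive at the top of your IIS log — and the sample log line below is made up for illustration:

```shell
# Fabricated W3C log line; real field order depends on your IIS
# logging configuration (check the #Fields: header in the log file).
echo "2013-04-11 10:15:00 10.10.1.5 GET /SomePage/ - 80 - 203.0.113.7 Mozilla/5.0 200 0 0 120" > sample.log

# Pull the client IP (the 9th space-delimited field here) for hits on the page:
grep "/SomePage/" sample.log | cut -d" " -f9
# -> 203.0.113.7
```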
PS: SANS and Symantec have great articles about using LogParser for gathering forensic information. Google for it, I’m sure you’ll like it.
I haven’t used the GnuWin32 tools but I’ll check them out. Thanks.
FYI: I created a small introduction on my blog/site.
One of the problems I’m facing with LogParser is that it can’t be used recursively when the log files sit in different folders matching a pattern like DIR*\File.log; with DIR\*.log, recursive usage is possible.
Jan – how about starting the query one folder level higher (folder that contains both of the other folders)? Then use the -Recurse flag but set the match to only catch files you want? It might take longer if it has to scan lots of other folders, but still should only catch the files you filter for.
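That suggestion might look something like the sketch below. The -recurse flag belongs to the IISW3C input format (-recurse:-1 means unlimited depth), and the folder layout and file name here are hypothetical:

```shell
REM Start from the parent folder and let LogParser descend into every
REM subfolder looking for files that match the pattern (paths hypothetical).
LogParser.exe -q:on -i:iisw3c -recurse:-1 "SELECT c-ip FROM x:\wwwlogs\File.log WHERE cs-uri-stem='/SomePage/'"
```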
An update: I was browsing through my old archives and found the following, which kinda solves my *\file.log problem. Just use 'for' with 'dir' to list all logfiles, for example:
for /f %i in ('dir /s/b u_ex1112.log') do @LogParser -i:w3c "SELECT COUNT(cs-method) AS nmb_get FROM %i WHERE date = '2011-12-05' AND time = '18:30' AND cs-method = 'GET'"
How could I forget… :)
Re quiet mode, you should try specifying -o:csv instead. If you want multiple queries to append their output to the same file, you need -filemode:0 too.
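Put together, that suggestion might look like this sketch (the file names are hypothetical): with -o:csv the INTO clause writes the output file, and -filemode:0 appends to it instead of overwriting, so several queries can accumulate into one CSV.

```shell
REM Two queries appending to the same CSV file (-filemode:0 = append).
LogParser.exe -q:on -o:csv -filemode:0 "SELECT c-ip INTO c:\temp\PageVisitors.csv FROM x:\wwwlogs\W3SVC1\u_ex130410.log WHERE cs-uri-stem='/SomePage/'"
LogParser.exe -q:on -o:csv -filemode:0 "SELECT c-ip INTO c:\temp\PageVisitors.csv FROM x:\wwwlogs\W3SVC1\u_ex130411.log WHERE cs-uri-stem='/SomePage/'"
```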
Log Parser Lizard is a GUI for MS LogParser. Very useful for quickly building and organizing queries (supports IntelliSense, charts, export to Excel, row filtering, etc.). Thought you should know about it.