As enterprises invest greater resources in their digital presence and adapt their IT infrastructure to the demands of an evolving economy, it’s critical that data experts have the tools they need—and a firm command of those tools—to sift through important company information. By doing so, IT professionals stand a better chance of assessing threats and analyzing errors before they disrupt workflows throughout the business.
More often than not, that process will begin with staff looking through data logs to understand what might be causing issues. If a specific application isn’t working, for instance, sifting through logs will help pinpoint the nature of the problem so your team can begin crafting an appropriate solution. The same goes for investigating cybersecurity threats and server failures throughout your operation.
There are many platforms that can help you and your team perform this process, and log analysis can be done no matter the size of your team or the scope of your budget. Whether you’re investing in premium tools that can provide key insights into your digital environment or you’re looking at free options that offer a quick look at your data, it’s important you find a way to monitor your logs that suits your specific needs.
The bottom line is this: if you’re looking to improve the health of a company’s IT infrastructure, you need to pay attention to its logs in a way that offers flexible control over the process. Log parsers play a pivotal role in successful, digitally minded organizations—and there are some essential commands that IT troubleshooters can’t do without.
Why is log parsing done?
Data environments store information in unstructured ways, and these digital rolls of information are known as logs. While logs contain valuable information that’s essential for IT staff looking to get to the bottom of a range of problems, that information is too unwieldy to sift through in its raw, unstructured format. In order to glean actual insights from logs, it’s necessary to find a way to structure that information so qualified staff can then analyze it and better understand what’s at the root of particular problems.
By using certain commands and queries to structure and organize this information, IT staff can parse through logs assembled in ways most relevant to their particular needs. For example, if you and your team needed to find the slowest URLs on your site, entering the right log parsing command would quickly structure the available information around that criterion. Staff can then analyze the logs that have been processed and reformatted by that command to pinpoint slow URLs and understand what might be causing the delays.
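The slow-URL example above can be sketched outside any particular tool. Here is a minimal, hypothetical Python equivalent that assumes W3C-style log lines where the fifth field is the requested URL (cs-uri-stem) and the last field is the response time in milliseconds (time-taken); field positions vary by log format, so the indexes would need to match your own logs:

```python
from collections import defaultdict

def slowest_urls(lines, top_n=3):
    # url -> [total_ms, hits]; assumes whitespace-separated W3C-style fields
    totals = defaultdict(lambda: [0, 0])
    for line in lines:
        if line.startswith("#"):  # skip W3C header directives
            continue
        fields = line.split()
        url, time_taken = fields[4], int(fields[-1])
        totals[url][0] += time_taken
        totals[url][1] += 1
    # rank URLs by average response time, slowest first
    averages = {url: total / hits for url, (total, hits) in totals.items()}
    return sorted(averages, key=averages.get, reverse=True)[:top_n]

# Made-up sample lines for illustration only
log = [
    "#Fields: date time c-ip cs-method cs-uri-stem sc-status time-taken",
    "2023-01-01 10:00:00 10.0.0.1 GET /search.aspx 200 900",
    "2023-01-01 10:00:01 10.0.0.2 GET /index.html 200 15",
    "2023-01-01 10:00:02 10.0.0.1 GET /search.aspx 200 1100",
]
print(slowest_urls(log, top_n=2))  # ['/search.aspx', '/index.html']
```

Dedicated parsing tools perform this same structure-then-aggregate step, but handle many formats and far larger volumes.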
In this way, log parsing helps IT professionals support both small businesses and larger enterprises. For teams working with limited resources and personnel, log parsing makes more effective use of proprietary data to improve outcomes. At the enterprise level, it tames the daunting task of working through expansive volumes of unstructured information.
What is log parsing?
Despite what the term may imply, log parsing doesn’t apply only to logs. It can be thought of as an umbrella term covering XML files, CSV files, the Event Log, the Registry, Active Directory, and the file system—just to name a few. If a type of unstructured data exists in your digital environment, there’s a good chance that log parsing commands can help you organize and understand it.
That said, the practice of log parsing may vary depending on the exact product or platform you and your team are using to execute it. For instance, the free Microsoft Log Parser tool offers users basic functionality at no charge, but will look different from the commercial variants that service providers would recommend for most enterprises with broader IT needs. While Log Parser allows users to search text-based logs, tools like SolarWinds® Loggly include additional—and in some cases essential—monitoring, reporting, and visualization features.
You’ll want to choose a log parsing tool that allows you to run a search for a specific file format, based on the kind of data you plan to examine. An effective log parsing tool should allow you to aggregate data, apply your search queries and unique commands against that specific type of information, and provide deep insights into performance, status, and issues throughout your IT infrastructure.
Depending on your needs, that may include using Log Parser commands to analyze Windows logs or other products to look into SQL log files. Formatting will differ in each instance, and depends on what you’re looking for, as well as the specific nature of the problems you’re investigating.
How do I view Windows logs?
If your team is looking for an affordable way to begin parsing Windows log files, Log Parser can be a great option—although it’s not the only one. The tool provides access to log files and data sources throughout a Windows operating environment, including the Event Log, the Registry, Active Directory, and more.
With Log Parser, analyzing information can be as easy as entering an SQL-formatted query into the command line interface. By doing so, you’ll be telling the tool what information you need, how to format that information, and where you need that information to be pulled from.
Here are a few Log Parser examples that may help get you started:
- All pages hit by a specific IP address:
logparser "select cs-uri-stem, count(cs-uri-stem) as requestcount from [LogFileName] where c-ip = '000.00.00.000' group by cs-uri-stem order by count(cs-uri-stem) desc"
- Hits on a specific page by a specific IP address:
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] where cs-uri-stem like '/search.aspx%' group by c-ip order by count(c-ip) desc"
- Hits per hour generated by a specific IP address:
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip = '000.000.00.000' group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600))"
- Pages being hit and the specific IP addresses doing it:
logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%' group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"
- IP addresses driving traffic:
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] group by c-ip order by count(c-ip) desc"
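To make the last query above concrete, here is an illustrative Python sketch of the same aggregation—count requests per client IP and rank them—assuming whitespace-separated W3C-style lines with the client IP in the third field (c-ip). The sample lines are made up for illustration:

```python
from collections import Counter

def hits_per_ip(lines):
    # Tally requests per client IP, ranked from busiest to quietest
    ips = Counter()
    for line in lines:
        if line.startswith("#"):  # skip W3C header directives
            continue
        ips[line.split()[2]] += 1
    return ips.most_common()

log = [
    "2023-01-01 10:00:00 10.0.0.1 GET /a.aspx 200",
    "2023-01-01 10:00:01 10.0.0.2 GET /b.aspx 200",
    "2023-01-01 10:00:02 10.0.0.1 GET /a.aspx 200",
]
print(hits_per_ip(log))  # [('10.0.0.1', 2), ('10.0.0.2', 1)]
```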
How do I view SQL log files?
In a similar way, viewing SQL log files just depends on writing the right queries in the right format. While this may be possible with Microsoft’s Log Parser, a number of commercial products exist that can support teams of varying sizes as they work to monitor logs, analyze issues, and craft necessary solutions.
These SQL commands should give you an idea of how to view SQL log files:
- Top 25 URLs:
SELECT TOP 25
cs-uri-stem AS Url,
COUNT(*) AS Hits
FROM [LogFileName]
GROUP BY cs-uri-stem
ORDER BY Hits DESC
- Number of requests made by a specific user:
SELECT TOP 25
cs-username AS User,
COUNT(*) AS Hits
FROM [LogFileName]
WHERE cs-username IS NOT NULL
GROUP BY User
ORDER BY Hits DESC
- Top 25 types of files:
SELECT TOP 25
EXTRACT_EXTENSION(cs-uri-stem) AS Extension,
COUNT(*) AS Hits
FROM [LogFileName]
GROUP BY Extension
ORDER BY Hits DESC
As you can see, there are many different ways to leverage log parsing in order to examine and organize data. Ultimately, the commands you use will depend on what you hope to find. As you assess the company’s needs, log parsing will likely be an invaluable tool for both troubleshooting issues and discovering actionable insights.