One of the great advantages of having all your logs in one place is that they get indexed, so you can query them for all kinds of scenarios.
Recently I was working on a project together with SCCM engineers, and more than once they told me "it's in this or that log file": they fire up CMTrace, look for the specific entry, and start troubleshooting from there.
"OK," I thought, "maybe, just maybe, there's a better solution." Because of my monitoring background I don't like to think reactively, as in "it already happened"; I love to think proactively.
That's why I proposed dumping all the logs into Azure Log Analytics, so they get indexed and we can build alerting and reports on top of them.
It took some convincing to get the SCCM engineers to believe this is possible, but it's actually quite simple to set up using Log Analytics custom logs.
So, first up, the requirements:
Once these are met, the following steps will get the custom logs flowing in:
· Select your workspace in the Log Analytics blade and open "Advanced settings"
· Navigate to "Data" => Custom Logs => Add +
This opens the four-step wizard, which is basically all there is to it.
Step 1: Select a sample log file in the required format. Note that this sample log file can't exceed 500 KB.
For this I selected a file on my SCCM site server called SMS_CLOUDCONNECTION.
Click Browse => select the file => Upload => click Next.
Step 2: Select the record delimiter:
This is a two-way choice:
Note: if no date format is selected, Log Analytics fills the TimeGenerated field with the time the log file was uploaded instead of the time the alert / log entry actually occurred.
Step 3: Add the log collection paths.
This is where Log Analytics will look for the log files.
A couple of things to keep in mind:
For demo purposes I've added the path to the SCCM log directory and I'm collecting all *.log files.
The advantage of using wildcards is that no logs get missed: if a new log file is created because an existing one rolls over on size, the new file gets picked up as well.
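As an illustration, a wildcard collection path for a default SCCM installation could look like the fragment below. The exact directory depends on where you installed SCCM, so treat this path as an assumption rather than a value from my environment:

```
C:\Program Files\Microsoft Configuration Manager\Logs\*.log
```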
Step 4: Add a name for the records. This name is called a type within Log Analytics. The type will hold all the log entries and will be your first stop when you start querying.
Click Done, and at this point the new custom log has been created. The Log Analytics agents get notified and start searching for logs in that specific directory.
After a while the logs will be parsed and available to query in Log Analytics.
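Once records arrive, you can run a first query against the new type. Log Analytics appends "_CL" to custom log type names, and unparsed entries land in the RawData field. A minimal sketch, assuming you named the type "SCCMLogs" (the type name and search term here are examples, not values from this setup):

```
SCCMLogs_CL
| where TimeGenerated > ago(1h)
| where RawData contains "error"
| project TimeGenerated, RawData
```

This filters the last hour of collected entries down to lines mentioning "error", which is a good starting point for the alerting and reporting mentioned above.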
In the next blog post I’ll show how to efficiently search across these types.