How to query custom log data in Log Analytics

This post is a follow-up on how to get SCCM custom data into your Log Analytics environment.

As soon as you have your SCCM custom logs, or any other logs, in Log Analytics, they are indexed under the type you specified.

In this particular case I used SCCMLOG_CL (note that the _CL suffix is mandatory). So let's jump into the Log Analytics query window to find out what's in the logs at this point:

Browse to Log Analytics => Logs


The Log Analytics query window will open, giving you the opportunity to start your query journey:


Remember our custom type: SCCMLOGS_CL. Note the autosuggest feature, which will help you create your own queries.


If you run this query you will get all the results within the type. This is a great way to check whether data is flowing in.
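As a sketch, the base query is simply the type name; the "take" operator below is my addition to cap the output while you verify that data is arriving (it assumes the SCCMLOGS_CL type from the setup post):

```kusto
// Return records ingested under the custom type.
// "take 50" caps the output while you check that data is coming in.
SCCMLOGS_CL
| take 50
```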


Now we'll start looking for more detailed patterns. If you type "where" on the next line, autosuggest will list all the fields in your data:


Let's select RawData where the word "error" appears in the line:
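The query behind that step looks roughly like the sketch below (again assuming the SCCMLOGS_CL type; "contains" does a case-insensitive substring match in Kusto):

```kusto
// Keep only records whose raw log line mentions "error".
SCCMLOGS_CL
| where RawData contains "error"
```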


We already get a lot of results:


Another trick up your sleeve: you don't need to type everything. It's a point-and-click world, so you can extend your query by clicking the + sign next to a field, in this case "Computer".


This will add the field AND its value to your query:
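The resulting query would look something like this sketch; the computer name below is a placeholder for whatever value you clicked:

```kusto
// The + click appends an equality filter on the field's value.
SCCMLOGS_CL
| where RawData contains "error"
| where Computer == "SCCM01.contoso.local"   // placeholder value
```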


So now you can really start building your searches on your custom data.

Next time we’ll go over how you can actually create custom fields you can search on.

How to upload SCCM logs to Log Analytics

One of the great powers and conveniences of having all logs in one place is that they get indexed, so you can query them for different scenarios.

Just recently I was working on a project with SCCM engineers, and they told me a couple of times "it's in this or that logfile"; they would fire up CMTrace, look for the specific entry, and start troubleshooting from there.

"OK," I thought, maybe, just maybe, there's a better solution. Because of my monitoring background I don't like to think reactively, as in "it already happened", but love to think proactively.


That's why I proposed to dump all the logs into Azure Log Analytics to get them indexed and have alerting and reports on them.

It took some convincing to get the SCCM engineers to believe this is possible, but it is actually quite simple to set up using Log Analytics and custom logs.

So first up, the requirements:

  • You need an active Azure subscription
  • You need a Log Analytics workspace
  • You need an SCCM server onboarded to that workspace.

If these are met, the following steps will ensure that the custom logs come in:

Select your workspace in the Log Analytics blade and select "Advanced settings"


Navigate to “Data” => Custom Logs => Add +


This opens the four-step process, which is basically all there is to it.


Step 1: Select a sample log file in the required format. Note that this sample logfile can't exceed 500 KB in size.

For this I've selected a file on my SCCM site server called SMS_CLOUDCONNECTION.


Click Browse => select the file => Upload => click Next


Step 2:

Select the record delimiter:

There are two options:

  • Every line becomes a new record in Log Analytics
  • You specify a date format, so each timestamp marks the start of a new record

Note: if no date format is selected, Log Analytics will fill the "TimeGenerated" field with the date the logfile was uploaded, instead of the time the alert / log entry actually occurred.
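This matters once you start filtering on time. With a date format configured, a time-range query such as the sketch below (type name assumed from the companion post) returns entries based on when they were actually logged, not when they were uploaded:

```kusto
// Entries from the last 24 hours, by their own timestamps.
SCCMLOG_CL
| where TimeGenerated > ago(24h)
| sort by TimeGenerated desc
```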


Step 3 : Adding log collection paths:

This is where Log analytics is going to look for the log files.

A couple of things to keep in mind:

  • The path you fill in here will be checked on ALL machines which are onboarded to the Azure Log Analytics workspace
  • If you want a specific log, fill in the full path and filename
  • If you want all logs with a certain extension you can actually use wildcards as well
  • You can add multiple logs to the same custom type.

For demo purposes I’ve added the path to all logfiles in SCCM as shown below and I’m uploading all *.LOG files.

The advantage of using wildcards is that no logs get missed: if a new logfile is created because the old one hit its size limit, the new file will be picked up as well.


Step 4 :

Add a name for the records. This name is called a type within Log Analytics. The type will hold all the log entries and will be your first stop when you start querying.


Click Done, and at this point the new custom log has been created. The Log Analytics agents will be notified and will start searching for logs in the specified directories.


After a while the logs will be parsed and available to query in Log Analytics.
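One simple way to verify ingestion, assuming you named the type SCCMLOG_CL in step 4, is to count recent records per machine:

```kusto
// How many entries arrived per onboarded machine in the last hour?
SCCMLOG_CL
| where TimeGenerated > ago(1h)
| summarize EntryCount = count() by Computer
```

If a machine you expect is missing here, check that the agent on it is connected to the workspace before digging into the log paths.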

In the next blog post I’ll show how to efficiently search across these types.
