Attending MVP summit 2019

For those who don’t know the MVP Summit: it is the annual gathering of all MVPs at the Microsoft campus in Redmond.

For those who DO know the MVP Summit, they also know that we are under a strict NDA not to tell anyone outside of the MVP program where Microsoft is heading with the different aspects of its products.

What I can share, though, is how awesome it is to connect with so many people from around the globe and meet them here at Microsoft to discuss different topics regarding the technologies we work with every day.


How to query custom logs data in Log analytics

This post is a follow-up on how to get SCCM custom data into your Log Analytics environment.

As soon as you have your SCCM custom logs, or any other logs, in Log Analytics, they get indexed under the type you specified.

In this particular case I used SCCMLOGS_CL (note that the _CL suffix is mandatory). So let’s jump into the Log Analytics query window to find out what’s in the logs at this time:

Browse to Log Analytics => Logs


The log analytics query window will open and will give you the opportunity to start your query journey:


Remember our custom type: SCCMLOGS_CL. Note the autosuggest feature, which will help you create your own queries.


If you run this query you will get all the results within the type. This is a great way to check whether data is flowing in.
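As a minimal sketch (using the custom type name from this post), that first query is just the type name, optionally capped while you explore:

```kusto
// Everything indexed under our custom type
SCCMLOGS_CL
| take 50   // limit the output while checking that data is coming in
```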


So now we’ll start finding more detailed patterns. If you type "where" on the next line, you’ll get a list of all the fields in your data:


Let’s select RawData where the word “error” is in the line:


We already get a lot of results:


Another trick up your sleeve: you don’t need to type everything. It’s a point-and-click world, so you can combine your query by clicking. Just click the + sign next to a field, in this case “Computer”.


This will add the field AND the content of the field to your query:


So now you can really start building your searches on your custom data.
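Putting the clicks together, the kind of query this builds up to looks roughly like this (a sketch; the computer name is hypothetical, and your custom fields may differ):

```kusto
SCCMLOGS_CL
| where RawData contains "error"          // only lines mentioning "error"
| where Computer == "SCCM01.contoso.com"  // value added by the + click (hypothetical name)
```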

Next time we’ll go over how you can actually create custom fields you can search on.

How to upload SCCM logs in Log Analytics

One of the great powers and conveniences of having all logs in one place is that they get indexed and you can query them for different scenarios.

Just recently I was working on a project together with SCCM engineers, and they told me a couple of times “it’s in this or that logfile”; they would fire up CMTrace, start looking for the specific entry, and troubleshoot from there.

“OK,” I thought, maybe, just maybe, there’s a better solution. Because of my monitoring background I don’t like to think reactively, as in “it already happened”, but love to think proactively.


That’s why I proposed to dump all the logs into Azure Log Analytics to get them indexed and have alerting / reports on them.

It took some convincing to get the SCCM engineers to believe this is possible but it is actually quite simple to set it up using log analytics and custom logs.

So first up, the requirements:

  • You need an active Azure subscription
  • You need a Log Analytics workspace
  • You need an SCCM server onboarded to that workspace.

If these are met the following steps will ensure that the custom logs are coming in:

  • Select your workspace in the Log Analytics blade and select “Advanced settings”


Navigate to “Data” => Custom Logs => Add +


This opens the 4-step process, which is basically all there is to it.


Step 1: Select a sample log file with the required format. Note that this sample logfile can’t exceed a size of 500 KB.

For this I selected a file on my SCCM site server called SMS_CLOUDCONNECTION.


Click browse => select the file => upload => click next


Step 2:

Select the record delimiter:

This is a 2-way choice:

  • Either you choose that every line is a new record in Log Analytics
  • Or you specify a date format

Note: if no date format is selected, Log Analytics will fill the “date generated” field with the date the logfile was uploaded instead of the date the alert / log entry occurred.


Step 3 : Adding log collection paths:

This is where Log analytics is going to look for the log files.

A couple of things to keep in mind:

  • The path you fill in here will be checked on ALL machines onboarded to the Azure Log Analytics workspace
  • If you want a specific log, fill in its full name
  • If you want all logs with a certain extension, you can use wildcards as well
  • You can add multiple logs to the same custom type.

For demo purposes I’ve added the path to all logfiles in SCCM as shown below, and I’m uploading all *.LOG files.

The advantage of using wildcards is that no logs get missed. If a new logfile is created due to size rollover, it will be picked up as well.


Step 4 :

Add a name for all the records. This name is called a type within Log Analytics. This type will hold all the log entries and will be your first stop when you start querying.


Click Done, and at this point the new custom log has been created. The Log Analytics agents will be notified and will search for logs in that specific directory.


After a while the logs will be parsed and available in Log Analytics to query.
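A quick sanity check once ingestion starts (a sketch; substitute the type name you chose in step 4) is to count records per machine:

```kusto
SCCMLOGS_CL
| summarize Records = count() by Computer
| order by Records desc
```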

In the next blog post I’ll show how to efficiently search across these types.

Speaking at MMS 2018

Speaking at MMS


I’m pleased to announce that again I’m able to speak at one of my favorite events => MMSMOA.

MMS really gives you the possibility to connect with the speakers by making them accessible to the audience.

This gives the audience the opportunity to ask that one burning question that has been bugging them for a while, and it gives the speakers the opportunity to connect with attendees outside the session as well, to see what really matters in the session and to give feedback.

At this event I again get to speak alongside many of my close friends:


Together with Robert Hedblom I’m speaking about whether you need SCOM and/or Log Analytics, and we’ll discuss how to choose the perfect fit for your scenario.


Together with Florent Appointaire I’ll speak about one of the most mysterious 4-letter acronyms out there at the moment: GDPR. How to become GDPR compliant and avoid the massive fines that can be enforced.


Another awesome session with Bob Cornelissen. This session will be all about how you can use the new tools that Microsoft is developing to augment your security insights and how you can monitor and report on the common security issues.


Together with Cameron Fuller, we’ll showcase what you can do with Log Analytics and data from all your connected devices: an out-of-the-box session on how to get data in and gain even more insights into your world with Log Analytics.


Another cool session with Robert Hedblom, discussing how to tackle one of the most underestimated but very important aspects of staying safe: backup and restore. Make sure to attend this session to see how to start and test your DR and backup policies.

Looking forward to the event and hopefully see you there!

How to get OMS alerts in SCOM


During recent events and customer contacts I got a lot of questions regarding integrating SCOM with OMS. It also popped up several times during my recent webinar with Savision. This question actually makes sense, because SCOM already has a lot of investment in it and is often the start of your ITIL process… But how do you actually get alerts from OMS into SCOM? Well, by using OMS and Azure Automation of course!


Step 1: Define what you want to forward to SCOM by defining a scenario and a search query

The scenario is key in this stage of the process. You need to define what you are looking for. Alerting in OMS is quite different from SCOM, for example: in OMS you ask yourself “How many times did X happen in Y time?” instead of the “if this, then that” kind of monitoring in SCOM.

This is very important for finding the right search query. Here I’m going to demonstrate the following scenario:

I want an alert in SCOM when there are 5 failed password attempts in the last hour on the administrator account.

It’s possible to solve this with SCOM alone, but hey, we are going to use OMS + Azure Automation, right?

Step 2: Get all the building blocks linked together

The following high-level steps need to be in place for this to work. Links are provided for preparation purposes:


Step 3: Create the Azure Automation runbook

Open the Azure portal (portal.azure.com) and select the subscription where your workspace is configured.

Select the Automation Accounts logo:


Make sure you select the correct Automation Account


Now you get an overview of all the runbooks which are configured in your automation account. Select Runbooks in the middle bar:


In the next screen choose: “+ Add a runbook”


Choose “Create a new runbook”


Give the new runbook a name and choose PowerShell as the runbook type:


Copy the following PowerShell code into the right-hand window:

param (
    [object]$WebhookData
)

## Check whether the event log source exists; create it if not ##
$logsourceexist = [System.Diagnostics.EventLog]::SourceExists("OMS")
if ($logsourceexist -eq $false) {
    New-EventLog -LogName Application -Source "OMS"
}

## Get the content of the webhook
$RequestBody = ConvertFrom-Json -InputObject $WebhookData.RequestBody

## Dump the payload to disk so you can see what's in it ##
$RequestBody | Export-Clixml -Path C:\Temp\Invoke-OMSAlertDiskCleanup_RequestBody.xml

## You can get all the values! ##
$user = $RequestBody.SearchResults.value.Account[0]
$computer = $RequestBody.SearchResults.value.Computer[0]

## Rough result count: count the "Account" occurrences in the dumped payload
$counter = -split (Get-Content C:\Temp\Invoke-OMSAlertDiskCleanup_RequestBody.xml | Out-String) |
    Where-Object { $_ -eq "Account" } |
    Measure-Object | Select-Object -ExpandProperty Count

## Write the event that SCOM will pick up
Write-EventLog -LogName Application -Source "OMS" -EntryType Error -EventId 1 `
    -Message "User: $user has too many failed logon attempts on $computer. This happened $counter times."


Click the Save button, then the Publish button, and click Yes to publish the runbook to your Azure Automation account.



Your runbook is now ready to be triggered by our alert in step 4.

Step 4: Develop the search query in OMS and create the OMS alert

OK, I’m cutting some steps short here. I assume you already have your machine connected to OMS and are sending up your security logs. If not, follow these guidelines to get you going:

So let’s solve this… First of all, most search queries do not have to be constructed from the ground up. They can be found in the solutions and tweaked a bit. For example, this scenario can easily be extracted from the Security and Audit solution (if you have configured it, of course):

Open the Security and Audit solution by clicking on its tile:


In the left part of the screen you have “Identity and Access”. Click on it to open it.


In the middle of the screen you get the number of failed logons, and eureka! Vlab\administrator is in there… Well, for demo reasons I had my 5-year-old try to log in…

So click on the desired account.


The search query window opens and there you have your search query all ready to go…


Type=SecurityEvent AccountType=user AND EventID=4625 Account='VLAB\Administrator'
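For reference, the same filter in the newer Kusto query language would look roughly like this (a sketch, not taken from the portal):

```kusto
SecurityEvent
| where AccountType == "User" and EventID == 4625   // 4625 = failed logon
| where Account == @"VLAB\Administrator"
```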

Now click the Alert button in the top-left to instantly create an OMS alert, which will be our trigger for getting the alert into SCOM:


The Create alert window pops open and basically has 3 areas:

  • General: this is where you define the criteria for the alert to be fired
  • Schedule: this is where you define the frequency of checking, plus the number of times it has to occur within that timeframe
  • Actions: this is where you define how you would like to be notified

First things first: The General part:


  • Fill in a name for the alert
  • Choose the severity
  • The search query is already filled in, copied from the search query window earlier on
  • Time window: this can be no lower than 5 minutes. For demo purposes we set it at 15 minutes

Note: you can already see we have 6 results for the given timeframe, so our alert is going to fire.

Second, the Schedule part:


  • Alert frequency is how often the search query runs. We choose every 5 minutes here.
  • Generate alert based on: here we define how many results the search query needs to return before we want to be notified. In this scenario there’s no point in alerting when someone mistypes the password just once; that is hardly an attempted hack.

Third, the Actions pane:


  • Email notification: well, self-explanatory
  • Webhook: if you have another application that takes a webhook URL, you can activate it here. In fact, calling a runbook also happens over a webhook, but more on that later.
  • Runbook: here you can select an Azure Automation runbook that is linked to your workspace. (Note: I selected a runbook I made earlier; select the name you gave your runbook in step 3.)
    • Click Yes


    • Select the runbook (note that you cannot change the automation account; the one displayed is linked to your workspace)


Run on: choose Hybrid Worker.

      • Note: a small bug is still live in the console. If you close this view after configuring the actions and check the config of the alert, it will always highlight Azure even though you selected Hybrid Worker => no panic!


So now we have the alert, which kicks off our runbook on our hybrid worker on-premises.

At this stage we have:

  1. An alert condition is detected in OMS
  2. An alert is raised in OMS. This can be checked by clicking the red dot on the bell in the top toolbar of your OMS workspace


3. A runbook is triggered which:

    1. Extracts the data from the OMS alert webhook
    2. Creates a log file on the Azure hybrid worker
    3. Logs the data in the event log of the hybrid worker.

Step 5: Get the alert in SCOM

So now, when we check the event log of the Azure hybrid worker on-premises, we should find the following event every time the OMS automation runbook is triggered by the OMS alert:
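If you prefer a console over Event Viewer, a minimal PowerShell sketch (assuming the "OMS" source created by the runbook above) lists those events:

```powershell
# Show the most recent events written by the runbook's "OMS" source
Get-EventLog -LogName Application -Source "OMS" -Newest 5 |
    Select-Object TimeGenerated, EntryType, EventID, Message
```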



Now it’s quite straightforward to get the alert into SCOM by using a standard monitor (self-resetting after a while).


Note: I used custom targeting to the Hybrid Runbook Worker to make sure the monitor does not run on all machines.

and eureka:


The MP I used for reference:



The alerts show up in SCOM: triggered by our search query, transferred through OMS alerting, processed by an OMS automation runbook on our Azure hybrid runbook worker, and picked up there by our management pack…

SCOM 2016: Import management packs install button grayed out


During one of my installations of SCOM 2016 I came across “a first” for me which I would love to share.

Apparently the default behavior when importing management packs that are already in the management group has changed in SCOM 2016. In SCOM 2012 R2 it was possible to simply import a management pack over the existing one, which makes sense, as these are sealed management packs: as long as the version is equal or higher, there’s no issue.

In SCOM 2016, however, this behavior changed somewhat, causing the Install button to stay greyed out when you import MPs from disk, leaving you unable to continue.

In my case I was importing the SQL management packs. Apart from an error that the catalog was not up to date, I came across the situation below:


After trying to delete the already installed MPs, the other MPs started complaining that they were missing their dependent MPs, as shown below:


The solution I found was to remove ONLY the “Microsoft SQL Server Visualization Library”. After that, the Install button magically became active and I could continue the install.


So, in general, if you can’t continue with the install, try removing the MPs already in place. Start with the general ones and work your way down.

OMS: Getting the most out of OMS security features


Yesterday I hosted a workshop at Microsoft Belux about the OMS security and compliance features built into the OMS suite. It’s always nice to walk people through the different things that are included and give tips and tricks based on their questions.

As a lot of questions keep recurring, I decided to bundle them in an overview blog post on how you can effectively tune your environment. This is not a “how to” for setting up OMS, just a summary of small tips and tricks.


If you need a full “how to” for setting up OMS security, check here:

1. Add your IIS logs to the mix

A significant portion of the insights into how you are doing security-wise comes from your IIS logs. Assuming you have an OMS agent installed and added to your workspace, it is invaluable to send these logs to your workspace as well, so they can be indexed and feed the different solutions that benefit from this knowledge.

  • Install an agent on the web server and connect it to your workspace. (I’m assuming you know how to do this.)
  • Open your workspace and open Settings by clicking the gear icon at the top of your workspace.
  • Go to Data => IIS Logs => tick the box “Collect W3C IIS Log files”. From this moment on your IIS logs will be gathered, uploaded to OMS, and indexed. They will automatically be used to feed the security solution, amongst other solutions.

To show you how reliant the Security solution is on IIS log data, I’ve included the stats from my workspace.

Go to Security and Audit:

Scroll to the right to Threat Intelligence and click on the “Detected threat types” dial:

In the left corner you can see that almost 50% of the data is based on the IIS logs. So make sure to add them.

2. Limit the amount of security events uploaded to your workspace

Another handy tip is limiting the amount of data sent to your workspace to keep your usage under control. It used to be all or nothing, but recently a filter was added to control which events are uploaded.

To select this filter, go to your Security and Audit solution.

Click the gear icon in the top-left corner and use one of the predefined filters.

For more info on the filters, click the “For additional details” link.

To summarize the different filters, check the different scenarios. I’ve added the list of events included in each scenario for your reference:


3. Check your usage (especially in a POC scenario)

Adding the security logs can have a significant impact on the data uploaded to your workspace, and can cause overage payments or a bad POC due to suspension of your workspace after breaching the maximum amount of data uploaded per day.

To check the usage of the security events, follow this procedure:

Go to the main screen of your workspace and select Usage:

Scroll to the middle of the screen, look for “Data volume by solution”, and click on “Security”.

Check the graph to see which machines are consuming most of the usage and try to take corrective actions.
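If you prefer a query over the dashboard, a hedged sketch to spot the noisiest machines (table and field names as used by the Security and Audit solution):

```kusto
// Which machines sent the most security events in the last day
SecurityEvent
| where TimeGenerated > ago(1d)
| summarize Events = count() by Computer
| order by Events desc
```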

In summary

These are just some tips and tricks to get the most out of your security solution. This solution is heavily dependent on other solutions (antimalware, compliance, …), so the more solutions you deploy and configure, the clearer the picture will be of how you are doing in the security field.

Stay tuned for more tips and tricks to help you get the full grasp and value out of your OMS investments.

Speaking @ ITPROCEED 14/06/2016 in Mechelen


It’s that time of the year again! Everyone is waiting for the summer to hit Belgium (I heard it will happen on a Wednesday this year!), to have some time off and relax… BUT… not before we go out with a bang at ITPROceed!

This is THE not-to-miss event in Belgium focusing on IT pros. It will be packed with sessions from both national and international speakers who use their expertise and gathered knowledge to prepare you for the next steps in your IT pro career. ITPROceed is organized by the different Belgian user groups and backed by Microsoft.

All the new technologies that will revolutionize your IT pro world will be showcased, giving you a real look and feel of the next steps to move your environment forward.

I myself will give you insights into the world of OMS. My session is scheduled in the “Transform the datacenter” track. During a demo-loaded session I’ll showcase how you can use the latest and greatest in OMS to get the insights and reports you want.

If you are interested in OMS and what it can do for your organization, this is a not-to-miss session.

Oh, and by the way… did I mention the entrance is completely FREE? The number of tickets is limited, so sign up today!

More info here:

OMS Webinar: Get insights in your big data 07/03/2016



On 07/03/2016 I’ll be hosting another webinar on the excellent Microsoft Belux platform. This webinar about OMS will focus on getting the insights you need from the big data residing in your workspace.

It’s basically the next step in your journey into a new way of creating insights into your environment. This session will be filled with tips and tricks to select the correct solutions and create your first search queries, so you can create something no one else can: your insights.

This session assumes you already have a basic knowledge of OMS and have set up a workspace with data flowing into it. If not, you can always check my get-started-fast series here:

Hurry up, as seats are limited and selling fast!

Register here, and hopefully see you on the 7th of March at 10 AM!

SCU 2016: Prepare to have your mind blown!


I got the news that I have the privilege (that’s how I definitely see it) to speak once again at SystemCenterUniverse in Dallas on the 19th of January 2016.


I consider this a huge privilege, as I have a special relationship with this particular event. This is in fact where my wild journey through the System Center universe as a speaker started. Two years ago SCU held a fierce battle to determine who would become the new SCU_Jedi, winning a session at this event… I was lucky enough to pull it off, and suddenly I was presenting among (what I consider) the big shots in the System Center world…

Most of them are still presenting today; if you look at the list of speakers, it is quite impressive.

The first but not complete list:

As you can see, all the usual suspects are there!

For the full agenda please check here:

This year there’s again a 2-track approach, so you have the ability to cross over and see a session outside your comfort zone to learn really cool new stuff!

My session will be about the vast power of OMS and how it can create new insights into your environment. A truly not-to-miss session if you ask me!

Can’t fly in?

Too bad… You are missing out…

Not really! Because SCU is (I think) the only event that offers free streaming of the event over the web. There are even a lot of viewing parties organized near you, so you can easily follow the event from your own location!

OK but why should I fly in then?

Well, that’s very simple as well! If you have the ability to fly in, you get a chance to mingle with peers and talk to the speakers. There are no designated speaker areas or anything, so everyone is really accessible for a chat or to answer your questions…

So this is probably expensive right?

A full day of training on different subjects for only $150; that’s a bargain if you ask me!

Last but not least

This is one of the events that really embraces social media (Twitter, Facebook, …) to reach out to attendees onsite, but also across the world, to engage during and after the event.

Make sure you follow @scu2016 and #scu2016 on Twitter for the latest updates and feeds!


Hopefully see you all there!
