During recent events and customer contacts I got a lot of questions about integrating SCOM with OMS. It also popped up several times during my recent webinar with Savision. The question makes sense: SCOM already has a lot of investment in it and is usually the start of your ITIL process… But how do you actually get alerts from OMS into SCOM? By using OMS and Azure Automation, of course!
The scenario is key in this stage of the process. You need to define what you are looking for. Alerting in OMS is quite different from SCOM: in OMS you ask yourself “how many times did X happen in Y time” instead of the “if this then that” kind of monitoring you do in SCOM.
This makes finding the right search query very important. In this post I’m going to demonstrate the following scenario:
I want to have an alert in SCOM when there are 5 failed password attempts on the administrator account in the last hour.
It’s possible to solve this with SCOM alone, but hey, we are going to use OMS + Azure Automation, right?
The following high-level steps need to be in place for this to work. Links are provided to help you prepare:
Open the Azure portal by going to portal.azure.com and select the subscription your workspace is configured in.
Select the Automation Accounts logo:
Make sure you select the correct Automation Account
Now you get an overview of all the runbooks which are configured in your automation account. Select Runbooks in the middle bar:
In the next screen choose: “+ Add a runbook”
Choose “Create a new runbook”
Give the new runbook a name and choose PowerShell as the runbook type:
Copy the following PowerShell code into the edit window on the right:
param(
    [Object]$WebhookData
)

## Check whether the OMS event log source exists; create it if it doesn't ##
if (-not [System.Diagnostics.EventLog]::SourceExists("OMS")) {
    New-EventLog -LogName Application -Source "OMS"
}

## Get the content of the webhook ##
$RequestBody = ConvertFrom-Json -InputObject $WebhookData.RequestBody

## This is just to show you what's in it (assumes C:\Temp exists) ##
$RequestBody | Export-Clixml -Path C:\Temp\Invoke-OMSAlertDiskCleanup_RequestBody.xml

## You can get all the values! ##
$user = $RequestBody.SearchResults.value.Account[0]
$computer = $RequestBody.SearchResults.value.Computer[0]

## Count how many search results came back with the alert ##
$counter = ($RequestBody.SearchResults.value | Measure-Object).Count

## Write the event that SCOM will pick up ##
Write-EventLog -LogName Application -Source "OMS" -EntryType Error -EventId 1 -Message "User: $user has too many failed logon attempts on $computer. This happened $counter times."
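Before wiring up the OMS alert you can give the runbook logic a quick local test on the Hybrid Runbook Worker. This is just a sketch with a hand-crafted payload and a hypothetical script name (the real $WebhookData object is built by Azure Automation when the webhook fires):
## Hypothetical sample payload mimicking what the OMS alert webhook sends ##
## Run from an elevated prompt so the event source can be created ##
$sampleBody = @{
    SearchResults = @{
        value = @(
            @{ Account = 'VLAB\Administrator'; Computer = 'SERVER01' },
            @{ Account = 'VLAB\Administrator'; Computer = 'SERVER01' }
        )
    }
} | ConvertTo-Json -Depth 5
## Assuming you saved the same script locally as Invoke-OMSSecurityAlert.ps1 (hypothetical name) ##
.\Invoke-OMSSecurityAlert.ps1 -WebhookData ([pscustomobject]@{ RequestBody = $sampleBody })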
Click the Save button, then the Publish button, and click Yes to publish the runbook to your Azure Automation account.
Your runbook is now ready to be triggered by our alert in step 4
OK, I’m cutting some steps short here. I assume you already have your machines connected to OMS and are sending up your security logs. If not, follow these guidelines to get you going: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
So let’s see how we are going to solve this… First of all, most search queries do not have to be constructed from the ground up. They can be found in the solutions and tweaked a bit. For example, this scenario can easily be extracted from the Security and Audit solution (if you have configured it, of course):
Open up the Security and Audit Solution by clicking on the Security and Audit solution:
In the left part of the screen you have “Identity and Access”. Click on it to open it.
In the middle of the screen you get the number of failed logons and eureka! VLAB\administrator is in there… Well, for demo reasons I had my 5-year-old try to log in…
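If you don’t have a volunteer at hand to mistype passwords, you can generate a few failed logons yourself. A rough sketch (assumes auditing of logon failures is enabled on the machine; the account name and password are obviously placeholders):
## Deliberately use a wrong password to produce failed logon (4625) events ##
$badPassword = ConvertTo-SecureString 'DefinitelyWrongPassword!' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ('VLAB\Administrator', $badPassword)
1..5 | ForEach-Object {
    try { Start-Process -FilePath 'cmd.exe' -Credential $cred -ErrorAction Stop } catch { }
}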
So click on the desired account.
The search query window opens and there you have your search query all ready to go…
Type=SecurityEvent AccountType=user AND EventID=4625 Account='VLAB\Administrator'
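The query can easily be broadened too. If you want to watch failed logons for any account rather than just the administrator, you could drop the Account filter and group by account instead. A sketch in the legacy OMS search syntax (tweak to taste):
Type=SecurityEvent AccountType=user EventID=4625 | measure count() by Account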
Now click the Alert button at the top left to instantly create an OMS alert, which will be our trigger for the process that gets the alert into SCOM:
The Create alert window pops open and basically has 3 areas:
First things first: The General part:
Note: You already see we have 6 results for the given timeframe so our alert is going to fire.
Second the schedule part:
Third the Actions pane:
Run on (choose hybrid worker)
So now we already have the alert which kicks off our runbook on our on-prem Hybrid Runbook Worker.
At this stage we have:
1. An OMS alert based on our search query.
2. The alert kicking off our runbook on the on-prem Hybrid Runbook Worker.
3. A runbook which parses the webhook data and writes an event to the Application event log.
So now, when we check the event log of the Azure Hybrid Runbook Worker on-prem, we should find the following event every time the runbook is triggered by the OMS alert:
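If you want to verify this from PowerShell on the Hybrid Runbook Worker instead of opening Event Viewer, a quick check (nothing OMS-specific, just the standard cmdlet):
Get-EventLog -LogName Application -Source 'OMS' -Newest 5 |
    Select-Object TimeGenerated, EntryType, EventID, Message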
Now it’s quite straightforward to get the alert into SCOM by using a standard monitor (self-resetting after a while).
Note: I used custom targeting to the Hybrid Runbook Worker to make sure the monitor does not run on all machines.
and eureka:
The MP I used for reference: http://scug.be/dieter/files/2017/06/OMS.Alerting.MP_.rar
The alerts show up in SCOM: triggered by our search query, transferred through OMS alerting, handled by an OMS Automation runbook on our on-prem Azure Hybrid Runbook Worker, where the event is picked up by our management pack…
Yesterday I hosted a workshop at Microsoft Belux about the security and compliance features built into the OMS suite. It’s always nice to talk people through the different things that are included and give tips and tricks based on their questions.
As a lot of questions keep coming back, I decided to bundle them in an overview blog post on how you can effectively tune your environment. This is not a “how to” on setting up OMS, just a summary of small tips and tricks.
If you need a full “how to” setup OMS security check here: https://docs.microsoft.com/en-us/azure/operations-management-suite/oms-security-getting-started
A significant portion of the insight into how you are doing security-wise comes from your IIS logs. Assuming you have an OMS agent installed and added to your workspace, it is invaluable to send these logs to your workspace as well, so they are indexed and feed the different users and solutions that benefit from this knowledge.
Another handy tip is limiting the amount of data sent to your workspace to protect your usage. It used to be all or nothing, but recently a filter was added to control which events are uploaded.
To select this filter go to your security and audit solution:
Click the gear icon on top left corner:
use one of the predefined filters:
For more info on the filters click the “For additional details” link.
To summarize the different filters, check the different scenarios.
I’ve added the list of events included in each scenario for your reference:
Adding the security logs can have a significant impact on the data uploaded to your workspace, which can cause overage payments or a bad POC when your workspace gets suspended for breaching the maximum amount of data uploaded per day.
To check the usage of the security events, follow this procedure:
Go into the main screen of your workspace and select usage:
Scroll to the middle of the screen and look for Data Volume by solution => click on “Security”
Check the graph to see which machines are consuming the most of the usage and try to take corrective actions.
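If you prefer a search query over the graph, a quick sketch in the legacy OMS search syntax that shows which machines generate the most security events (adjust the time range in the search pane as needed):
Type=SecurityEvent | measure count() by Computer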
These are just some tips and tricks to get the most out of your security solution. This solution is heavily dependent on other solutions (anti-malware, compliance,…), so the more solutions you deploy and configure, the clearer the picture will be of how you are doing on the security front.
Stay tuned for more tips and tricks which will help you to get the full grasp and value out of your OMS investments.
On 07/03/2016 I’ll be hosting another webinar on the excellent Microsoft Belux platform. This webinar about OMS will focus on getting the insights you need from the big data which resides in your workspace.
It’s basically the next step in your journey towards a new way of creating insights into your environment. This session will be filled with tips and tricks to select the correct solutions and create your first search queries to create something no one else can: your insights.
This session assumes you already have a basic knowledge of OMS and have already set up a workspace with data flowing into it. If not you can always check my get started fast series here: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
Hurry up as seats are unlimited and selling fast!
Register here and hopefully see you on the 7th of March at 10 AM!
https://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032737604&Culture=en-BE&community=0
I got the news that I have the privilege (that’s how I definitely see it) to speak once again at SystemCenterUniverse in Dallas on the 19th of January 2016.
I consider this a huge privilege as I have a special relationship with this particular event. This is in fact where my wild journey through the System Center universe as a speaker started. Two years ago SCU held a fierce battle to determine who would become the new SCU_Jedi and win a session at the event… I was lucky enough to pull it off, and suddenly I was presenting among (what I consider) the big shots in the System Center world…
Most of them are still presenting today; if you look at the list of speakers it is quite impressive:
The first but not complete list: http://www.systemcenteruniverse.com/presenters.htm
As you can see, all the usual suspects are there!
For the full agenda please check here: http://www.systemcenteruniverse.com/agenda.htm
This year there’s again a two-track approach, so you have the ability to cross over and see a session outside your comfort zone to learn really cool new stuff!
My session will be about the vast power of OMS and how it can create new possible insights in your environment. A truly not to miss session if you ask me
Too bad… You are missing out…
Not really! SCU is (I think) the only event that offers free streaming of the event over the web. There are even a lot of viewing parties organized near you, where you can easily follow the event from your own location!
Well, that’s very simple as well! If you have the ability to fly in, you get the chance to mingle with peers and talk to the speakers. There are no designated areas for speakers whatsoever, so everyone is really accessible for a chat or to answer your questions…
A full day of training on different subjects for only $150: that’s a bargain if you ask me!
This is one of the events that really embraces social media (Twitter, Facebook,…) to reach out to attendees onsite but also across the world, and to engage during and after the event.
Make sure you follow: @scu2016 and #scu2016 on twitter for the latest updates and feeds!
This blog post is part of the Coretech Global Xmas blogging marathon. To find all cool content please take a look at http://blog.coretech.dk/
Recently I have been exploring OMS a lot and came across a cool user scenario which really showcases the benefits of having all data in one place: using this big data to connect the dots between different systems and create even more insight into your environment and the relationships between those systems.
One demo which really had some eyes popping was the calculation of the SCCM patch window with OMS. A lot of people already know that there’s a specific System Update Assessment solution which points out which machines are missing which updates. But there’s more to this solution than meets the eye at first sight.
You can use this solution, and the data OMS gathers for all your updates, to calculate very precisely how long it will take to patch a particular machine and to create a patch window accordingly.
For this demo I presume you already have an active OMS subscription + workspace. For more info please refer to my OMS quick start guide to get you going fast: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
Log on to your workspace and make sure you have machines connected + the solution installed:
First click on Solutions Gallery:
Find System Update Assessment Solution and make sure it is added to your workspace. If it’s not yet added make sure to click the icon and add in the next screen
Make sure to add the Solution to your workspace
If you add the Solution for the first time it will perform an Assessment to gather the data for your environment:
When the initial assessment is complete you will get your info on the tile which represents the System Update Assessment:
TIP: No worries, my environment is not that badly patched, but if you are looking to take this solution for a test drive you can always install Azure VMs from an earlier image (a couple of Patch Tuesdays ago) to get a machine which is actually missing updates.
Click on the tile to open the detailed pane shown below:
Click on the Required Missing updates pane:
By default the next window gives you a graphical overview of the missing patches and how many days ago they were released. This gives you a nice view of how far behind your machines are. You also get a nice pie chart showing how many patches are missing and the category of the patches.
Note that on the right there’s an indication, in minutes, of how long it will take on average to install these missing updates:
This is not just a “guesstimate”: OMS actually uses data out of the logs collected from all machines to give you an accurate install time for this particular set of patches missing on this machine.
The number (in this case 81) indicates that they in fact have install-time data for all the missing patches.
At this point you can clearly state that the machine will probably be patched in approximately 14 minutes. You can build in some margin, but you definitely don’t need an hour to patch this machine.
This is just the pretty eye candy view of the Solution!
If you want to have the data by update you can dive into the big data gathered and create your own insights in your patch strategy. This can be achieved by using the “raw data” in the Search Query view and creating your own views. Let’s see how we can find out for example which patches will take more than 60 seconds to install so we can put them in a different patch group:
Click on “results” next to updates right underneath the search query window
At this point you get the 81 results with all their data but… no install time?
Click “Show More” at the bottom of the screen to unveil the InstallTimeAvailable / InstallTimePredictionSeconds / InstallTimeDeviationRangeSeconds properties.
This is the data gathered for all the updates which are identified as missing on my systems.
InstallTimeAvailable: gives you an indication whether enough data has been gathered in the OMS system to give you an actual prediction of the install time. For new updates it can of course take some time before there is enough data for a reliable prediction.
InstallTimePredictionSeconds: this is the prediction based on all the data gathered through the OMS system (note this is not only based on your environment but across all environments connected to OMS, showing the huge advantage of the big data approach of Microsoft Operations Management Suite).
InstallTimeDeviationRangeSeconds: gives you an indication of how much the actual install time can fluctuate around the prediction. In this case the value is 0.83, meaning the prediction can be off by that much in either direction.
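To make the reasoning concrete, here is a small hypothetical sketch of how you could combine these fields into a conservative patch window for one machine. The update names and values are made up; in practice you would feed in the missing updates from your own search results:
## Hypothetical input: the missing updates for one machine ##
$updates = @(
    [pscustomobject]@{ Title = 'KB0000001'; InstallTimePredictionSeconds = 45;  InstallTimeDeviationRangeSeconds = 0.83 }
    [pscustomobject]@{ Title = 'KB0000002'; InstallTimePredictionSeconds = 120; InstallTimeDeviationRangeSeconds = 5 }
)
## Worst case per update = prediction + deviation; add 20% margin on top of the total ##
$worstCaseSeconds = ($updates | Measure-Object -Property InstallTimePredictionSeconds -Sum).Sum +
                    ($updates | Measure-Object -Property InstallTimeDeviationRangeSeconds -Sum).Sum
$patchWindowMinutes = [math]::Ceiling(($worstCaseSeconds * 1.2) / 60)
"Suggested patch window: $patchWindowMinutes minute(s)"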
Now to find out how many of the updates (81 of them) have an install time of more than 60 seconds we need to use the Search Query power:
Click in the Search Query window on the top of the screen and start typing Install at the end of the line:
OMS will give you suggestions on which parameter you want to search. In this case we are going to search on “InstallTimePredictionSeconds =”
So just click on it to get it into the Search query as shown below. At this point we can put “Greater than” 60 and run the search query by clicking the search Icon on the right or hitting Enter:
There we go… We have 6 patches that will take longer than 60 seconds to install, so we can take appropriate action for these patches in SCCM:
This is just a small example of the huge amount of insights you can create with OMS to help you further tune the management of your environment.
Today the OMS agent installation bits for Linux came online in public preview giving OMS the possibility to pull in performance and event data into the OMS workspace from Linux machines.
This is basically the next step to get OMS to monitor your entire environment. It is a clear example of the possibilities of OMS to monitor your entire datacenter not bound by OS or system.
Log on to your workspace and go to your overview => settings => connected sources and download the Agent for linux:
Obviously you will need access to your Linux machines.
The install docs can be found on github on this url: https://github.com/MSFTOSSMgmt/OMS-Agent-for-Linux
In general you just need to run these commands:
$> wget https://github.com/MSFTOSSMgmt/OMS-Agent-for-Linux/releases/download/1.0.0-47/omsagent-1.0.0-47.universal.x64.sh
$> md5sum omsagent-1.0.0-47.universal.x64.sh
$> sudo sh ./omsagent-1.0.0-47.universal.x64.sh --install -w <YOUR OMS WORKSPACE ID> -s <YOUR OMS WORKSPACE PRIMARY KEY>
The following distros are supported:
More will come pretty soon, but the mainstream distros are already on there.
YES and you are very entitled to do so!
These are the channels to get your feedback / suggestions to Microsoft:
All data will flow in and your events and performance will be uploaded to your OMS instantly.
Expect a more detailed post in a short while. In the meanwhile just try it!
This is one thing I really like about the new strategy of Microsoft: All platforms (I know it’s not the official statement but still)
The OMS app was already available on WindowsPhone platform (in preview) and quite frankly it makes sense to actually develop for your own native platform first.
But today Microsoft announced the availability of the OMS app across all the different platforms (iOS, Android and Windows Phone).
The install is straightforward, just as you are used to from the store.
More info here: http://blogs.technet.com/b/momteam/archive/2015/10/21/log-analytics-on-the-move.aspx
Direct link: http://www.microsoft.com/en-us/server-cloud/operations-management-suite/mobile-apps.aspx
NOTE: Fellow MVP Cameron Fuller has a blog post about the experience on an ipad here: http://blogs.catapultsystems.com/cfuller/archive/2015/10/21/the-microsoft-oms-experience-on-an-ipad/
A couple of screenshots of the possibilities and look and feel on iPhone:
First start of the app (really like the look and feel):
Login screen looks very familiar:
Auto switch between corporate or personal
Signed in, and it detected my workspaces; it’s indeed possible to switch between the different workspaces:
You have 3 options:
Starts into your dashboard:
Overview:
Also possible in landscape 🙂
Searches:
Settings tab can be reached by tapping the 3 red dots on the top of the screen.
The app is intended as an extension / dashboard for your OMS workspace. It’s not possible to add servers or delete servers from your workspace nor add solutions. This is not a drawback in my opinion as you only want to see things happening in your workspace on the go. This is a first version of course but I had no issues installing and connecting it. I will keep an eye on the data usage on my cell phone plan though just to see how it will affect my usage of mobile data and of course my battery life.
This blog post is part of the Microsoft Operations management Suite Quick start guide which can be found here: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
One of the things I noticed right away when I first opened the Microsoft Operations Management Suite (OMS) was the fact that I had different workspaces. They were all created in OpInsights because I had added 3 different management groups in their respective SCOM consoles.
No sweat of course. I now have 1 management group in my lab environment where I configured everything, so I wanted to get rid of the other workspaces.
Turns out there are 2 ways you can delete a workspace and in fact this was not clear in the beginning.
The remove option is well hidden in the menus, probably to avoid accidental deletion, which is actually a good thing, but it’s a little bit too hidden in my humble opinion.
To get to the remove option follow the steps below:
Log on with your account. You will actually get all the different workspaces which are configured and hold data:
In this case I would like to remove the DWIT workspace as this is my ancient lab environment.
Select DWIT and open the workspace.
Select DWIT in the right upper corner and select the DWIT EUS | administrator wheel:
At this point you will have the settings of your workspace and right at the bottom there’s an option to close the workspace.
NOTE: Make no mistake your workspace will be removed and your data will be erased!
Now here is where things can go either way. There are 2 different options here:
This one is actually very simple.
As shown in the screenshot above, just click Close workspace…
OMS will present you with a nice message box explaining what’s going to happen and kindly asks why you want to close the workspace.
Note: It’s not required to select an option but please do so to help Microsoft further develop the product to whatever direction you want it to go.
When your workspace was created in the Azure management portal you will not be able to close it from the OMS interface; you will need to delete the workspace in Azure itself. You will get the message “This account can only be deleted from the Azure Management Portal”.
Open your Azure management portal and navigate in the bar on the left to Operational Insights (note this name may have changed by the time you read this article, as MS is aligning all the naming towards the OMS brand):
Select the account you want to delete and press the delete button at the bottom of the page
Are you really sure?
At this point the account is deleted and within a couple of minutes it should disappear from the available workspaces.
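If you’d rather script the removal, and assuming you have the AzureRM.OperationalInsights PowerShell module installed and know the resource group the workspace lives in, something along these lines should work as well (a sketch; the resource group name is a placeholder, ‘DWIT’ is the workspace being removed in this example):
Login-AzureRmAccount
## Replace the resource group name with your own ##
Remove-AzureRmOperationalInsightsWorkspace -ResourceGroupName 'MyResourceGroup' -Name 'DWIT'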
Note: The accounts that are created outside of the Azure portal will have a GUID like name. This name is generated when you link a workspace to your Azure account.
This blog post is part of the “Microsoft Operations Management Suite: Quickstart guide” which can be found here: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
After we have successfully created our workspace and installed our Solutions, it’s now time to bring in our data to start the magic and witness the insight OMS can bring.
Here you have 3 options:
Note: If you receive errors when connecting these servers to your environment review this troubleshoot article to set the firewall correctly: http://blogs.technet.com/b/momteam/archive/2014/05/29/advisor-error-3000-unable-to-register-to-the-advisor-service-amp-onboarding-troubleshooting-steps.aspx
If you want to attach several servers which are not monitored by SCOM, you can easily download the agent and install it. No need to fiddle with the certificates yourself any more!
Download the agent and install it on a server:
The agent package is around 25mb and will be downloaded to your local machine. Transfer the package to a machine which is not monitored by SCOM and install the package.
Note: The same restrictions as installing an agent from the console apply. It’s not possible to onboard a server which has a SCOM component installed, such as a gateway server, management server,… This makes sense because if you have these servers in place you have a SCOM environment, and it’s far easier to onboard the management group entirely instead of doing this per server.
Copy the MMASetup-AMD64 package to your server and run it as administrator.
The standard manual install dialog for the Microsoft Monitoring Agent starts.
Click through the first screens.
The next screen is interesting. Here we need to decide whether we are going to install the Microsoft Monitoring Agent exclusively for OMS or also for on-prem SCOM. In this scenario we choose to use the agent exclusively for OMS.
Now we need to fill in the GUID keys which are shown on the OMS page right under “connect a server”.
The workspace ID is straightforward: it’s the workspace ID noted in the OMS console.
The Workspace key is in fact noted as the “private key” in the OMS console.
Note: Again this probably will be aligned after the SCOM console is aligned with the new OMS system.
Click next and install
Finish. Wait 5 min and refresh your console:
Note: if you have more than one workspace make sure you select the correct workspace where you want to connect the server to as the id will be unique per workspace.
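As an alternative to clicking through the wizard, the agent can also be installed unattended. A sketch from an elevated PowerShell prompt, assuming the documented MSI properties for workspace onboarding (double-check them against the current agent documentation) and hypothetical paths:
## Extract the agent package, then run the MSI silently with the workspace details ##
Start-Process -Wait -FilePath '.\MMASetup-AMD64.exe' -ArgumentList '/c', '/t:C:\MMAExtract'
Start-Process -Wait -FilePath 'C:\MMAExtract\setup.exe' -ArgumentList @(
    '/qn',
    'ADD_OPINSIGHTS_WORKSPACE=1',
    'OPINSIGHTS_WORKSPACE_ID=<YOUR OMS WORKSPACE ID>',
    'OPINSIGHTS_WORKSPACE_KEY=<YOUR OMS WORKSPACE PRIMARY KEY>',
    'AcceptEndUserLicenseAgreement=1'
)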
Open your SCOM environment and navigate to Administration > Operational Insights > Operational Insights Connection
Note: These names will probably change in the next UR or management pack release.
Click configure or Re-configure Operational Insights
Select whether you are using a work or Microsoft account. I’m using a Microsoft Account:
The associated workspaces with your account are loaded and selectable
Select your workspace and click update or create
Next choose which groups or servers you would like to send data to your OMS workspace. Click add a computer / group in the tasks bar on the right.
Select the servers / groups you want and click Add.
So now all the servers are coming into your Operational Insights Managed view.
This management group will show up in your OMS workspace as 1 connected management group:
The name / number of servers and the last data received is shown to give you a clear view on the status of your management groups.
A lot of solutions are dependent on the logs received. As this was one of the first valuable additions that OpInsights brought, it is almost mandatory to have in OMS as well.
Go to the last step of the “wizard” and select which logs need to be gathered on the connected servers:
When configured we’ll get a nice 100% mark and we are ready to go!
Connecting is a breeze if your servers are able to reach the OMS service on port 443. You can connect individual servers or entire management groups where you decide which servers are actually sending data to the OMS service.
For now the agents for Linux are not available yet, but they will be very soon.
So now you are all set to start playing with the Solutions you have installed while the data is pouring in!
This blog post is part of the Microsoft Operations management Suite Quick start guide which can be found here: http://scug.be/dieter/2015/05/08/microsoft-operations-management-suite-quickstart-guide/
A workspace is basically the same as your management group in SCOM. It contains all the different Solutions, connected data sources and the Azure account to start working. You can have several workspaces with one account, but interaction between different workspaces is not possible.
In this scenario we are going to build a new workspace. Just choose the name / email and the region and click create
Next up we need to link the Azure subscription associated with our Microsoft or corporate account. Note that having an Azure subscription is not a prerequisite for this step (you can just click Not now), but it is highly recommended.
To make sure you are the proper owner of the email address (note that it doesn’t have to be the email address associated with your account by default), Microsoft sends you a confirmation mail which you need to follow.
Click confirm now and continue.
At this point your workspace will be ready and you will have all the standard tiles, but no data is pouring in just yet.
Head over to the Settings tile, where you will be guided to connect your sources to the OMS service. In the past this involved setting up proxy servers and complicated settings, but since the integration with SCOM this has become peanuts. OMS also uses the same entry point that OpInsights was using to get connected.
The first step is in fact to add solutions. Formerly known as Intelligence Packs (IPs), these solutions each have their own purpose, tailoring the way you want to use OMS. Some Solutions are already installed by default, so you can click “connect a data source” to continue.
Now that you have your workspace configured it’s time to connect your datasources to get your data in!