
Flukso Energy meter monitoring pack: Part 2: Get data into MySQL

This blog post is part of a series. Check out the other posts in this series:

So after we have successfully installed the device and data is flowing to the Flukso website, we get a nice graph on our dashboard, which is available by logging into the website:

clip_image001

Cool… So now we get a clear overview of our energy consumption. But there’s basically nothing we can do with it. We can look at it and make some adjustments, but no alerts, no long-term reports, nothing…

So as I discussed in the first post, there’s an open API which makes the data available locally. This is great: no need to retrieve the data from an external website, it stays inside my own network.
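To give you an idea of what that local API looks like, below is a minimal PowerShell sketch that polls one sensor from a machine on my network. Note that the IP address, sensor ID and query parameters are placeholders/assumptions based on my own setup (port 8080 and the /sensor endpoint you will also see in the settings file further down); check the Flukso API documentation for the exact syntax before using it.

[xml]
# Minimal sketch: poll the Flukso local API for one sensor (all values below are placeholders)
$fluksoIp = "192.168.1.50"                     # hypothetical IP address of the Flukso device
$sensorId = "0123456789abcdef0123456789abcdef" # the sensor ID from the flukso.net sensor page
$url = "http://$($fluksoIp):8080/sensor/$($sensorId)?interval=minute&unit=watt"
$webClient = New-Object System.Net.WebClient
$webClient.Headers.Add("Accept", "application/json")
# Returns a JSON array of [timestamp, value] pairs for the requested interval
$webClient.DownloadString($url)
[/xml]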

The setup was very similar to my Nest Thermostat approach. Because I had that framework already in place, I planned to use it the same way and get the data into SCOM via the same process.

Again the heart of my setup is my trusty Synology DS412+, hosting my Linux distro and my MySQL dbase instance:

clip_image002

How did I get data in?

As mentioned, the setup is very similar to my Nest Thermostat approach. To get the data queried out of the Flukso device I have used the script written by a fellow fluksonian, PeterJ (yep, that’s the official name for users of Flukso): https://docs.google.com/file/d/13wB85cPx_5nykBq3ZShnClHa1rpkRE5edNEluMqxrFaCRlvJrD8Bn_6UDCs9/edit?pli=1

He uses a set of PHP scripts to get the data in.

A high level overview of the install:

  • Connect to your Synology box with Winscp
  • Copy the content of the files to /volume1/web/flukso (Make sure to follow the exact same paths as described on the google drive).
  • Open settings.php and fill in the parameters requested:

[xml]

<?php
// Rename to settings.inc.php

// DB Settings
define('DB_HOST', 'localhost');
define('DB_NAME', 'flukso');
define('DB_USER', '<fill in a user with rights to create a dbase on your MySQL>');
define('DB_PASS', '<fill in the password of that user>');

// Flukso settings
define('FL_ADDRESS', 'ipaddressofflukso:8080/sensor');
define('FL_PASSWORD', ''); //for future use
define('FL_SENSOR1', 'sensor ID');
define('FL_SENSOR2', '');
define('FL_SENSOR3', '');

// Meter settings
define('START_DAY', '070000');
define('END_DAY', '230000');
?>

[/xml]

  • Note the sensor ID can be retrieved from the website in the sensor section (make sure to use the ID and not the token)

clip_image003

  • Run the install.php script by accessing your Synology via PuTTY to gain terminal access (for more explanation, check the Nest thermostat topic here)
  • At this point the dbase should be created and ready to go:

clip_image004

  • All that’s left to do is create another line in the cron and restart it to get data flowing into our dbase, ready to be extracted by SCOM.
  • The crontab which needs to be changed is located in /etc and is named crontab. The line is in red.

Note: make sure to use TABS between the different columns, otherwise the line will be deleted with the next reboot. At least that’s how it is on a Synology box; I don’t know about other Linux distros, but better safe than sorry, right:

clip_image005

  • The line that needs to be added: */1    *    *    *    *    root    /usr/bin/php /volume1/web/flukso/cronjob.php

After this install the data should normally start coming in.

I’ve tried to create a brief overview of how to set up the Synology to get the data from the Flukso into my own MySQL dbase using a community-driven script. However, this is a System Center blog, so I’m not going to go into further detail here.

If you still have questions, either check the Flukso forum, which has some really active members out there eager to help spread the word on this nifty device: https://www.flukso.net/forum

Or connect with me on twitter @dieterwijckmans so I can assist where needed.

Flukso Energy meter monitoring pack: Part 1: Intro on the device used

This blog post is part of a series. Check out the other posts in this series:

This post is part of an ongoing series on how to monitor my house with SCOM and build scenarios based on the data that comes into SCOM.

More info on the blog series here: http://scug.be/dieter/2014/02/19/monitor-your-home-with-scom/

After monitoring the temperature / humidity and heating in my house, I have now turned my focus to the aspect that basically costs money: my electricity bill. To get this data in you need an energy meter. I actually have 2 at the moment, so I can compare them against each other to see which one is right… Boys and toys, right.

clip_image001

They are both from Belgian companies but can be used on any power grid. The device above is called the Smappee.

clip_image002

It’s a rather new device with a very spacey exterior and lighting. It connects quite easily to your environment and it measures everything beautifully. The nice thing about this device is that it has a shiny app for iPhone and Android, so you can get your data while on the road. The coolest thing is that it is capable of detecting certain patterns on your internal electrical grid to identify certain devices in your household, so you can easily pinpoint what the big consumers of power are. This works quite well… The downside of this device, however, is that up until now there’s no way to get the data from the device into your own systems. It is not open source. Although there’s no additional fee for the website and the apps, it’s kind of useless if you want to get the data out and play with it…

More info on the smappee can be found here: http://www.smappee.com/

clip_image003

The device just below is completely different, although it serves the same purpose: monitoring your power consumption. It’s just a small box which holds a custom-made device that was built from the ground up with the open source community in mind. The software is running on a linux distro, dd-wrt for the routing, and you have the possibility to access it via a terminal to gain root access and play with the device. The data gathered is logged to the Flukso server and nicely graphed on a custom dashboard protected by your user name and password. You get a nice overview of your consumption, even in real time. Besides the electrical consumption you can also check water and gas consumption, so it’s an all-round device for a little bit less than the Smappee. The cool thing is that you can access the data locally by checking a box in the admin dashboard. This opens up the local API which can be addressed by a simple CURL call.

More info on the Flukso can be found here: https://www.flukso.net/about

Installation?

The installation for both devices was straightforward. As soon as the device came online you needed to connect it to an account on the website and that was it… Now the only thing left was to get the data into the device.

To use this you need to have a little background in electrical work. Both websites come with a huge disclaimer: if you are not confident with installing the metering device, ask a professional.

What you need to do is clamp a power metering device over the hot wire of your electrical installation behind the meter and before the first fuse in your fuse box:

clip_image004

After connecting the clamp to the device you are good to go to get things monitored. Both devices use the same tech so if you have both just connect both of the clamps to the wire. No cutting is involved.

So this was a blog about System Center right?

True… But I’m also active in the Flukso community and promised to give them feedback as well on how I cracked this box open to get all the data into a MySQL dbase. I’ve used a similar approach as in the Nest thermostat series, which can be found here: http://scug.be/dieter/2014/02/19/nest-thermostat-monitoring-pack-part-i-how-did-i-get-data/

So how did I get data?

Still not much System Center content, but important for the people who are going to use this or try this at home, because face it… monitoring is our profession, and if we can save some money while we are at it… Check out the other parts to find out how I got the data in.

Nest Thermostat monitoring pack: Part 4: Seeing it in action

This blog post is part of a series. Check out the other posts in this series:

So after all this hard work to get the data into my MySQL dbase and into SCOM, what can I actually do with it?

This is just the beginning of a far greater monitoring project I’m building to basically monitor my house but now I have control over the temperature and heating in my house.

I’ve created views in the Nest folder for humidity and heating status, and a separate view for target and current temperature.

Humidity view:

clip_image001

Nothing much we can do with this view, as this is actually giving me a good reading. Everything between 30 and 60 is a healthy condition, so no complaints here.

Next in line is the Heating status:

clip_image002

This is basically a Boolean (on or off). The standard graph in Nest also tells me this, but I have to click through some views to get there. Now I can get it in a simple graph in my console wherever I want it.

Saving the best for last: the temperature graph.

clip_image003

The first 2 graphs are nice to have, but this one is actually pretty cool. It gives me the relationship between the target temperature asked by my household at a given time and the actual temperature in my house. Here I can clearly see that it takes approx. 2 hours to get my home heated up (radiant floor heating), but the heat stays constant for a long time. This is due to the nature of radiant floor heating and because my house is well insulated. If I overlay the 2 graphs I can clearly see that the temperature is rising as soon as my heating is working…

So now I have the data in there. The next step in the process is to create a console task that uses the API to actually change the target temperature. This is possible via the API I’ve mentioned, so it will be added to the MP in a short while.

Nest Thermostat monitoring pack: Part 3: Create the mp

This blog post is part of a series. Check out the other posts in this series:

The Nest thermostat monitoring pack is in general part of a “monitor your home with scom” series which can be found here: http://scug.be/dieter/2014/02/19/monitor-your-home-with-scom/

downloadbuttonfertig11.jpg

Download the MP I’ve created here: http://scug.be/dieter/?p=1204

Now that we have discussed how to get the data from the Nest website into our dbase via an API call, and how to get that data into a property bag via PowerShell, it’s time to get SCOM working with this data.

I’m gathering a sample from the dbase every 5 minutes, and the dbase itself only gets new data when there was a call home from the device to the Nest website. As some of you will probably already know, it’s not possible to use a PowerShell script for a performance rule created in the console itself; you need a VBS there. In fact it’s not a good idea at all to create a management pack in the console, as it will be filled with GUIDs and such.

I’ve used the SCOM 2007 authoring console for quite a long time and am still using it, but the biggest disadvantage is that it cannot interpret SCOM 2012 MPs. If you create an MP with the authoring console it will work on both SCOM 2007 and SCOM 2012, but if you try to load a SCOM 2012 MP into the authoring console you’ll get a schema mismatch because it just can’t cope with the new schema. This makes it impossible to create the MP in the authoring console, load it into the SCOM 2012 management group, make minor modifications there, and then load it back into the authoring console…

Well, then you should use the Visual Studio Authoring Extensions… True… But I don’t know Visual Studio. It’s still on my to-do list, but hey, there are many things on there.

So for now I’ve used another great tool: Silect MP Author, which is freely available. I made the core rules in there to get the PowerShell performance collection rule in place, and then made view modifications and such in the console itself. OK, it’s not pretty, but it’s just to showcase the possibilities of SCOM, and I plan to integrate this bit into a larger “Monitor your home” MP which I’ll probably rebuild from scratch using VSAE.

So enough chatting. Let’s create these performance rules to get the data we have in our dbase into our management group via PowerShell.

First things first. Download the free Silect MP author tool here: https://bridgeways.com/mp-author-landing-page

Install the tool and open MP Author

clip_image001

Create a new management pack:

clip_image002

Give it a proper name:

clip_image003

Save in a location (I’ll do it by default on my SkyDrive so I can work on my project anytime from anywhere)

clip_image004

Leave the references as is

clip_image005

Choose Empty

clip_image006

Create

clip_image007

Now create a target for our watcher node which is identified by reg key: HKLM\software\nest\watchernode

clip_image008

Right click target and choose group

clip_image009

Browse to a server where the regkey is located. It’s easier to browse than to type in the key.

clip_image010

Supply credentials.

clip_image011

and locate the key: HKLM\Software\nest\watchernode

clip_image012

Give the attribute a name

clip_image013

identify the discovery

clip_image014

and we only want to check whether it exists. We don’t care about the content.

clip_image015

We run this every day.

clip_image016

Create.

clip_image017

Next thing we need to do is create the group of watcher nodes

clip_image018

Fill in the desired expression and click next (note this changed in the final mp I’ve uploaded)

clip_image019

Create the group

clip_image020

So now we need to create the performance rule with the PowerShell script we tested earlier on:

clip_image021

Copy-paste the script into the script body window.

Make sure to change the credentials in the connection string as discussed in part 2 of this blog series.

clip_image022

Fill in the location variable

clip_image023

Identify the performance rule

clip_image024

Map the content of the property bags to instance and value, which SCOM can use to create the performance dataset.

clip_image025

Leave the schedule as is (more on this later)

clip_image026

Create the script

clip_image027

So now we have a script to get the humidity value into our environment. The same process needs to be followed to get current temperature, target temperature and heating status in as well.

I already did it in the MP I’ve uploaded, but I really wanted to show you the ease of use of MP Author. I plan to do a more thorough blog series on this great tool, but that is not in the scope of this blog series of course.

So now the scheduler part… We want to collect data more frequently than once a day of course. It turns out it’s not possible to change this in MP Author, nor in the console after you have loaded the MP. You need to change the XML code itself.

The collection rule is only configured in the mp as daily:

clip_image028

You need to change this to:

clip_image029

And you need to do this for all rules you have created of course.

Note: like I said before, this is just a small showcase of how the management pack is constructed. The management pack which is attached here is slightly different and has some config done in the console, so it’s not as clean as far as IDs are concerned. Again, I’m planning to rebuild a full MP when I have all my different monitoring aspects in place.

If you want to use the management pack I have created, make sure to change the connection string values to your own dbase location, username and password.

So all that’s left to do now is load the MP into your management group and check whether everything is running.

Nest Thermostat monitoring pack: Part 2: Get data into SCOM from MySQL

This blog post is part of a series. Check out the other posts in this series:

So after we have successfully set up the connection between the Nest and our MySQL dbase, data is pouring into our own dbase. So how do we get that data into SCOM so we can graph it and monitor it?

This blogpost will explain how to retrieve the data with PowerShell (of course) and dump it into a property bag which is readable by SCOM. This is the second phase in our schematic example:

printscreen-0109_2

Requirements

What do we need to retrieve the data out of the MySQL dbase?

  • A watcher node which has PowerShell v2.0 installed (can be a server or a desktop lying around somewhere)
  • A reg key to identify this watcher node. I’m using “HKLM\SOFTWARE\NEST\Watchernode” for this (a quick sketch of creating it follows this list)
  • The mysql connector installed: http://dev.mysql.com/downloads/connector/net/ (note in this example I’m using version 6.8.3)
  • A SCOM agent installed on the machine to be able to discover it as a class
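For reference, this is a quick sketch of how the marker reg key can be created on the watcher node from an elevated PowerShell prompt. It’s just one way of doing it; the key itself is all that matters, since the discovery only checks whether it exists:

[xml]
# Create the marker key used to discover the watcher node (run from an elevated prompt)
New-Item -Path "HKLM:\SOFTWARE\NEST" -Force | Out-Null
New-Item -Path "HKLM:\SOFTWARE\NEST\Watchernode" -Force | Out-Null
[/xml]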

There’s no additional install required on the MySQL server, although you will need the following to connect:

  • Location
  • User which has access to the mysql dbase (I use Root but this is not the safest way)
  • password

I’m using this on a virtual Win2012 machine without any issues.

Retrieve the data from MySQL using a PowerShell script

This is the script I created to get the data out of MySQL.

Note that this script only retrieves one value. It’s possible to retrieve multiple values all at once, but I preferred to use different scripts to get the different parameters out of the dbase.

The Nest parameters I read in:

  • Current temperature: The current temperature measured by the Nest device
  • Target temperature: The target temperature set for the Nest device at that time
  • Humidity: The Humidity measured by the Nest device.
  • Heating status: Whether the heating is on (1) or off ( 0 )

The script used:

downloadbuttonfertig11.jpg

It can be downloaded here: http://gallery.technet.microsoft.com/SCOM-Retrieve-performance-507293f1

[xml]
#===================================================================================================
# AUTHOR: Dieter Wijckmans
# DATE: 18/02/2014
# Name: Nest_humidity.PS1
# Version: 1.0
# COMMENT: Get the current humidity value from the nest device from the mysql dbase
#
# Usage: .\Nest_humidity.ps1
#
#===================================================================================================
param($location)
#load the connector but make sure to check the path if you are using a different version
[void][system.reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v2.0\MySQL.Data.dll")
#Create a variable to hold the connection:
$myconnection = New-Object MySql.Data.MySqlClient.MySqlConnection
#Set the connection string:
$myconnection.ConnectionString = "database=<Fill in your dbase name>;server=<Fill in your server ip>;user id=<fill in user id>;pwd=<not 1234 right?>"
#Call the Connection object's Open() method:
$myconnection.Open()
#Prepare the property bag
$API = New-Object -ComObject "MOM.ScriptAPI"
$PropertyBag = $API.CreatePropertyBag()
#The dataset must be created before it can be used in the script:
$dataSet = New-Object System.Data.DataSet
#Run the actual query
$command = $myconnection.CreateCommand()
$command.CommandText = "SELECT humidity FROM data ORDER BY timestamp DESC LIMIT 1";
$reader = $command.ExecuteReader()
#Processing the contents of the data reader; we only want the last value
while ($reader.Read()) {
    for ($i = 0; $i -lt $reader.FieldCount; $i++) {
        $value = $reader.GetValue($i).ToString()
    }
}
$myconnection.Close()
$PropertyBag.AddValue("location", $location)
$PropertyBag.AddValue("humidity", $value)
$PropertyBag
[/xml]

This script will basically do the following.

  • Prepare the environment
  • Open the connection to MySQL
  • Get the data in the data reader
  • Read out the last line because we are only interested in the most recent value
  • Fill it in the property bag

Note: I’m also using a variable $location to identify the Nest Thermostat if you have more than one.

Now, to get all the different parameters mentioned above, the only things you need to change are (see the sketch after this list):

  • The name of the script itself
  • The SELECT statement: replace the column name in SELECT humidity FROM data ORDER BY timestamp DESC LIMIT 1 with the column name of the desired value
  • The property bag value: $PropertyBag.AddValue("heating", $value)
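For example, for the heating status rule the changed lines would look like the fragment below. Note that the column name heating is an assumption about the dbase schema; check the actual column names in phpMyAdmin first, and rename the script accordingly.

[xml]
# Fragment of the script above: only these lines change for the heating status rule
$command.CommandText = "SELECT heating FROM data ORDER BY timestamp DESC LIMIT 1";
# ... the rest of the script stays identical ...
$PropertyBag.AddValue("heating", $value)
[/xml]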

If everything goes well you have your data in your MySQL dbase now and can retrieve it remotely via PowerShell to pass it on to SCOM.

Next blog post we’ll get everything in SCOM.

Stay Tuned.

Nest Thermostat monitoring pack: Part I: How did I get data?

This blog post is part of a series. Check out the other posts in this series:

I recently bought myself a Nest Thermostat. For those who are not familiar with this device: it’s basically an internet-connected thermostat which learns your heating patterns based on different variables. It’s of course packaged in a nice futuristic design… Cool…

More info here: https://nest.com/thermostat/life-with-nest-thermostat/

Although it is not officially released or supported in Europe, it’s actually quite easy to install. If you are reading this and need more info, please make sure to check out this blog post: http://www.fousa.be/blog/nest-thermostat

So I survived the installation…

clip_image001

This round eye thing, as my wife refers to it, is doing its job nicely. I get a lot of feedback from the app and even on the website. Everything looks shiny and cool… But… What if… What if I could get this data into SCOM and see whether I can monitor and control it from there… Imagine the fun.

Well this blog post will document my way of getting data into SCOM. To give you a high level idea this is actually what I did:

  • Retrieve data from the nest.com website (not from the actual device, as my first attempts failed miserably)
  • Get this data into a MySQL dbase

This is a schematic overview of what components were used:

clip_image002

In this blog post we’ll discuss how to get data from the Nest into a MySQL dbase.

Retrieving the data from nest

So how was I going to get data out of this nice circle on the wall which is controlling my heating… My first attempt was to sniff all the traffic which came from the IP of the Nest thermostat and see whether there were valuable entries in the traffic between the device and the Nest web service. Well, a lot of data was sent, but it would take a lot of work to make sure everything was discovered correctly. So I opened Google and Bing (always nice to compare both results) and started searching the web. A lot of users had the same question of how to get data out, but not many had the answer. Until I stumbled upon the GitHub library and did a search there.

I found this API by Guillaume Boudreau on GitHub: https://github.com/gboudreau/nest-api

This nice gentleman wrote a PHP-based API which can retrieve all the info and return it with a simple REST API call. I’m not a dev, so this was unknown territory for me, but I gave it a shot. My aim was to get this data stored locally so I have full control over how it’s stored and retrieved by SCOM. This makes my life afterwards a bit easier.

I could inject it into a SQL dbase, but this would require my server to be switched on all the time. I would also need a machine running a webserver to take care of all the different PHP requests to the API and such. Sure, I could build a dedicated machine on my Hyper-V host, but the server would have to be on all the time for the entire process to work, which is not a good idea with current energy rates. But wait a minute. I have another device running all the time: my Synology DS412+. Up until now it only served as a storage device, both for entertainment and for iSCSI LUNs for Hyper-V. But this thing is capable of doing much more. I checked the different fan blogs / forums and realized right away that this device was the way to go. Without any knowledge of MySQL and/or PHP I set off on an adventure…

Note: All this is configured on the beta version of DSM 5.0 so screenshots may differ from previous versions of DSM.

Configure the Synology NAS to store the data in MySQL and run the PHP scripts.

I had a fun time configuring the NAS to get it to a point where everything was working. I’ll link to some blog posts which got me going, as covering everything here would take me too far. If you have issues, please do not hesitate to comment or connect on twitter @dieterwijckmans to get more specific on this topic:

Install MySQL and phpMyAdmin on the Synology NAS

Install WinSCP to browse your Synology’s built-in FTP server (via SCP actually)

Install PuTTY to get terminal access to the Linux distro underneath your Synology OS. Note: just install it; you do not have to set up a key chain to access your Synology.

A couple of quick pointers (I will write a more detailed process if I ever find the time or if there’s a huge demand for it)

  • When you connect to your Synology station with WinSCP, use the SCP protocol on port 22 instead of FTP or SFTP.
  • To connect to your device via PuTTY or WinSCP, use the root user. The password is actually the admin password.
  • Set a password for the root user on your MySQL install, because some scripts do not allow running with an empty password.

Get the data into the MySQL dbase

Again, this will be a fairly straightforward process if you’ve done this before. For me all this was quite new as a 99% Windows-oriented tech (OK, it’s out in the open, I admit it), but it proved to be a challenging yet rewarding route.

First things first. These were my resources to get everything into MySQL. I had started down a different path, writing my own scripts, when I stumbled upon a post on GitHub by chriseng. He actually wrote a nice front-end webpage to present the Nest data in a comprehensible way… and he writes the data to a MySQL dbase first. Exactly what I needed!

https://github.com/chriseng/nestgraph

So follow the instructions on there and get the API + script in place.

If all goes well, open up phpMyAdmin by pointing your browser to http://<address of your synology>/phpmyadmin. You should see a dbase created and the first line of data in there:

clip_image003

I know this is a high-level and quick manual, but this is in fact how I got it running without any prior knowledge of MySQL / PHP and so on.

Now that we have 1 line in the dbase (mine has more in there but I have already automated the process) we need to schedule the command to poll the web service on a regular basis.

This took me quite some time to figure out, so if you made it this far I’ll save you the time of figuring it out yourself.

High level steps:

  • Create an extra line in the cron
  • Restart the cron

Create an extra line in the cron

The cron is basically the Linux equivalent of task scheduler. You need to put in different parameters and Linux will run the command for you on a regular interval. More info can be found here: http://en.wikipedia.org/wiki/Cron

On the Synology the cron is located in folder: /etc and the file is called crontab.

It should look like this:

clip_image004

Some considerations which had me searching for quite a while to find out why it wasn’t working:

  • Make sure to separate the columns by TABS instead of spaces.
  • Always use the root user or the job will be deleted when the NAS reboots.
  • Update the path of your PHP binary. Normally on a Synology it’s located in /usr/bin/php.
  • Update the path of your PHP scripts. Mine are stored in /volume1/web/nestgraph/insert.php.
  • Verify your command by running it in a PuTTY session, like: “php /volume1/web/nestgraph/insert.php”. You will not receive a visual confirmation that everything went successfully, but you will notice an extra line in your MySQL dbase.

So… You wait patiently to see the data coming into your dbase. But nothing is happening. You check again a while later. Nope. I can still see myself struggling with this. It turned out you still need to restart the cron service on your Linux distro. All so complicated for us Windows guys… sigh…

To do this on DSM 4X run:

  • /usr/syno/etc/rc.d/S04crond.sh stop
  • /usr/syno/etc/rc.d/S04crond.sh start

To do this on DSM 5 run:

  • /usr/syno/sbin/synoservicectl --restart crond

Do this in a putty session as these are both Linux related commands:

clip_image005

After this the data should be flowing in…

Still with me?

Next thing we need to do is get the data queried out of our MySQL dbase and into a property bag to pass on to SCOM.

Check out part 2 (link in top of blog post)

Monitor your home with SCOM…

So I completely refurbished my house recently…. After getting all the painting done and such it’s now time to focus on the toys which we all love in our home. More and more devices are becoming internet connected these days giving you control and surveillance over your house.  A lot of apps on my phone keep track of different things such as surveillance, heating, power consumption, water consumption, smoke detectors…

But they are all different apps…

clip_image001

I have a SCOM lab running at home (OK, not everybody has a full-blown SCOM lab at home, but maybe you can reuse some of my features in your setup). So during one of my scarce holidays it got me thinking: wouldn’t it be cool to have all this data in SCOM? There’s little added value, because all the programs that came with the products are written specifically to deliver the data of that device to your phone, but hey, I like a challenge.

This blog post will be a placeholder to link all the different aspects of my home which I covered in SCOM. It’s a nice showcase to show the versatility of SCOM and the different ways to get nonstandard data in.

First up…

nest

Nest Thermostat monitoring pack

Second in line: Flukso energy meter

6pos_3phase3.png

Flukso Energy Meter monitoring pack

 

System Center Universe 2014 Houston session video online

 

Wow, I can’t believe it has already been 3 weeks since I spoke at System Center Universe 2014 in Houston. The topic of my session was all about connecting your on-premise SCOM environment with your Azure cloud.

BfQYWpCIQAAZ3At

The event was a cool experience and a great chance to catch up with a lot of people and meet new ones while we were at it. We had some great talks about our beloved technology and how we see it developing in the next year.

My Session video is available here:

System Center Universe 2014 Dieter Wijckmans

 

In addition to the video above, I’m including some links to articles with more info on how to connect your on-premise environment with your Azure cloud:

Introduction and install of a Scom Azure Gateway server (Cameron Fuller): http://blogs.catapultsystems.com/cfuller/archive/2013/12/04/operations-manager-and-azure-better-together-introducing-the-azure-monitoring-gateway-[sysctr-scom-azure].aspx

Configure Site to site vpn: http://www.windowsazure.com/en-us/documentation/articles/virtual-networks-create-site-to-site-cross-premises-connectivity/

Configure point to site vpn: http://msdn.microsoft.com/en-us/library/windowsazure/dn133792.aspx

Configure the Scom Azure management pack: http://blogs.technet.com/b/cbernier/archive/2013/10/23/monitoring-windows-azure-with-system-center-operations-manager-2012-get-me-started.aspx

SCOM: Troubleshoot SQL version is unsupported while upgrading

 

Upgrading SCOM in a high-security environment can be challenging from time to time. You actually need a lot of rights (some of them just temporary of course) to get things installed.

Besides the user rights needed by the user you launch the install with ( http://technet.microsoft.com/en-us/library/hh212808.aspx ), you’ll need access to SQL, the machines where you want to perform the upgrade,…

Getting all this access can be challenging and adds an extra layer of complexity to the upgrade.

However I came across a common error message during installation which had a rather uncommon cause…

Problem:

"The installed version of SQL Server is not supported. Verify that the computer and installed version of SQL Server meet the minimum requirements for installation. Please see the Supported Configurations document for further information"

(Of course I did not take a screenshot in the heat of the moment but this is the actual message on the prereq checker and a good reference for search engines)

Solution:

This can have many causes, but the most common is that you in fact do not have a supported version installed. Check and double-check whether your SQL is up to spec. To find out which versions are supported, check here: http://technet.microsoft.com/en-us/library/dn249696.aspx
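If you want to quickly double-check from the management server which version SQL actually reports (and at the same time verify that your account can reach and query the instance), a minimal sketch like the one below will do. The server and instance name are placeholders; this uses the plain .NET SQL client, so nothing extra needs to be installed:

[xml]
# Placeholder server\instance: replace with the SQL server hosting the OperationsManager dbase
$sqlServer = "SQLSERVER01\INSTANCE01"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = "Server=$sqlServer;Database=master;Integrated Security=True"
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('ProductLevel')"
$reader = $command.ExecuteReader()
while ($reader.Read()) {
    # Prints the version number and service pack level the prereq checker would evaluate
    Write-Host ("SQL version: {0} ({1})" -f $reader.GetValue(0), $reader.GetValue(1))
}
$connection.Close()
[/xml]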

If you are 100% sure that your version is correct chances are that SCOM prereq can’t check the version and just throws this message. Make sure you have proper firewall ports open if there’s a firewall in between the SQL and management server. Check this article for pointers: http://geekswithblogs.net/mattjgilbert/archive/2013/02/15/scom-2012—the-installed-version-of-sql-server-is.aspx

If this is OK, check the rights on the SQL server. You need to have enough rights to check the SQL version and be able to create a new dbase. A tip here is to use the SDK account, as this probably has sufficient rights to do so.

Actually I’m running an upgrade from SP1 to R2 so the version should be ok no?

If all this fails (or even sooner), it’s time to look at the log files. As soon as SCOM starts its install, it creates log files to document its progress.

This post easily documents where to find these log files: http://www.systemcentercentral.com/opsmgr-2012-my-installation-failed-what-log-files-do-i-read-and-where-can-i-find-them/

For reference, in case the link above becomes broken:

“Logs are located in the %LocalAppData%\SCOM\Logs directory for the account under which installation was run. On a default installation on Windows 2008 R2, that would be c:\users\<UserName>\Appdata\Local\SCOM\Logs where <UserName> is the name of the account you used to run the installation.”
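As a small convenience, this is how you can jump straight to the most recent setup log from a PowerShell prompt on the machine where you ran the installation (assuming the default path quoted above):

[xml]
# Show the most recently written SCOM setup log for the account that ran the installation
Get-ChildItem "$env:LOCALAPPDATA\SCOM\Logs" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 FullName, LastWriteTime
[/xml]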

So in my particular case I opened the log file and found the issue below:

printscreen-0000 7-02-2014 

The lines in the above excerpt  “Exception Error code: 0x80070005, Exception.Message: Access to the path is denied.” and “Error: Could not create the directories for the specified DB Path”  caught my attention.

After checking, I found out I did not have access to the path where the dbase files are stored, neither with the SDK account nor with my own account. After I got access to the path, the installation of the R2 upgrade went without an issue.

As this is not documented (to my knowledge) and not a common issue I’ve encountered during my many SCOM installs / upgrades, I documented it here on my blog hoping it will save you some troubleshooting time.

Event 20052 Certificate subjectname does not match local computer name

 

Well… One of the things which really divides the SCOM admins from the normal SCOM users is, in my belief, installing a gateway server. A lot of things can go wrong when installing one, and even if you have done a couple of installs it sometimes still goes haywire. It’s basically one shot or start all over again, in my opinion. I’ve spoken about the Azure gateway management server install at System Center Universe 2014 in Houston and got a lot of feedback after my session that this is indeed the case.

During a recent install at a customer’s site I came across another great event ID:

Event 20052

The full description of the alert (for search engine purposes)

“The specified certificate could not be loaded because the Subject name on the certificate does not match the local computer name

Certificate Subject name: servername.domain.local

Computer name: servername”

clip_image002

So the certificate was created for the full FQDN, but our gateway server is in fact not part of the domain.

Solution

By adding the DNS suffix to the computer name the certificate can be configured.

Open the computer properties of the server by right clicking ‘this pc’ and opening properties:

Select Change settings:

clip_image004

Click Change on the computer name tab:

clip_image006

Click the button More…

clip_image008

Fill in the domain suffix which was documented in the event as the primary DNS suffix and reboot the machine.

clip_image010

This solved my issue and the alert did not return.
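If you prefer to script this instead of clicking through the GUI, the sketch below should do the same thing. Consider it a hedged alternative: it assumes the primary DNS suffix lives in the usual Tcpip parameters registry key, so test it on a non-production box first.

[xml]
# Set the primary DNS suffix via the registry (assumption: standard Tcpip\Parameters location)
$suffix = "domain.local"   # use the suffix shown in the 20052 event
$key = "HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters"
Set-ItemProperty -Path $key -Name "NV Domain" -Value $suffix
Set-ItemProperty -Path $key -Name "Domain" -Value $suffix
# A reboot is still required before the certificate check picks up the new name
Restart-Computer -Confirm
[/xml]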

Any other suggestions are welcome of course…
