During my session on how to prepare yourself I showcased some tips and tricks that will make your life much easier when you upgrade your existing SCOM 2007 R2 installation to SCOM 2012.
One of the tricks I mentioned was to run the SCOM 2012 web console in its own application pool. During the upgrade of your web console the application pool will be removed and you can only choose the default application pool to install the website, which is in my opinion not a best practice.
So before installing the Web Console on your webserver perform the following tasks in IIS:
Open the IIS manager on your machine
Right click Sites and choose “Add Web Site…”
Fill in the details:
Note: At this point you need to choose a port other than the default 51908. You can change this back again after the upgrade.
The site is up and running.
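For reference, the same temporary site can be created from the command line with appcmd instead of clicking through the IIS manager. This is only a sketch: the pool/site name “OpsMgrWebConsoleTemp”, port 51909 and the physical path are example values of my own, not the values from the wizard.

```powershell
# Sketch: create a dedicated app pool and a temporary site for the web console.
# "OpsMgrWebConsoleTemp", port 51909 and the path are example values.
$appcmd = "$env:windir\system32\inetsrv\appcmd.exe"
& $appcmd add apppool /name:"OpsMgrWebConsoleTemp"
& $appcmd add site /name:"OpsMgrWebConsoleTemp" /bindings:"http/*:51909:" /physicalPath:"C:\inetpub\OpsMgrWebConsoleTemp"
& $appcmd set app "OpsMgrWebConsoleTemp/" /applicationPool:"OpsMgrWebConsoleTemp"
```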
During the pre-installation wizard you’ll get to the following dialog at one point to choose your application pool. Here’s the reason why we can’t reuse the “old” site:
The Operations Manager WebConsole site will be deleted during the upgrade, so the installer defaults to the default application pool.
Select the newly created website and continue.
After the upgrade you’ll notice that the old site has been removed. At this point you can edit the binding of the new website back to the default port to keep things transparent for your users.
More tips to follow so stay tuned!
In SCOM 2007 it’s possible to fill in custom fields with rules, like you did in MOM 2005, as explained here: http://scug.be/blogs/dieter/archive/2011/05/13/scom-2007-custom-alert-fields.aspx
However this is not possible with monitors because there’s a fundamental change in how the alerts are created. In rules the GenerateAlert module is used to create the alert, and in this module it’s possible to pass extra data such as the custom fields. In monitors alert creation is slightly different: the alert is generated with parameters in the monitor itself, so it’s not possible to pass extra data.
For a client I’m migrating MOM 2005 to SCOM 2007 R2 and of course I would like to take advantage of the fact that I can create monitors instead of rules. My client has a mainframe-based problem management system (it could be any other system without a connector) which uses a mail scrubber to read incoming mails and scan for specific keywords to create tickets.
The specific keywords were passed in MOM through the custom fields. This is also possible in SCOM, but only by using rules and not monitors. A solution could be to create a monitor and a separate alert generating rule for that monitor. This solves our issue but is not manageable, because if things change you have to update both the monitor and the rule to make sure they reflect the new situation.
Therefore I came up with another solution. Because there are only 15 possible combinations of keywords at my client, I chose to use the subscription / notification channel to insert the keywords into the database before sending the alert to the problem management system. I could have just passed the parameters in the mail, but I prefer to update the database as well so the changes are also reflected in the alert.
I’ve based my script on the script I used earlier, featured here: http://scug.be/blogs/dieter/archive/2011/05/11/scom-dump-alerts-to-text-file-and-mail.aspx
The main difference with the script above is that instead of reading the custom fields out of the database, I pass them in through the notification channel. This keeps the keywords centrally manageable when they change.
As mentioned above I mostly reused the script of my previous blog post, but for the record I’ll explain the script here once more:
First of all. You can download the script here: http://scug.be/members/DieterWijckmans/files/create_customfields_monitors.zip.aspx
The main difference with the previous script is that we are not reading the data out of the database (the $_.customfield fields) but inserting data into the database through parameters passed to the script.
Parameters: The “param” statement needs to be on the first line of the script. In my case I’m reading 3 parameters: the alertID (which is mandatory for the script), the Problemtype and the Objecttype.
The last 2 fields will be inserted into the custom field columns in the database and are needed by the third-party problem management solution to make the proper escalation.
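As a sketch, the first line of the script looks like this (the parameter names after the alertID are the ones I picked; adapt them to your own keywords). In the command notification channel the alert ID is typically passed with the $Data/Context/DataItem/AlertId$ variable.

```powershell
# Must be the very first statement of the script.
# $alertID comes from the notification channel; $problemtype and
# $objecttype end up in the custom fields for the problem management system.
param($alertID, $problemtype, $objecttype)
```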
RMS: Read the RMS server name of your environment. If you are using a clustered RMS it’s better to fill in the name of the cluster and comment out the automatic retrieval of the name to avoid problems.
Resolution state: The resolution state needs to be defined here and also in the SCOM environment (for more details on how to configure this in SCOM check here: http://scug.be/blogs/dieter/archive/2011/05/11/scom-create-custom-alert-resolution-states.aspx).
Loading the SCOM cmdlet
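The usual pattern for loading the OpsMgr 2007 snap-in and connecting to the RMS looks like this (a sketch; $RMS is the server name read in the previous step):

```powershell
# Load the OpsMgr 2007 snap-in and connect to the management group.
Add-PSSnapin "Microsoft.EnterpriseManagement.OperationsManager.Client"
Set-Location "OperationsManagerMonitoring::"
New-ManagementGroupConnection -ConnectionString:$RMS
Set-Location $RMS
```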
Culture info: To make sure that the date format is correct you need to fill in the localization. In my case it’s nl-BE.
Read in alert + fill in custom fields: The alertID passed as a parameter is read in here and the alert data is retrieved from the database. The custom fields required by the problem management system are filled in and updated in the database. Technically there’s no obligation to write the fields to the database, but to make sure the custom fields are filled in when you open the alert in the console I update the alert anyway.
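A minimal sketch of this step; the field numbers, the resolution state value and the comment text are examples, not the exact values from my script:

```powershell
# Fetch the alert by the ID we received and stamp the custom fields.
$alert = Get-Alert -Id $alertID
$alert.CustomField1 = $problemtype
$alert.CustomField2 = $objecttype
$alert.ResolutionState = 15   # example: your custom resolution state
$alert.Update("Custom fields filled in by the notification script")
```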
Note that I needed to modify the date format here to reflect the localization. All the data is dumped to a file which is kept for future reference. The file path in yellow can be changed to reflect your location.
Mailing section:
The file is mailed to the problem management system, or in case an error occurred the SCOM admin is alerted. Make sure you fill in the OK recipient, the NOK recipient and the SMTP server to send out the mail.
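A sketch of the mailing section using System.Net.Mail; all the addresses, the server name and $filepath are placeholders you need to fill in:

```powershell
# Mail the dump file to the problem management system (placeholder values).
$SMTPserver  = "smtp.yourdomain.local"
$OKrecipient = "problemmgmt@yourdomain.local"
$mail = New-Object System.Net.Mail.MailMessage("scom@yourdomain.local", $OKrecipient)
$mail.Subject = "New problem ticket"
$mail.Attachments.Add((New-Object System.Net.Mail.Attachment($filepath)))
$smtp = New-Object System.Net.Mail.SmtpClient($SMTPserver)
$smtp.Send($mail)
```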
Last but not least we write an event to the event log stating whether the operation was successful or not. This gives us the opportunity to monitor the problem creation script from within SCOM.
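A sketch of that last step; the source name and event ID are examples (the source has to be registered once before you can write to it):

```powershell
# Write the result to the Application log so SCOM can monitor the script.
$source = "ProblemCreationScript"   # example source name
if (-not [System.Diagnostics.EventLog]::SourceExists($source)) {
    [System.Diagnostics.EventLog]::CreateEventSource($source, "Application")
}
Write-EventLog -LogName Application -Source $source -EntryType Information `
    -EventId 1000 -Message "Problem ticket created for alert $alertID"
```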
This solution works for me because I have a limited number of possible combinations.
A couple of things you need to configure before this script can be used in production:
The script must be run on the RMS (if it’s a clustered RMS make sure that the script is on both nodes in the same location).
Note: If you want to use more parameters or different names you have to change the following things:
There are 10 custom fields available in the database, so you can pass up to 10 parameters into the script and thus into the custom fields.
If you have remarks or questions regarding the script please do not hesitate to drop me a line or contact me on Twitter: http://twitter.com/#!/dieterwijckmans
SCOM is a great product, but from time to time you need a custom-built tool or script to do just that one thing, or change just that one bit, that’s not possible in the SCOM console.
I’m personally a huge fan of the PowerShell cmdlets supplied with SCOM. For most tasks (whether automating or extending SCOM) they do the trick quickly and easily.
From time to time a tool passes by on the web that fills a gap and makes our lives as SCOM admins easier.
Yesterday another of these fine tools emerged: http://systemcentercentral.com/forums/tabid/60/indexid/88501/default.aspx?tag=
Note: You need to register to download the tool.
This is the first version of a nice tool to count the instances per management group, which can be helpful when troubleshooting your environment. The PowerShell script posted in the community a while back sometimes took 3 hours to complete the task, while this nice .NET program takes minutes…
You need .NET Framework 4 to run the tool.
Keep an eye on the topic because I’m sure it will progress in the coming days, as the authors mentioned in the topic itself.
I came across an excellent list, written by Sonda (an MPFE in the UK), of all the resources you’ll need to get up to speed with SCOM 2007 R2.
There’s plenty of info for the absolute rookie and the novice.
You can find the exhaustive list here:
If you are looking for a similar list of links to resources for Service Manager, look no further. Kurt Van Hoecke created a nice list of all the resources you’ll need to get going.
Happy reading
Yesterday Microsoft released Cumulative Update 5 (CU5) for SCOM 2007 R2.
This new update contains some additional fixes for Operations Manager 2007 R2, and support is added for Red Hat Enterprise Linux 6.
You can download the CU5 package (948.0 MB) here: http://www.microsoft.com/download/en/details.aspx?id=26938
The KB2495674 article apparently is not online yet, but it will be available here: http://support.microsoft.com/kb/2495674
For instructions on how to install a CU package in general (the blog is written for CU4 but the best practices can be followed for the installation of CU5 as well) you can check this: http://blogs.technet.com/b/kevinholman/archive/2011/02/01/opsmgr-2007-r2-cu4-rollup-hotfix-ships-and-my-experience-installing-it.aspx
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
The next step in our backup process is to take a backup of our unsealed management packs, to make sure we don’t lose all the customization we’ve made to the environment.
First a little bit of explanation about the difference between sealed and unsealed management packs. At my clients I sometimes see some misunderstanding about these 2 sorts of management packs.
Difference between sealed management packs and unsealed management packs.
The difference is rather simple. All the management packs you download from vendors such as Microsoft, Dell, HP,… are sealed ones. They have been developed by the vendors and sealed to prevent any further customization. These management packs often include a variety of rules, monitors, views and even reports which are installed when you import the management pack.
All the management packs you create yourself are by default unsealed. In these you store all your customizations, such as overrides on the sealed management packs, custom reports, custom-made rules, custom-made monitors…
Notice the word “custom”… In my book the word “custom” means a lot of time and effort was spent creating them… You don’t want to lose them in case of a disaster!
So how do we back these up? There are basically 2 ways: manually or automated.
If you have one unsealed management pack that you want to back up, or you want to quickly back one up while working in the console, you can use the following method:
Open the console and navigate to Administration > Management Packs > select the management pack you wish to back up, right click and choose Export Management Pack…
Select a location for your backup:
Click OK and your management pack is exported successfully.
If you check your location you’ll see the management pack in XML format.
While the above method works like a charm for a quick backup before changing something in a management pack, it’s not workable and becomes a hassle when you want to back up several management packs. Not to forget the human factor… you have to remember to take the backups of your management packs…
Therefore the preferred way is to automate the backup with a PowerShell script.
Microsoft has actually given us the proper tools to do so in the PowerShell cmdlet set for SCOM.
The command to use:
$mps = Get-ManagementPack | Where-Object {$_.Sealed -eq $false}
foreach ($mp in $mps)
{
    Export-ManagementPack -ManagementPack $mp -Path "C:\Backups"
}
There are basically 2 approaches to automate this: SCOM scheduled rule or Scheduled tasks on the RMS itself
You can make your choice based on this nice discussion to compare the 2: http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexId/56798/Default.aspx
I chose to use the scheduled task method to avoid the extra (although minimal) load on the server, and created a management pack to monitor the process.
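Creating the scheduled task itself can be done with schtasks; a sketch where the task name, the script path and the schedule are example values:

```powershell
# Run the unsealed MP export script every day at 06:00 (example values).
schtasks /create /tn "SCOM unsealed MP backup" `
    /tr "powershell.exe -File C:\Scripts\Backup-UnsealedMPs.ps1" `
    /sc daily /st 06:00
```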
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
There are 2 versions of IIS supported in SCOM and they both need a different backup approach.
IIS6 is normally used if you are running SCOM 2007 R2 on a Windows 2003 platform. It is used to support the components of the web console and SQL Server Reporting Services. If you are using SQL Server Reporting Services from SQL 2008, IIS is not used anymore and it is sufficient to just back up the databases.
IIS7 is normally used if you are running SCOM 2007 R2 on a Windows 2008 platform (it can also run on a Windows 2003 server but is not installed by default). The approach to back up IIS7 is somewhat different as it stores its data differently. The files you need to back up are the web.config files and the applicationhost.config files. For more info on how to back up IIS7 you can read this nice reference: http://blogs.iis.net/bills/archive/2008/03/24/how-to-backup-restore-iis7-configuration.aspx
So let’s start with the backup, shall we?
Connect to the RMS and navigate to Start > administrative Tools > Internet Information Services (IIS) Manager
Right click the server name and navigate to “All Tasks” > “Backup/restore Configuration…”
The configuration backup dialog box will come up.
Fill in a backup name and if you want to make a secure backup you can tick the box “Encrypt Backup using password” and supply a password for the backup.
Click OK and your backup is made…
The actual backup file is stored in “%systemroot%\system32\inetsrv\MetaBack”
I suggest you keep a copy of this file somewhere else in case your RMS is unrecoverable. Otherwise the file, which resides on the server, will also be gone, and what’s the point in backing up then?
Open an elevated prompt on your RMS
Navigate to %windir%\system32\inetsrv
Run the command “appcmd add backup “name of your backup set””. If you do not specify a name, one will be generated with the date and time.
If you are using IIS7 your life just became a little bit easier. If you are using Vista SP1 or later / Windows Server 2008, backups are made automatically once you create an initial backup as shown. IIS automatically makes a history snapshot of ApplicationHost.config each time it detects a change, so you can easily restore a prior version. By default it keeps 10 prior versions and checks every 2 minutes.
The files are stored in “%systemdrive%\inetpub\history”
Pretty cool feature if you ask me and a big improvement from the previous version.
To enumerate a list of backups and configuration history files, use the following command:
“%windir%\system32\inetsrv\appcmd.exe list backup”
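Restoring a backup works the same way; a sketch with an example backup name:

```powershell
# Restore a previously made IIS7 configuration backup by name (example name).
& "$env:windir\system32\inetsrv\appcmd.exe" restore backup "MyIISBackup"
```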
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
One of the key factors in a successful restore of your environment is the SCOM encryption key.
This encryption key is used to encrypt the data stored in the Operations Manager database, ensuring the data remains confidential. The RMS uses this key to read and write data to the Operations Manager database.
Pretty severe actually: if you don’t have the key you can’t establish a connection from a freshly installed RMS to your existing Operations Manager database, and therefore you lose all your settings and customizations and have to start all over again.
Please note that it’s a best practice to take this backup once after the installation of the environment and after ANY change to the RunAs accounts in the environment.
So how do you back this key up in case Murphy pays you a visit?
There are actually 2 ways: GUI or command line
Log on to your RMS with an account with admin privileges
Open an elevated command prompt and navigate to your Operations Manager install folder. In this case I kept it at the default, so c:\program files\system center operations manager 2007\
Note: SecureStorageBackup.exe is only installed if you have installed a console on your RMS. If not, you need to copy the SecureStorageBackup.exe file from the SupportTools folder on the installation media.
The Encryption Key Backup or Restore Wizard pops up:
Click continue and select Backup the Encryption key.
A dialog box will appear to save your .bin file. Best practice is not to save the file on the RMS. This makes perfect sense, because you’ll need the file when there’s an issue with your RMS, so there’s a big chance you would not be able to reach it.
I always save it on my file server and keep an extra copy somewhere else just to be safe. As soon as you have exported the key you can make a copy of the .bin file and store it in two different locations.
So the location is set, let’s continue.
Fill in a password to secure the backup .bin file. Make sure you remember the password, because at some point you’ll need it to restore the key.
It will take no more than a few seconds to back up the key, and if all goes well a nice completion message appears.
Log on to your RMS with an account with admin privileges
Open an elevated command prompt and navigate to your Operations Manager install folder. In this case I kept it at the default, so c:\program files\system center operations manager 2007\
Run securestoragebackup.exe /? to get the syntax of the command.
The command used: securestoragebackup backup <filename>
You need to supply the password twice
and the second time
And the key was successfully backed up.
The downside is that you cannot automate this process without further scripting, because you need to type in a password. It would be nice if the exe had an option to supply the password as a parameter, but maybe in another release.
Recently I was working on a migration from MOM 2005 towards SCOM 2007 R2. Unfortunately the MOM2005-to-SCOM2007 migration tool which was included in the SCOM 2007 install media was not working anymore, so I had to convert the management packs manually.
This post is mainly for my own reference whenever I need it again, but since I did not find many good write-ups in my search on the web, I wrote one of my own.
I needed a Microsoft BizTalk 2004 management pack in SCOM 2007. Unfortunately Microsoft never released a SCOM 2007 management pack, only a MOM 2005 one.
So I had to convert it and load it into SCOM.
I’ve downloaded the management pack here: http://www.microsoft.com/download/en/details.aspx?id=14417
It’s an exe file which you need to unzip to a folder of your choice. In this example I’m going to put it in c:\managementpacks.
The file you need for the initial conversion is “c:\managementpacks\microsoft biztalk server 2004 management pack\Microsoft Biztalk Server 2004.AKM”
Next we need to convert the management pack to the valid XML file format which is used by SCOM 2007.
You need MP2XML.exe to perform this conversion. It’s part of the MOM 2005 Resource Kit.
The syntax: C:\program files\microsoft operations manager resource kit\tools\convert management packs to xml\mp2xml.exe "inputfile".akm "outputfile".xml
So in this case: C:\program files\microsoft operations manager resource kit\tools\convert management packs to xml\mp2xml.exe "c:\managementpacks\microsoft biztalk server 2004.akm" "c:\managementpacks\microsoft_biztalk_server_2004.xml"
Note the underscores in the naming. Spaces are not allowed in the name of the XML file.
The next thing we need to do is convert the actual XML file to the SCOM 2007 format.
This is achieved with mpconvert.exe, which resides on the RMS in c:\program files\System Center Operations Manager 2007\mpconvert.exe
The syntax in our example is: c:\program files\system center operations manager 2007\mpconvert.exe "c:\managementpacks\microsoft_biztalk_server_2004.xml" "c:\managementpacks\microsoft_biztalk_server_2004_converted.xml"
Now you can load the xml in SCOM like you load any other management pack.
There was an error with this particular management pack resulting in a failure of the import. It turned out there was an issue with the XML.
When you try to import the management pack you get the following error:
XSD verification failed for management pack. The ‘Name’ element is invalid. This is an issue in the XML itself: a parameter which needs to be present in the XML is either corrupt or missing.
I’ve installed XML Notepad 2007 on my machine to check the XML file.
You can download this nice tool here: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=7973
Now if you look in the XML and browse down to the Manifest > Name field, you’ll immediately notice that the field is empty:
You can just fill in a name at the right like I did below:
Save the file and try to import the management pack again.
This time it works and lets you continue with the import.
In this example a standard MP from Microsoft without any customization was used. However, during my migration I had a lot of MPs which were full of custom rules.
I’ll create a PowerShell script to automate this process and post it as a follow-up on this blog.
If you have any questions please feel free to leave a comment or shoot me a mail.
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
In order to keep the possibility to restore your SCOM environment in case of a disaster, you need to make sure that you have a good backup of your databases.
Your databases keep track of all the info in your environment, so it’s crucial that you don’t lose these valuable assets.
A good backup strategy for your SQL databases is crucial, as you need to make sure that you always have a recent copy at hand.
If you have a backup admin in your environment and are using a backup product like Data Protection Manager, it’s best to meet with the admin to check how the SQL backup schedule is configured. If he’s confident that he provides your backups, there’s no need to back them up twice…
If not you’ll need to perform the backup yourself.
First of all, a small word about the different options you have to back up a SQL database (this applies to all SQL databases and is not SCOM specific).
A good mix of the 3 methods above is a good strategy. I always take a full backup of the operations database once a week (data warehouse once a month), an incremental backup once a day (data warehouse once a week) and transaction log backups every 2 hours.
Again, you can skip the transaction log backups, but then you risk losing up to 23h59m of data.
So let’s get the backups up and running:
We’re going to create the full backup schedule for Operations dbase in this example:
Connect to the database server.
Open the tree and go to the “OperationsManager” database > right click > Tasks > Back Up…
Choose Options in the left pane and set:
Verify backup when finished
When all the settings are correctly configured, click the Script button at the top of the page and choose “Script Action to Job”.
This will generate a SQL Server Agent job which you can schedule so it fires the backup when needed.
Name the job: in this case “Back up Database – OperationsManager_weekly”.
Choose Schedules in the left pane and select New at the bottom:
Name the schedule and define the frequency + schedule. This will be scheduled in the SQL agent jobs.
When your schedule is made > click ok and the job is created.
You can check this job in the SQL Server Agent under the tree Jobs…
This is the weekly full backup of the OperationsManager database.
You need to complete the same steps to create the backup schedules for your other databases.
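For reference, the job that the wizard generates essentially runs a T-SQL BACKUP DATABASE statement. A sketch of the equivalent command fired through sqlcmd; the server\instance name, the target path and the WITH options are example values:

```powershell
# Full backup of the OperationsManager database via sqlcmd (example values).
$query = "BACKUP DATABASE [OperationsManager] " +
         "TO DISK = N'C:\Backups\OperationsManager_full.bak' " +
         "WITH INIT, CHECKSUM, STATS = 10"
sqlcmd -S "SQLSERVER\INSTANCE" -E -Q $query
```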