Last week at TechDays, Fredrik Nilsson and I had a session about managing your workloads in Azure with Chef and PowerShell DSC. Despite sharing a timeslot with Arwidmark, Nystrom and Ben Armstrong, our room was quite full of kung fu-interested geeks 😉
Here is the presentation so you can find the links and info:
I have been playing around with automating the deployment of OMS agents in my lab environment and wanted to share one way to do it; of course there are others.
I found Adam's approach of sending files over PowerShell remoting (WinRM) instead of using a remote share or another way of distributing the installation file. Sometimes you have firewall constraints to take into consideration, and then this is a handy way to get the MMA over and installed if you only have PowerShell remoting open 🙂
If you already have WMF 5 deployed you can skip the Send-File function, as the new PowerShell version can copy files over a remoting session out of the box (Copy-Item with the -ToSession parameter).
And here is the code:
# Send over and install the OMS Agent on remote computers
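The original listing did not survive here, so the following is a minimal sketch of the approach described above, assuming WMF 5's Copy-Item -ToSession (swap in the Send-File function on older hosts), an installer named MMASetup-AMD64.exe, and placeholder server names and workspace values:

```powershell
# Sketch: copy the Microsoft Monitoring Agent installer over PowerShell remoting
# and install it silently, connecting it to an OMS workspace.
$computers    = 'SRV01', 'SRV02'               # hypothetical target servers
$installer    = 'C:\Temp\MMASetup-AMD64.exe'   # local path to the downloaded agent
$workspaceId  = '<your OMS workspace ID>'
$workspaceKey = '<your OMS workspace key>'

foreach ($computer in $computers) {
    $session = New-PSSession -ComputerName $computer

    # WMF 5 only: copy the file over the remoting session (no SMB share needed)
    Copy-Item -Path $installer -Destination 'C:\Temp\' -ToSession $session

    Invoke-Command -Session $session -ScriptBlock {
        param($id, $key)
        # Silent install; the setup properties below follow the agent's
        # documented command-line options at the time of writing
        Start-Process -FilePath 'C:\Temp\MMASetup-AMD64.exe' -Wait -ArgumentList `
            "/Q /C:`"setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_ID=$id OPINSIGHTS_WORKSPACE_KEY=$key AcceptEndUserLicenseAgreement=1`""
    } -ArgumentList $workspaceId, $workspaceKey

    Remove-PSSession $session
}
```

On hosts without WMF 5, replace the Copy-Item line with a call to the Send-File function mentioned above; the rest stays the same.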
They did not help, however, as the error continued. I also tried reinstalling the VMM agents on the hosts in case they were the cause.
Investigating the issue more thoroughly, I noticed some strange things: I could store VMs from the hosts into the VMM library, so transfers with BITS worked in that direction. The problem seemed to be related to the VMM server itself.
Talking with my friend and fellow MVP Daniel Neumann, he sent me a link to another blog post describing what to do if you have deleted the VMM certificate (my certificate was there, but apparently out of sync in some way). Here are the steps to regenerate the VMM certificate:
1. Launch the VMM PowerShell on the Virtual Machine Manager server.
2. Type the following and press Enter:
$credential = Get-Credential
3. Type the username and password of an account that is a local admin on the VMM server.
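The remaining steps of the procedure did not make it into this excerpt. They typically re-register the VMM-managed computers with the credential captured above, which causes the certificates to be regenerated; a hedged sketch, assuming the standard VMM cmdlets Get-SCVMMManagedComputer and Register-SCVMMManagedComputer:

```powershell
# Run in the VMM PowerShell window on the Virtual Machine Manager server.
# $credential was captured in step 2/3 above (a local admin on the VMM server).
$credential = Get-Credential

# Re-register every managed computer; this re-associates the agents
# and regenerates the VMM certificates
Get-SCVMMManagedComputer | Register-SCVMMManagedComputer -Credential $credential
```

Check the blog post linked above for the exact, authoritative steps for your VMM version.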
I have helped two customers move their System Center VMM 2012 R2 servers to a Hyper-V VM. Instead of carrying legacy stuff along, we installed a new Generation 2 VM in Hyper-V with Windows Server 2012 R2.
Easier said than done… or?
So what went wrong at both customers and how did I solve it?
We copied the library and the database backup from the old server, shut that one down, started the new one, joined it to the domain and installed the VMM server.
We patched it to UR7 with Windows Update, and after that we restored the database from the old system with the binary scvmmrecover.exe -path <db-backupfile>.
After that I started the console, and while checking properties the console crashed and the service produced a dump:
Looking at the dump, I could see that not everything was right with the database (the old VMM server was patched to UR7 before I did the database backup). Based on the log file, something was missing in the restored database…
So how did I solve it? I uninstalled UR7 on the VMM server and then reinstalled it, and voilà: no more crashes!
Today Microsoft announced a change in the MVP program: there are now 10 award categories encompassing the old ones (although a few remaining Power User categories that are neither IT Pro nor Developer are left unchanged).
This means that I (Mr Hyper-V) am now part of Cloud and Datacenter Management. Together with more than 350 other MVPs, we make quite a large group of community contributors. I have already been contributing in PowerShell, Azure and lots of other areas during the year, so it feels good to be recognised for that and to be part of a broader contribution community!
And here are the five others:
And here are the unchanged categories:
If you want to read more about this change, please follow this link to the MVP Award site.
As you might have noticed, a “new” backup solution has arrived for Azure Backup: if you check your backup vault in the Azure portal, you can see that the new option for “Application Workloads” has appeared.
Downloading and installing it shows that it has traces of the DPM server:
And it can be installed on Windows Server 2012 R2, although you need .NET 3.5 for the SQL Server 2014 instance (I know, it is crazy!). The SQL Server 2014 license is included in the setup but can only be used for the Azure Backup Server.
So with some PowerShell and an Internet connection I added .NET 3.5 and could continue installing…
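For reference, adding .NET 3.5 on Windows Server 2012 R2 is a one-liner; the -Source parameter is only needed when the feature payload is not on the box (the media path below is a placeholder for your own installation source):

```powershell
# Install .NET Framework 3.5; point -Source at the sources\sxs folder of the
# Windows Server 2012 R2 media if the payload is missing locally
Install-WindowsFeature -Name NET-Framework-Core -Source 'D:\sources\sxs'
```

With an Internet connection and Windows Update reachable, the -Source parameter can usually be dropped entirely.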
In the installation wizard you add your backup recovery vault from Azure, and once the server is installed you will have to install agents on the instances you want to protect. You can install this server in a VM or on a physical box; you will need some storage attached to hold the backups, which are stored locally before they are lifted to the cloud.
When it comes to licensing, I have not found anything other than the documentation on Azure, which lists a price per protected instance. That makes this really interesting: if you do not have System Center and want to start using a backup solution that can protect Hyper-V, Exchange, SQL Server and so on, this becomes a viable option!
Also, if you are looking at the Microsoft Operations Management Suite, of which backup is a part, this new feature makes it even more compelling to start using the suite when you are not already using System Center.
I got a question last week from a customer about whether it is possible or supported to use VMware vSphere with Windows Azure Pack. I have gotten this question a couple of times and wanted to share the info in a quick blog post 🙂
And as you can read on the TechNet site, several of the different roles are supported with WAPack, but not VM Clouds:
“Windows Azure Pack supports using VMWare virtual machines for deploying and administering the web sites service, databases, hosting the management portals, and the management API and web services. Third party extensions, however, are required to enable provisioning of virtual machines from a VMWare hypervisor using Windows Azure Pack.”