In my last post about upgrading vCenter to 6.5, I outlined the steps needed for the migration: the pre-migration checks, the migration itself, and some post-migration tasks. Today we were about to upgrade/migrate one of our oldest and biggest vCenters, which presented some additional pre-migration checks and tasks to consider. These are also described in the VMware documentation, so they shouldn't come as a surprise to anyone who reads it before migrating.
In a previous post I talked about the upgrade of our vSphere environment. The first post described the upgrade and migration of the Platform Services Controller (PSC) from version 6.0 running on Windows to a 6.5 appliance. Our upgrade path is PSC -> vCenter -> Hosts. Our environment consists of several vCenters, so this will take some time, and we might be upgrading hosts in a vCenter that has already been upgraded to 6.5 while others are still on 6.0.
This is part 2 of my vSphere Performance Data series. Part 1 described the project and my goals for it. This post covers my thoughts on retrieving performance data from our vSphere environment. As I described in part 1, our environment consists of 100+ hosts and 4000+ VMs. These are hosted in 3 different vCenters in the same SSO domain (Enhanced Linked Mode). All hosts and vCenters are at version 6.0.
There are lots of posts on retrieving performance data from your vSphere environment (I'll probably draw on several of them in this series), but here's my take on it. My ultimate goal is to build my own database of performance data and have a nice front-end presenting it. I also want an API that extracts data from the performance DB, which I will use in our in-house portals and dashboards.
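Before deciding on a database, it's worth estimating the data volume. vCenter's real-time performance stats arrive at 20-second intervals, so the numbers add up fast across 4000+ VMs. A rough sizing sketch (the metric count per VM here is my own assumption, not a measured figure):

```python
# Real-time stats in vCenter are sampled every 20 seconds.
SAMPLE_INTERVAL_SECONDS = 20

def samples_per_day(vm_count: int, metrics_per_vm: int) -> int:
    """Rough count of individual data points collected per day."""
    samples_per_metric = 24 * 60 * 60 // SAMPLE_INTERVAL_SECONDS  # 4320 per metric per day
    return vm_count * metrics_per_vm * samples_per_metric

# 4000 VMs at an assumed ~10 metrics each: 172.8 million points per day.
print(samples_per_day(4000, 10))
```

Even before adding hosts, that is why the choice of backend and retention/rollup strategy matters so much for this project.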
This month we started our upgrade journey from vSphere 6.0 -> 6.5 in production. We have had 6.5 running in the lab for some time since the release last fall, and we have enjoyed it so far. As part of this upgrade we plan to migrate both PSCs and vCenters to the vCenter appliance, as it now has full functionality (e.g. Update Manager) and scales to the same extent as the Windows version.
Recently I've been messing around with the new .NET Core. I've created a class library which I'm using in an ASP.NET Core Web API. Firstly, at least before 1.1, I couldn't add a reference to the class library just by pointing at the .dll file. I needed to create a NuGet package of my library and publish it to a NuGet feed. I set up a local NuGet feed, added it as a source in VS, and was able to reference my code.
I have recently installed Visual Studio 2017 and with that started to migrate some of my .NET Core projects to the new .csproj format. Check out this post by Steve Gordon for more background on the migration. It looks easy enough with the migration wizard in VS, and the first try on a class library seemed to work just fine. However, when revisiting the project a couple of days after the migration, I got this in VS 2017:
A colleague asked if I had a script lying around that could investigate datastores and find orphaned files. Well, I didn't. But instead of spending a lot of time creating one from scratch, I went to our friend Google and searched. There were lots of results, but I found one from PowerShell guru LucD (lucd.info) and stopped searching. The script and instructions are found here: http://www.lucd.info/2016/09/13/orphaned-files-revisited/ Be aware that the script, or more precisely a VMware VIM method that the script uses, is relatively slow.
Welcome to my blog. I'll keep this short, as a Hello World should be. This blog will be a place for me to post things that I am currently working on and that interest me. If others find it interesting, that's great. Over the years I have benefited from different communities, so if I can give something back, that would be awesome.