Another lab post!
… I love it!
Having assumed the role of a manager this past summer (July 2013), I find my opportunities to really dig into ANYTHING and satisfy my technical cravings are severely limited. There is a lot of travel around the division for customer meetings, team meetings, internal meetings, etc., and hardly any for training. I get that this is part of the gig as a manager, but I must admit that part of me rails against the idea of letting my technical chops slip at all.
The lab setup described in earlier posts (here, here, and here) has actually spent the past year or so in the EMC lab in Columbia, MD, under the care and feeding of my partner in crime, Larry. However, feeling more and more like I am losing my technical abilities (and ultimately my credibility), I decided to retrieve those systems and impress them into use once more as my proving grounds. As a reminder, each of the three systems is built from the following parts:
- Western Digital Caviar Black WD1001FALS 1TB 7200 RPM SATA 3.0Gb/s 3.5″ Internal Hard Drive
- Crucial 16GB (4 x 4GB) 240-Pin DDR3 SDRAM DDR3 1333 (PC3 10600) Desktop Memory
- AMD Phenom II X4 965 Black Edition Deneb 3.4GHz Socket AM3 125W Quad-Core Processor
- Diablotek PHD Series PHD750 750W ATX12V / EPS12V Power Supply
- SAPPHIRE 100293L Radeon HD 5570 1GB 128-bit DDR3 PCI Express 2.1 x16 HDCP Ready Video Card
- MSI 790FX-GD70 AM3 AMD 790FX ATX AMD Motherboard
- Antec Nine Hundred Black Steel ATX Mid Tower Computer Case
The on-board Realtek NIC drivers gave me trouble for a while, but I found this blog post that describes how to use esxcli to install new drivers, and it worked like a charm (clearly something I probably should have learned before now, but there you go).
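For anyone playing along at home, the general shape of that esxcli install looks something like the sketch below; the datastore path and bundle/VIB names are placeholders rather than the exact files I used:

```
# Put the host in maintenance mode before touching drivers
esxcli system maintenanceMode set --enable true

# Install the driver from an offline bundle (zip) copied to a datastore
# -- the path and bundle name are placeholders, not the exact driver I used
esxcli software vib install -d /vmfs/volumes/datastore1/net-driver-offline-bundle.zip

# Or install a single .vib file directly
esxcli software vib install -v /vmfs/volumes/datastore1/net-driver.vib

# Reboot, then take the host out of maintenance mode
esxcli system maintenanceMode set --enable false
```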
I have added a couple of additional drives to each computer: a 100 GB SSD and a 256 GB 15K RPM SAS drive. Each of the systems above now has three tiers of storage: SSD, SAS, and SATA. I have also included an Iomega ix4-200d for network storage of templates, bin files, ISOs, home directories, etc. My plan is to use the Iomega for data at rest, or read activity only… I have tried to use this particular device for active, running VMs, but it doesn't seem to be up to it.
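If I end up presenting the Iomega to the hosts over NFS for that read-mostly role, the mount is a one-liner per host. The IP address, export path, and datastore name below are made up for illustration:

```
# Mount an NFS export from the Iomega as a datastore for templates/ISOs
# (IP, share path, and datastore name are invented for this example)
esxcli storage nfs add --host=192.168.1.50 --share=/nfs/templates --volume-name=iomega-templates

# Confirm the mount
esxcli storage nfs list
```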
Instead, I plan to have all 3 systems boot ESXi off a USB stick and use the 3 internal drives in an implementation of vSphere vSAN. That will give me 48 GB RAM, 12 cores, and 3 TB of storage with SSD acceleration, all without an external shared-storage array! Very cool… I already have vSphere 5.5 installed on all 3 systems, without vSAN, and have tried the vMotion capability between vSphere hosts without shared storage; it works great. I am using the vCenter Server 5.5 appliance for all of this.
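The Web Client does most of the heavy lifting when you enable Virtual SAN on the cluster, but for the curious, the per-host moving parts look roughly like this sketch. The VMkernel interface, cluster UUID, and device IDs are placeholders for whatever your hardware reports:

```
# Tag a VMkernel interface for Virtual SAN traffic (vmk1 is a placeholder)
esxcli vsan network ipv4 add --interface-name vmk1

# On the first host, create the cluster; on the other hosts, join it by UUID
esxcli vsan cluster new
esxcli vsan cluster join --cluster-uuid <uuid-from-first-host>

# Claim one SSD (cache) plus magnetic disks (capacity) on each host
# (the naa.* identifiers are placeholders for the real device IDs)
esxcli vsan storage add --ssd naa.xxxxxxxx --disks naa.yyyyyyyy

# Sanity check
esxcli vsan cluster get
```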
Once I have all the cluster stuff settled (HA / DRS / vSAN), my ultimate goal is to get the whole rig installed and running with Cloud Foundry… More on that later.