Finally Got it Right…
For those of you who have been following my progress (thanks for sticking around), there have been some changes to the home lab setup. You may recall, in my post last fall – Home Lab, Part 2 – I was having some network issues. I enlisted the assistance of one of my local Cisco Systems Engineers to troubleshoot (Jeff Manning – Jeff, if you’re out there, thanks, man!). Unfortunately, our initial round didn’t go anywhere. But he was generous enough to help me install a Cisco 3750 for testing purposes… (I had serious questions about the little Dell PowerConnect I was using, but mostly it just wasn’t sophisticated enough to properly create and manage VLANs and give me proper diagnostics). Through lots of cable swapping, port swapping, and multiple reconfigurations of my ESX vSwitches, I was finally able to determine that several of my NICs were bad… I guess storing them in my attic turned out to be a bad idea, since a couple of them apparently got baked.
So, I turned once again to NewEgg and purchased 6 new Intel PRO/1000 PT PCI Express adapters, putting two in each of my servers. I now have four (4) good, solid 1 GbE ports on each ESX host, and a proper switch for VLAN configurations… Lo and behold, my rig finally screams! So, the lab is now as follows:
The lab consists of three systems plus 2 NAS boxes, running as an ESX cluster on vSphere 4.1. The complete gear list is below – each ESX system is built as follows:
- Western Digital Caviar Black WD1001FALS 1TB 7200 RPM SATA 3.0Gb/s 3.5″ Internal Hard Drive
- Crucial 16GB (4 x 4GB) 240-Pin DDR3 SDRAM DDR3 1333 (PC3 10600) Desktop Memory
- AMD Phenom II X4 965 Black Edition Deneb 3.4GHz Socket AM3 125W Quad-Core Processor
- Diablotek PHD Series PHD750 750W ATX12V / EPS12V Power Supply
- SAPPHIRE 100293L Radeon HD 5570 1GB 128-bit DDR3 PCI Express 2.1 x16 HDCP Ready Video Card
- MSI 790FX-GD70 AM3 AMD 790FX ATX AMD Motherboard
- Antec Nine Hundred Black Steel ATX Mid Tower Computer Case
- 2x Intel PRO/1000 PT dual-port GbE NICs
In addition to the 3 ESX Servers, the following were used for storage and connectivity:
- Thecus N4100 Pro NAS (purchased a while ago)
- IOMEGA IX4 StorCenter
- Cisco Catalyst 3750 Ethernet switch
VMware vSphere Configuration
Here we see the three ESX servers configured in a VMware High Availability cluster with DRS enabled to automatically apply Priority 1–4 recommendations. Other than the Windows PC used to manage the environment and the ESX servers themselves, all workloads run as virtual machines – including VMware vCenter, the Windows Active Directory domain controller, and the Microsoft SQL Server 2008 instance that provides database services for the environment (vCenter among them).
vSwitch Configuration
Each of the four (4) NIC ports on each ESX server is dedicated to its own vSwitch (a sketch of the service console commands behind this layout follows the list):
- vSwitch0 – VLAN 10 (native VLAN); virtual machine traffic and Service Console
- vSwitch1 – VLAN 172 (storage_net); ‘production’ iSCSI and NFS datastore connectivity
- vSwitch2 – VLAN 10 (native VLAN); NFS & CIFS traffic to ISO / templates / User directories and virtual storage appliances (UBERCelerra VSA)
- vSwitch3 – VLAN 192 (vmotion); dedicated to vMotion
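For anyone who wants to replicate this, here’s a minimal sketch of how one of these vSwitches can be stood up from the ESX 4.1 service console. The vmnic number and VMkernel IP address are examples standing in for my real values, and the VLAN tag on the port group only matters if the physical switch port is an 802.1Q trunk – on an access port you’d skip it:

```
# Create the storage vSwitch and uplink a physical NIC to it
esxcfg-vswitch -a vSwitch1
esxcfg-vswitch -L vmnic1 vSwitch1

# Add the storage port group and tag it for VLAN 172 (trunk ports only)
esxcfg-vswitch -A storage_net vSwitch1
esxcfg-vswitch -v 172 -p storage_net vSwitch1

# VMkernel port for iSCSI/NFS traffic on the storage network (example address)
esxcfg-vmknic -a -i 172.16.1.11 -n 255.255.255.0 storage_net

# Verify the resulting layout
esxcfg-vswitch -l
```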
IP Addressing
Datastores
- bin – NFS datastore mounted from the 4-disk Thecus N4100 Pro; used for ISO files and installation binaries
- esx_local – local VMFS volume on each ESX server, carved from the internal 1 TB SATA drive
- esx-backup – NFS datastore mounted from the 4-disk Thecus N4100 Pro; used for backups, templates, and disk images (.flp files, etc.)
- nfs-prod1 – NFS datastore mounted from the 4-disk IOMEGA IX4; used for production virtual machine disk files (VMDK) – sample mount commands below
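And a quick sketch of how those NFS datastores get mounted from the service console – the NAS addresses and export paths below are placeholders, not my actual values:

```
# Mount the Thecus export as the "bin" datastore (example host and export path)
esxcfg-nas -a -o 172.16.1.20 -s /raid0/data/bin bin

# Mount the Iomega IX4 export used for production VMDKs
esxcfg-nas -a -o 172.16.1.21 -s /nfs/prod1 nfs-prod1

# List NAS mounts to confirm everything attached
esxcfg-nas -l
```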