Saturday, March 06, 2010

vSphere 4 on GA-EX58-UD3R

If you're looking to put together a vSphere whitebox, the Gigabyte GA-EX58-UD3R motherboard with an Intel Core i7 920 isn't a bad choice. If you're thinking about going down this path - there are a few things worth mentioning... and while there's nothing particularly challenging or "show-stopping" about getting this working properly (except maybe the NIC), reading this post might save you some time.

Networking

You should know that the onboard Realtek RTL8168C NIC doesn't work - vSphere doesn't detect it. Ultimatewhitebox has a note about this, as do some forum posts... and sure enough, they're right. I found a few Intel PRO/1000 MT PCI Desktop Adapters on eBay that work just fine, but if you're buying parts for this - go ahead and get a couple of the Intel NICs now. And you might want to consider disabling the onboard NIC during setup.
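If you want to confirm for yourself what the host actually sees, a couple of commands from the ESX 4 classic service console will do it. This is a sketch (guarded so nothing aborts if you try it on a box without these tools), not output captured from this particular build:

```shell
# Quick sanity checks from the ESX 4 classic service console (a sketch;
# guarded so nothing here aborts if run on a box without these tools).

# esxcfg-nics -l lists every NIC the vmkernel claimed. The onboard
# Realtek RTL8168C simply never appears here; each Intel PRO/1000 MT
# shows up as a vmnicN bound to the e1000 driver.
if command -v esxcfg-nics >/dev/null 2>&1; then
    esxcfg-nics -l
fi

# The unsupported Realtek is still visible as a PCI device -- it just
# has no vmkernel driver claiming it:
if command -v lspci >/dev/null 2>&1; then
    lspci | grep -i ethernet || echo "no ethernet devices listed"
fi

checks_ran=yes   # marker so the sketch degrades gracefully anywhere
```

Seeing the Realtek in lspci but not in esxcfg-nics is the telltale sign that the hardware is fine and it's simply a missing vmkernel driver.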

Optical Drive

The takeaway here - make certain that your optical drive and SATA hard drive are on different channels. Also, if you want to avoid the saga below, consider using a SATA optical drive instead of an old IDE drive that you might have lying around.

Things to avoid...

I was using the eval version of vSphere 4 (4.0.0, 208167) on disc. The disc was detected on boot up, and I was able to launch the vSphere GUI setup and get the installation started.

Initially, I was using an old IDE DVD drive. The boot process went mostly as expected, with one odd exception: sometime after the setup process started, I got a "CDROM Failed to Mount" message. I was still able to get to the point in setup where you create partitions - but there, the installer wasn't seeing my onboard SATA hard drive. So I rebooted, went into the BIOS, and confirmed that the drive was being detected... and happened to notice that the optical drive was on the same channel as the hard drive. After moving them to different channels, I launched the GUI setup again, and this time the SATA hard drive showed up.

From there, the setup process needs to copy some files off the disc... which it suddenly failed to do - no optical drive detected. That seemed kind of strange, given that I had booted off that very drive, but recalling the "CDROM Failed to Mount" message - perhaps not. I went back into the BIOS, changed the mode the optical drive was in, and rebooted - but setup again failed to detect the disc at the create-partitions window. Finally, I swapped in a SATA DVD writer from a different machine and launched the GUI setup once more: no "CDROM Failed to Mount" message this time, the SATA hard drive was detected, partitions were created, and setup completed without incident.

Overall, not too challenging for a Friday night... and now I have a vSphere host in my lab to start playing with.

Setup
vSphere 4 (4.0.0, 208167) disc
Gigabyte GA-EX58-UD3R bios version FB
Intel Core i7 920 CPU
Realtek RTL8168C on-board NIC disabled
Intel PRO/1000 MT PCI Desktop Adapters
SATA DVD Writer (SH-S223)
Western Digital WD5000 Blue Label, 500GB 7200RPM SATA Drive
Update in 2012 - added an Intel 128GB SSD




4 comments:

app said...

I am thinking of making whitebox esx host using same components.

Can you please tell me the performance of this system.

Thanks

Nick said...

It runs fine for what I need it to do, and really performs just like you'd expect a single-socket, quad-core to perform. I've got about 5 thin-provisioned VMs running various Linux and Windows versions. It's not like I'm really pegging these VMs... I've got a DC and some application servers... just stuff I want to work on at home. In fact, the hardware is probably overkill... I know you can get by on less. As to whether it's going to meet your needs or not - what are you planning to do with it?

Anonymous said...

Hi Nick, great post.
I am thinking of setting up a couple of ESX hosts with this configuration to support a small development office (around 30 people, a few web servers and app servers, and a MySQL cluster).

How do you think the performance will go for this based on your experience?
I would appreciate your comments.

Thanks,

Gurb

Nick said...

Well, I don't want to say "it depends"... but it really does depend on exactly what your goals are. If you're asking whether I think you can run 8 VMs (3 web, 3 app, 2 SQL) on two hosts of this configuration, then yes... you should be able to bring up 8 VMs and use them to some reasonable extent. The limitation you'll quickly find is that you just don't have much storage I/O with a single 7200 RPM drive. So yes, you should be able to bring up your dev environment, and yes - your 30+ devs can probably do something on it. But how useful it is will really depend on your storage I/O demands. So you might want to consider doing something different for storage. That something could be a number of different things... perhaps an SSD instead of (or in addition to) the 7200 RPM drive, with your VMs on the SSD. Or a better solution might be an iSCSI storage network/SAN appliance that you roll yourself. One option that comes to mind is a home-built storage array/SAN running ZFS... add a bunch of RAM, maybe some SSDs for the L2ARC (ZFS's second-level read cache), and then a RAID10 or RAID50 array of 7200+ RPM drives for the "slow" storage, and you have an option that might meet your needs better.
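To put that single-spindle limitation in rough numbers - the 80 IOPS figure below is a generic rule of thumb for one 7200 RPM SATA drive, an assumption rather than a benchmark of this particular box:

```shell
# Back-of-the-envelope storage math -- assumed figures, not benchmarks
# of this particular build.
drive_iops=80   # rough random-IOPS estimate for one 7200 RPM SATA spindle
vms=8           # 3 web + 3 app + 2 SQL from the question above

per_vm=$((drive_iops / vms))
echo "$per_vm random IOPS per VM"   # prints "10 random IOPS per VM"
```

A budget of about ten random IOPS per VM is exactly why the SSD or the ZFS box with plenty of RAM ends up being worth the trouble for anything beyond light dev use.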