Cloud Computing Lab vSphere

Hardware and Licensing Requirements

This page catalogs the hardware, software, and licensing that were used to successfully test and ultimately create the Cloud Computing Lab. It is not necessarily a minimum-requirements document; some aspects may be achievable with a less robust configuration.

Required Hardware

CCL Master Server Box: DELL PowerEdge r710

  • BIOS 2.1.9
  • 2x Intel Xeon E5620, 2.40 GHz, quad-core
  • Level 2 cache: 4x 256 KB
  • Level 3 cache: 3 MB
  • Memory: 144 GB ECC DDR3, 1067 MHz
  • 4x 146 GB 10,000 RPM HDD
  • 4x 900 GB 10,000 RPM HDD
  • RAID controller (hardware): PERC H700 Integrated
  • 1x 10 Gb/s Network Interface Card (NIC) (additional)
  • 4x 1 Gb/s Ethernet ports
  • 2x 10 Gb/s Ethernet ports

CCL Switch Environment: 2x Cisco Catalyst 2960-S switches (2960S-24TD-L)

  • 24 Gigabit Ethernet ports
  • 1G/10G SFP+ slots
  • USB interfaces for management and file transfers
  • LAN Base or LAN Lite Cisco IOS® Software feature set

CCL Lab Workstation Environment: 20x Dell OptiPlex 790 workstations

  • Windows 8
  • Intel Core i7 @ 3.40 GHz
  • 8 GB RAM
  • 500 GB hard drive
  • Integrated NIC (PXE boot enabled)

Required Software and Licensing

CCL Master Server Box OS: ESXi 5.5

CCL VMware Environment: vSphere 5.5

  • 1 vSphere 5.5 License
  • 21 ESXi 5.5 Licenses (for server and hosts)
    • Host machines need at least 2 CPU cores
    • Host machines need a minimum of 4 GB of RAM

Setting up Auto Deploy

Figure: Auto Deploy flowchart

See the VMware Auto Deploy Administrator’s Guide for reference.

Required client/server resources:

  • vCenter Server Appliance (vCSA) to host the DHCP, TFTP, and Auto Deploy services
  • ESXi 5.5 server
  • PXE-bootable client workstation(s)

Required implementation software:

Optional (but helpful) software:

  • A non-Internet Explorer browser (Chrome, Firefox, etc.)

Server Setup

The steps for configuring the CCL environment can be found in the Configuration Guide.

Integrating the vCenter Server Appliance with NETLAB

Setting up a trunk line between the large ESXi server and NETLAB:

  • At least one NIC on the ESXi host needs a cable run to the control switch associated with NETLAB. The corresponding switch port must also be configured as a trunk line to allow proper communication between NETLAB and the contents of the vCSA’s datastore.
  • Console into the control switch using the appropriate credentials (you should use the defaults suggested by the NETLAB documentation to maintain proper automation and support compatibility).
  • Enter global configuration mode and input the following commands (a complete example session is sketched after this list):
    • interface x/x
    • description inside connection for ESXi Server
    • switchport mode trunk
    • switchport nonegotiate
    • no switchport access vlan
    • no shutdown
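
For reference, a complete console session on the control switch might look like the sketch below. The interface name GigabitEthernet1/0/1 is a placeholder; substitute the port that is actually cabled to the ESXi host’s trunk NIC.

    configure terminal
    ! placeholder interface - use the port cabled to the ESXi host
    interface GigabitEthernet1/0/1
     description inside connection for ESXi Server
     switchport mode trunk
     switchport nonegotiate
     no switchport access vlan
     no shutdown
    end
    ! save the configuration so it survives a reload
    copy running-config startup-config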

Create a NETLAB+ user on the appliance:

  • Log in to the appliance’s CLI with the username and password you configured when you built it out from the .ovf template
  • Enter useradd -m NETLAB to create the user
  • To change the new user’s password, enter passwd NETLAB. You will be prompted to enter and then confirm the new password for the NETLAB user (see the example session below)
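
A minimal example of that shell session on the appliance is sketched below; the final id command is simply an optional check that the account was created.

    # create the NETLAB user with a home directory
    useradd -m NETLAB
    # set the NETLAB user's password (you will be prompted to enter it twice)
    passwd NETLAB
    # optional: verify that the account now exists
    id NETLAB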

Create a NETLAB role in the appliance

  • Connect to the appliance through the vSphere Client and click Administration > Roles.
  • Right-click the Administrator role and select Clone, entering NETLAB as the new role object’s name.
  • Right-click on the NETLAB role and select Add Permission.
  • In the window that appears, click Add and then select the NETLAB account and click OK.
  • Back in the Assign Permissions window, use the drop-down menu on the right to select NETLAB, associating the cloned administrative permissions with the NETLAB user you created earlier.

Create a new vSwitch and bind it to a physical NIC

  • In the appliance’s vSphere view, navigate to Inventory > Hosts and Clusters and click on the ESXi host you want to configure in the left pane.
  • In the main pane, click the Configuration tab, then click Networking in the Hardware group box and click Add Networking in the upper left.
  • To allow the ESXi host kernel to communicate with the inside NETLAB network, select the VMkernel radio button and click Next.
  • Select the Create a Virtual Switch radio button, then select the physical NIC that’s associated with the trunk line to the control switch.
  • In the next screen, set the Network Label to “NETLAB Inside” and check the box labeled “Use this port group for management traffic”.
  • Enter a unique IP address from the table that appears on page 77 of NetDevGroup’s “Remote PC Guide Series – Volume 2” document.
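
If you prefer to perform this step from the ESXi host’s command line instead of the vSphere client, the equivalent configuration can be sketched with esxcli as below. The vSwitch name, uplink vmnic, VMkernel interface name, and IP address/netmask are placeholders; use the physical NIC tied to the trunk line and the unique address from the NETLAB table.

    # create a new standard vSwitch (vSwitch1 is a placeholder name)
    esxcli network vswitch standard add --vswitch-name=vSwitch1
    # bind the physical NIC that carries the trunk to the control switch (vmnic2 is a placeholder)
    esxcli network vswitch standard uplink add --uplink-name=vmnic2 --vswitch-name=vSwitch1
    # create the "NETLAB Inside" port group on the new vSwitch
    esxcli network vswitch standard portgroup add --portgroup-name="NETLAB Inside" --vswitch-name=vSwitch1
    # add a VMkernel interface on that port group
    esxcli network ip interface add --interface-name=vmk1 --portgroup-name="NETLAB Inside"
    # assign the static IP chosen from the NETLAB table (address and netmask below are placeholders)
    esxcli network ip interface ipv4 set --interface-name=vmk1 --ipv4=169.254.0.241 --netmask=255.255.255.0 --type=static
    # tag the interface so it carries management traffic
    esxcli network ip interface tag add --interface-name=vmk1 --tagname=Management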