
Thread: VMWare Lab

  1. Senior Member MrAgent's Avatar
    Join Date
    Oct 2010
    Location
    Northern Virginia
    Posts
    1,283

    Certifications
    Sec+, MCP, MCSA 2003, MCTS, MCITP:VA, VCP5, MCSA 2012, MCSE Private Cloud, MCSE Server Infrastructure, C|EHv7, RHCSA, OSCP, GCIH, OSWP
    #1

    Default VMWare Lab

For all those building labs and wondering what would make a good one: I found this the other day and thought I would share it.

    Another VMware home lab | Neil Koch
    2016 Goals: GCIH, OSWP - DONE!
    My OSCP review http://www.jasonbernier.com/oscp-review/

  3. Senior Member gabypr's Avatar
    Join Date
    Mar 2012
    Location
    Puerto Rico
    Posts
    136

    Certifications
    A+, S+, MCP XP, MCDST ,MCTS (Vista,7), MCITP Vista, MCSE 2003, 70-410, 70-687, VCA-DCV, EC-Council University Student
    #2
Nice guide, MrAgent. Small, quiet, and powerful is what most people look for today. Here in PR, where electricity is almost $0.30 per kilowatt-hour, a setup like this would be nice to have.
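For anyone weighing power draw at rates like that, here's a quick back-of-envelope calc. The wattages are assumptions for illustration, not measured figures:

```python
# Back-of-envelope electricity cost for running a home lab 24/7.
# Wattages below are illustrative assumptions, not measurements.

def monthly_cost_usd(watts, rate_per_kwh, hours=24 * 30):
    """Cost of a constant load over a 30-day month."""
    return watts / 1000 * hours * rate_per_kwh

pr_rate = 0.30  # ~$0.30/kWh, per the post above
for label, watts in [("low-power micro server", 100), ("full rack server", 300)]:
    print(f"{label} ({watts} W): ${monthly_cost_usd(watts, pr_rate):.2f}/month")
```

At $0.30/kWh, even the difference between a 100 W box and a 300 W box is real money over a year, which is why the small, quiet builds in this thread are popular.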

  4. Senior Member kriscamaro68's Avatar
    Join Date
    Apr 2008
    Location
    Utah
    Posts
    1,148

    Certifications
    MCSA: 2012R2, MCS: Server Virtualization, MCTS-Win7, Security+, Server+, Net+, A+
    #3
    Here is my lab:
    1x-http://www.ebay.com/itm/Dell-Poweredge-C1100-1U-2X-XEON-QC-L5520-2-26GHZ-NO-HDD-72GB-DDR3-Tested-/251269644666?pt=COMP_EN_Servers&hash=item3a80d6817a

    1x-http://www.amazon.com/SanDisk-Solid-State-Consumption-SDSSDP-064G-G25/dp/B007ZWLRSU/ref=sr_1_1?s=electronics&ie=UTF8&qid=1372781767&sr=1-1&keywords=60gb+ssd

    3x-http://www.amazon.com/Samsung-MZ-7TD250BW-Series-Solid-2-5-Inch/dp/B009NHAEXE/ref=sr_1_1?ie=UTF8&qid=1372781684&sr=8-1&keywords=samsung+840

    With this setup you get 16 threads (8 cores with Hyper-Threading), 72GB of RAM, 1x 60GB SSD for the host OS, and 3x 250GB SSDs for VMs.

    I set this up with Server 2012 on the 60GB SSD and then set up a storage pool across the other 3 SSDs. This netted me 700MB/s reads and mid-600MB/s writes. It allows for both a Hyper-V setup and a VMware setup on the same computer. I also tested it with a Kill A Watt: average idle consumption was in the 90-watt range and I never saw it peak above 120 watts; 90% of the time it was around 90 watts. This server is slightly louder than a desktop with fans all over, and quieter than a high-end video card blowing during a good gaming session.

    I am very happy with this setup and would recommend this to anyone. Plus this server is on the HCL for ESXi.

    1x Server: $445
    3x 250GB SSD: $522
    1x 60GB SSD: $60
    Total: $1027
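As a sanity check on those numbers (assuming the three 250GB drives are pooled as a simple, striped space with no parity, so all 750GB is usable for VMs):

```python
# Verify the parts total above and work out a rough cost per GB of VM storage.
# Assumption: the three 250GB SSDs form a simple (striped) pool, so the
# full 750GB is usable for VMs.

parts = {"C1100 server": 445, "3x 250GB SSD": 522, "1x 60GB SSD": 60}
total = sum(parts.values())          # matches the post's total
vm_capacity_gb = 3 * 250

print(f"total: ${total}")
print(f"SSDs alone: ${522 / vm_capacity_gb:.2f}/GB of VM storage")
print(f"whole lab:  ${total / vm_capacity_gb:.2f}/GB of VM storage")
```

A mirrored or parity layout would shrink the usable capacity and raise the cost per GB accordingly.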
    Last edited by kriscamaro68; 07-02-2013 at 05:23 PM.

  5. Senior Member MrAgent's Avatar
    Join Date
    Oct 2010
    Location
    Northern Virginia
    Posts
    1,283

    Certifications
    Sec+, MCP, MCSA 2003, MCTS, MCITP:VA, VCP5, MCSA 2012, MCSE Private Cloud, MCSE Server Infrastructure, C|EHv7, RHCSA, OSCP, GCIH, OSWP
    #4
    My VMware lab machine is actually much different than what I posted.
    I have an Intel i7-970
    32GB RAM
    4x 1TB storage
    1x Fusion-io ioDrive2 Duo 1.2TB

    I don't remember the motherboard, but it does PCI passthrough, which is what I really needed when doing my labs.
    2016 Goals: GCIH, OSWP - DONE!
    My OSCP review http://www.jasonbernier.com/oscp-review/

  6. Senior Member cyberguypr's Avatar
    Join Date
    May 2007
    Location
    Chicago, IL
    Posts
    5,775

    Certifications
    GCFE, GCED, GCIH, CISSP, CCSP, and others that should never be mentioned
    #5
    I've been running the SH67H3 with an i7-2600 processor, 16GB RAM, 2 SSDs, and 1 SATA drive for a year and a half now and it has been a solid, quiet lady. Love that machine. I keep flipping it between vSphere and Hyper-V as needed. Never complains about anything. Highly recommended.

  7. Senior Member gabypr's Avatar
    Join Date
    Mar 2012
    Location
    Puerto Rico
    Posts
    136

    Certifications
    A+, S+, MCP XP, MCDST ,MCTS (Vista,7), MCITP Vista, MCSE 2003, 70-410, 70-687, VCA-DCV, EC-Council University Student
    #6
    I decided to buy a custom Toshiba laptop for labbing, studying, and other stuff, with these specs:

    Intel® Core™ i7-3630QM Processor (6M Cache, up to 3.40 GHz) with Intel® Turbo Boost Technology
    16GB DDR3 1600MHz SDRAM
    256GB SSD
    1TB Hard Drive (I replaced this with a Samsung 840 512GB SSD)
    2GB GDDR3 NVIDIA® GeForce® GT 630M with NVIDIA® Optimus™ Technology
    Blu-ray Disc™ RE with SuperMulti DVD±R/RW Double Layer drive*
    17.3" FHD TruBrite® LED Backlit Display (1920 x 1080)
    SRS Premium Sound 3D®, harman/kardon® Quad Speakers
    2-USB (3.0) ports, 2-USB (3.0) ports with Sleep and Charge
    Glossy Black LED Backlit Tile keyboard
    WiFi® Wireless networking (802.11b/g/n) with Bluetooth® version 4.0

    I can run various VMs with Hyper-V without problems; I'm very happy with it.

  8. Senior Member
    Join Date
    Feb 2012
    Posts
    604
    #7
    I have an extra server box at work that I'm using: iSCSI storage, 32GB of memory, 2x quad-core Xeons. Works a treat!

  9. VMware Dude! TheProf's Avatar
    Join Date
    Jun 2010
    Location
    Canada
    Posts
    327

    Certifications
    vExpert | CCA | CCAA | MCSA | MCTS | MCITP:EMA 2010 | VCP5-DCV/DT | VTSP4/5 | VSP 5 | Network +
    #8
    I built two whiteboxes with an AMD 8-core CPU and 32GB of RAM each... they run pretty well. The only thing they don't support is FT for the moment, but it's enough to get through the VCP and even the VCAP. I also use a Synology DS412+ for storage along with SSDs, and an HP V1910 gigabit switch.

  10. Google Ninja jibbajabba's Avatar
    Join Date
    Jun 2008
    Location
    Ninja Cave
    Posts
    4,240

    Certifications
    TechExam Certified Alien Abduction Professional
    #9
    That's mine

    Quote Originally Posted by jibbajabba View Post
    I always had fully blown servers, which were doing my head in. You could usually hear them buzzing in my garage from down the street. I looked for a solution with multiple servers that wouldn't cost £$£$ in electricity.

    First of all, I know nested setups work, but I don't like 'em.

    Over the last year I "collected" a few MicroServers. I love these little things, so I used them as the sole hardware for my lab.

    Here's the lab from "left to right" (currently vSphere until I get my DCA / IAAS, then I'll change to Citrix):

    1. N40L - Running vSphere 5.1 (Standalone)
    Boot from USB
    16GB of Ram (KVR1333D3E9SK2/16G)
    Adaptec 3805
    4x 2TB SATA 3.5" (3x2TB Raid 5 + Hotswap)

    In 3.5" Optical Bay using 4x2.5" enclosure

    3x 300GB 10k SAS 2.5" (2x300GB Raid 1 + Hotswap)
    1x 750GB SATA / SSD Hybrid 2.5" (Standalone for Host Cache, isos, patches)

    Hosting VMs required to run the whole lot
    1. DC (2008R2 core)
    2. SQL (2008R2 + SQL 2008R2 Standard)
    3. FreeNAS (Presenting local SATA storage as NFS / iSCSI)
    4. WEB (CentOS 6 - Dev Web)
    5. Veeam (2008R2 + Veeam B&R Free)

    2. N40L - Running Server 2008R2 - Running vCenter Server 5.1 suite + PowerCLI etc.
    Booting from SSD (128GB)
    8GB of RAM

    3. N40L - Running vSphere 5.1
    HA / DRS Cluster (shared Storage coming from FreeNAS)
    Boot from USB
    16GB of Ram (KVR1333D3E9SK2/16G)

    4. N40L - Running vSphere 5.1
    HA / DRS Cluster (shared Storage coming from FreeNAS)
    Boot from USB
    16GB of Ram (KVR1333D3E9SK2/16G)

    Got an N36L as well - that one is (probably) dead, but if fixed it becomes a dedicated storage server.

    As for performance: good enough. Memory is always the bottleneck - CPU never really is. The standalone box usually sits at 50% when running updates of some sort, but apart from that it is a lot lower - the maximum I have seen is 90% when I am "doing stuff".

    As for memory though - those 4 essential VMs total up to 14GB, but the Veeam and web server are on only when needed, and I am sure I could even lower the SQL server's RAM (currently 6GB).
    But it's quite high maintenance and has actually been running out of oomph lately... Just got my old Supermicro chassis out of the garage, so I'll probably build a new single / nested lab... We'll see...
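To put that memory math in one place: only the 14GB total and the SQL server's 6GB are stated above, so the per-VM split in this sketch is an assumed example, not the actual allocation:

```python
# Rough memory budget for the standalone 16GB N40L described above.
# Only SQL's 6GB and the 14GB total come from the post; the rest of the
# per-VM split and the hypervisor overhead are illustrative assumptions.

host_ram_gb = 16
hypervisor_overhead_gb = 1.5   # assumption: ESXi itself plus per-VM overhead

vms_gb = {"DC": 2, "SQL": 6, "FreeNAS": 2, "WEB": 2, "Veeam": 2}
on_demand = {"WEB", "Veeam"}   # powered on only when needed

assert sum(vms_gb.values()) == 14   # matches the 14GB total quoted above

always_on_gb = sum(gb for vm, gb in vms_gb.items() if vm not in on_demand)
headroom_gb = host_ram_gb - hypervisor_overhead_gb - always_on_gb
print(f"always-on VMs: {always_on_gb} GB, headroom: {headroom_gb} GB")
```

It shows why memory, not CPU, is the wall on these MicroServers: the always-on VMs alone eat most of the 16GB before any lab VMs are powered on.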
    My own knowledge base made public: http://open902.com

  11. Self-Described Huguenot blargoe's Avatar
    Join Date
    Nov 2005
    Location
    NC
    Posts
    4,088

    Certifications
    VCAP5-DCA; VCP 3/4/5/6 (DCV); EMCSA:CLARiiON; Linux+; MCSE:M 2000/2003; MCSE:S 2000/2003; MCTS:Exch2007; Security+; A+; CCNA (expired)
    #10
    I did just fine with a Lenovo laptop with a quad-core Core i7, 16GB RAM, and a single 512GB SSD running VMware Workstation. If I weren't running an iSCSI/NFS storage server for my shared datastores on the same laptop, and didn't need the laptop for actual work, 256GB would have been plenty.
    IT guy since 12/00

    Recent: 3/22/2017 - Passed Microsoft 70-412; 2/11/2017 - Completed VCP6-DCV (passed 2V0-621)
    Working on: MCSA 2012 upgrade from 2003 (to heck with 2008!!), more Linux, AWS Solution Architect (Associate)
    Thinking about: VCP6-CMA, MCSA 2016, Python, VCAP6-DCD (for completing VCIX)

  12. VCDX in 2017 Essendon's Avatar
    Join Date
    Sep 2007
    Location
    Melbourne
    Posts
    4,489

    Certifications
    VCIX-NV, VCAP5-DCD/DTA/DCA, VCP-5/DT, MCSA: 2008, MCITP: EA, MCTS x5, ITIL v3, MCSA: M, MS in Telecom Engg
    #11
    I have an HP DL380 G5 workhorse with 32GB RAM, dual quad-core Xeons, and about 1.5TB of SAS storage. I have been running a nested ESXi lab on it for about a year and a half now. I only use one power supply. Works like a treat, no performance problems ever.

    Best thing about this setup? It cost me just $150.
    VCDX: DCV - Round 2 rescheduled (by VMware) for December 2017.

    Blog >> http://virtual10.com

  13. Apple and VMware kj0's Avatar
    Join Date
    Apr 2012
    Location
    Brisbane, Australia.
    Posts
    733

    Certifications
    vExpert x 4 | Apple Mac OS X Associate | Cert III - IT.
    #12
    I'm running nested inside VMware Workstation. It's been a little flaky the last couple of days, but I think that's the whole machine, due to an updated AV client.

    However: my system

    Gigabyte Z68XP-UD4
    i7-2700K
    16GB G.Skill Ripjaws
    I've got quite a number of drives in my machine, but my VMs sit on 2x 128GB SSDs in RAID 0 on an LSI RAID card.
    I'm running 9 drives, 4 of which are SSDs; however, I had to put in a 550-watt PSU when my 650 died, which may be at the limits of providing enough power.

    I currently have a spare box set up with 2x 160GB in RAID 0 and 2x 120GB IDE drives in RAID 0 for use as a SAN. I was running FreeNAS, but it doesn't do iSCSI very well, so I'm looking for a new way of running it.
    2017 Goals: VCP6-DCV | VCIX
    Blog: http://readysetvirtual.wordpress.com

  14. Google Ninja jibbajabba's Avatar
    Join Date
    Jun 2008
    Location
    Ninja Cave
    Posts
    4,240

    Certifications
    TechExam Certified Alien Abduction Professional
    #13
    Maybe this thread should be a sticky... questions about labs come up all the time.

  15. VCDX in 2017 Essendon's Avatar
    Join Date
    Sep 2007
    Location
    Melbourne
    Posts
    4,489

    Certifications
    VCIX-NV, VCAP5-DCD/DTA/DCA, VCP-5/DT, MCSA: 2008, MCITP: EA, MCTS x5, ITIL v3, MCSA: M, MS in Telecom Engg
    #14
    I vote sticky too. Good idea jibba!
    VCDX: DCV - Round 2 rescheduled (by VMware) for December 2017.

    Blog >> http://virtual10.com

  16. Senior Member
    Join Date
    Apr 2008
    Location
    New York, NY
    Posts
    305

    Certifications
    Life
    #15
    Essendon - just to be clear, is that what you used for your VCAP5-DCA/DCD labs?

    I think kriscamaro68's setup is quite nice. I am actually considering the following for the VCAP-DCA / VMware View exams:

    3x Dell PowerEdge C1100 1U, 2x Xeon QC L5520 2.26GHz, no HDD, 72GB DDR3 (eBay)
    with vSphere 5.1 USB flash boot
    1x Synology 413 with
    4x Samsung 840 250GB

    I hope it can handle the VCAP and View stuff.



    Quote Originally Posted by Essendon View Post
    I have an HP DL380 G5 workhorse with 32GB RAM, dual quad-core Xeons, and about 1.5TB of SAS storage. I have been running a nested ESXi lab for about a year and a half now. I only use one power supply. Works like a treat, no performance problems ever.

    Best thing about this setup? It cost me just $150.
    Last edited by JBrown; 07-08-2013 at 03:17 AM.

  17. VCDX in 2017 Essendon's Avatar
    Join Date
    Sep 2007
    Location
    Melbourne
    Posts
    4,489

    Certifications
    VCIX-NV, VCAP5-DCD/DTA/DCA, VCP-5/DT, MCSA: 2008, MCITP: EA, MCTS x5, ITIL v3, MCSA: M, MS in Telecom Engg
    #16
    Yes, that's what I used for both VCAPs. Does the job quite nicely; there were a couple of niggles here and there. For example, I couldn't turn on FT, couldn't play with EVC (no biggie, this one), and adding extra NICs to nested ESXi hosts sometimes required 2 reboots. I believe this wasn't a problem with the physical machine itself, but an issue with nesting ESXi. You may run into these with any nested setup.

    I reckon one of the best things about a nested setup is the ability to create as many ESXi hosts (including NICs, datastores, VMs) as your hardware will permit and lab it up to your heart's content. Shared storage works without issues too; just go with the software iSCSI adapter. I had 4 nested ESXi hosts (with ESXi running on the physical machine too), a few VMs, and vApps. I set up SRM too, using an EMC storage appliance; worked without issue. Played with multiple vCenters, Linked Mode, set up multiple clusters, moved hosts in and out - the whole works really. Never missed a beat.

    Can't beat a nested setup really. Everything in the one box, and it keeps electricity costs down too. The noise from the fans ain't too bad either. Can't complain!
    Last edited by Essendon; 07-08-2013 at 04:02 AM.
    VCDX: DCV - Round 2 rescheduled (by VMware) for December 2017.

    Blog >> http://virtual10.com

  18. DoWork
    Join Date
    Jun 2010
    Location
    A major Illinois hospital system near you
    Posts
    1,468

    Certifications
    vExpert, VCAP5-DCA/DCD, VCP5-DCV, VCIX-NV, VCP-NV, BSTM
    #17
    I have a laptop like blargoe with modified NetApp simulators for nested storage. Works great.

  19. Senior Member
    Join Date
    Jan 2011
    Posts
    357
    #18
    Quote Originally Posted by JBrown View Post
    Essendon - just to be clear, is that what you used for your VCAP5-DCA/DCD labs?

    I think kriscamaro68's setup is quite nice. I am actually considering the following for the VCAP-DCA / VMware View exams:

    3x Dell PowerEdge C1100 1U, 2x Xeon QC L5520 2.26GHz, no HDD, 72GB DDR3 (eBay)
    with vSphere 5.1 USB flash boot
    1x Synology 413 with
    4x Samsung 840 250GB

    I hope it can handle the VCAP and View stuff.
    JBrown, maybe you should consider the Dell F1D; it's basically the same exact server as the C1100, except around $300 cheaper, and it typically includes hard drives.

  20. Senior Member
    Join Date
    Apr 2008
    Location
    New York, NY
    Posts
    305

    Certifications
    Life
    #19
    Quote Originally Posted by MrAgent View Post
    My VMware lab machine is actually much different than what I posted.
    I have an Intel i7-970
    32GB RAM
    4x 1TB storage
    1x Fusion-io ioDrive2 Duo 1.2TB

    I don't remember the motherboard, but it does PCI passthrough, which is what I really needed when doing my labs.
    An ioDrive2 is like $15K; isn't it overkill for an i7-970 desktop? How did you manage to get your hands on a beauty like that one?

  21. Google Ninja jibbajabba's Avatar
    Join Date
    Jun 2008
    Location
    Ninja Cave
    Posts
    4,240

    Certifications
    TechExam Certified Alien Abduction Professional
    #20
    Quote Originally Posted by JBrown View Post
    ioDrive2 is like $15K,
    If you're lucky

    Quote Originally Posted by JBrown View Post
    How did you manage to get your hands on a beauty like that one
    All you need is a chequebook

    I was working at an MSP, and one big customer used four Fusion-io drives in RAID 0 in their SQL boxes to get the required I/O. Naturally they had multiple boxes for redundancy... Each quad-socket Nehalem server had a value of $120k (2U Supermicro) because of that (they had 8)...
    My own knowledge base made public: http://open902.com

  22. Senior Member MrAgent's Avatar
    Join Date
    Oct 2010
    Location
    Northern Virginia
    Posts
    1,283

    Certifications
    Sec+, MCP, MCSA 2003, MCTS, MCITP:VA, VCP5, MCSA 2012, MCSE Private Cloud, MCSE Server Infrastructure, C|EHv7, RHCSA, OSCP, GCIH, OSWP
    #21
    I work for Fusion-io currently. I have a few of them at home for various uses. The nicest one I have is an ioDrive2 Duo 2.4TB.
    2016 Goals: GCIH, OSWP - DONE!
    My OSCP review http://www.jasonbernier.com/oscp-review/

  23. DoWork
    Join Date
    Jun 2010
    Location
    A major Illinois hospital system near you
    Posts
    1,468

    Certifications
    vExpert, VCAP5-DCA/DCD, VCP5-DCV, VCIX-NV, VCP-NV, BSTM
    #22
    NOW it makes sense, good stuff.

  24. Senior Member whatthehell's Avatar
    Join Date
    May 2006
    Location
    Orange County, CA.
    Posts
    914

    Certifications
    Server+,A+, N+, MCDST, HDI/SCA, MCP, ITIL Foundations 2011, VCA5-DCV,VCA5-Cloud, MTA 98-364
    #23
    This is what I just bought - I still need to build it, and I'm planning to use a QNAP 2-bay NAS for storage (2x 2TB WD Red drives):

    CPU: Intel Xeon E3-1230 “Sandy Bridge” – 3.2GHz, 4 Cores, 8 Threads, 8MB (Newegg)
    Motherboard: Supermicro X9SCM-F – Intel C204, Dual GigE, IPMI w/Virtual Media, 2x SATA-3, 4x SATA-2 (Newegg)
    RAM: 32GB (4 x 8GB) Kingston 240 PIN DDR3 SDRAM ECC Unbuffered 1600 (PC3 12800) Server Memory Model (Amazon)
    Disk: Lexar Echo ZX 16GB (Newegg)
    Case: LIAN LI PC-V351B Black Aluminum MicroATX Desktop Computer Case (Newegg)
    Fans: 2 x Scythe SY1225SL12L 120mm “Slipstream” Case Fan (Newegg)
    Power: Seasonic 400W 80 Plus Gold Fanless ATX12V/EPS12V Power Supply (Amazon)

    Yes, this is the Wahl Network whitebox... thank you, Senior Wahl, for posting!

  25. DoWork
    Join Date
    Jun 2010
    Location
    A major Illinois hospital system near you
    Posts
    1,468

    Certifications
    vExpert, VCAP5-DCA/DCD, VCP5-DCV, VCIX-NV, VCP-NV, BSTM
    #24
    That motherboard has a NIC that's not on the HCL and requires a custom driver injected into the ESXi installation.

    [H]ard|Forum - View Single Post - Intel 82579LM and 82579V
    Intel 82579LM and 82579V - Page 10 - [H]ard|Forum
    Enabling Intel NIC (82579LM) on Intel S1200BT under ESXi 5.1 | Virtual Drive

    I'm sure you're aware, but if not, now you are.

  26. Senior Member
    Join Date
    Jan 2011
    Posts
    357
    #25
    I run the Dell F1D I suggested. I also just picked up an HP DL380 G4 and am gonna pair it with an MSA60 or MSA70 to be my "SAN".
