Home lab 2022 update – house edition


It’s time for my favorite topic: home lab planning and upgrading 😁

This article will cover changes I’ve made or have planned based on the move to my house. You can read it in its entirety, or skip to the respective sections for compute hosts, case/rack, networking, storage, hypervisor choices, desk workspace area, UPS, and monitoring.

Equipment Rack

I recently bought a house, see my post on that topic here

With the new house, I decided to finally buy a server rack to contain all my home lab / internet gear. After some careful measuring, I chose a 15U from SysRacks (amazon.ca link). Little did I know, the company is from Montreal! It was delivered the first week of June 2022 in a truck with the SysRacks logo! NICE

If you do go with SysRacks, be warned: what others have said in the Amazon reviews section is true, the instructions aren’t great, especially for the installation of the 19″ rack mount ears. I struggled with this piece, having not done a rack mount server install in many years. Once done, I placed the unit into my new home office and it fits perfectly.
With the rack installed, I started to look into what other 19″ profile items I could install into the new rack.

Within the new rack will be a Tripp Lite SMART1500LCD 8 port rack mount UPS (Amazon link)

I’ve had UPS units connected to my home computer equipment for almost 10 years. I chose the above unit to consolidate to just one, replacing my existing pair of older APC brand UPS units.

  • Pic 1 – This was the temp setup I had from June to Sept 5: all my lab gear was placed on top of the new server rack while I waited to finalize my purchases for new rack mount UPS / storage units
  • Pic 2 is after I placed all my gear inside the rack, except for my cable modem (for the rare times I need to power cycle it)
  • Pic 3 is with the locking front door installed

Update for Sept 13, 2022: I removed the Tripp Lite SMART1500LCD 8 port UPS, too loud/hot! I wasn’t able to get the rack cooler than 29 degrees Celsius with it installed. I’ve gone back to my original 4 port APC unit, and will more than likely sell the Tripp Lite unit or try to return it to the manufacturer.

Workspace / Monitor setup

I’ve had a two-monitor setup since about 2015, and it’s worked really well. However, as time has gone on and Slack/Teams have mostly replaced Outlook, I feel I need a dedicated third monitor just for communications. I’m 100% WFH in my job, so I’m constantly monitoring for alerts/emails/etc; there are no taps on the shoulder in my work day to advise me that I’m needed for something urgent. Years ago, I had a boss who was all about “inbox zero” and 100% replies on client requests. It’s been years since I’ve worked in an environment with such expectations, but the habit has stuck; I don’t think I’ve missed replying to an important email in about 10 years…

So, my original plan was to try out a three-monitor, desk-attached-arm setup, as such

Here’s what it looked like after assembly

However, after trying the setup with the Samsung 27″ monitor on top and the two Dell 23″ monitors below, I decided it was NOT for me

So, a few Sundays ago in Sept 2022, and one Redbull later, I switched to the following:

The 2x Dell 23″ monitors are on top, the larger Samsung 27″ is on the bottom. I re-used an iPad stand I wasn’t using to mount my webcam. On an unrelated note: since becoming a home owner, I LOVE PLANTS, OMG. SO MANY. The yucca on the left was outside till Sept 26, 2022, but it’s now getting cold in Montreal, Canada where I live, so it was time to bring it inside.

GREEN POWER 😉

It should be stated that, via Reddit and friends, I’ve researched ultra-wide monitors off and on for the past year, but none are the right fit for my workflow at this time. If you’ve got a great working ultrawide setup and are doing EUC engineering / design / architecture work like me, post a pic in the comments, and include what gear you used! I’m not fully sold on the three-monitor setup, but will keep using it for the rest of 2022.

Hypervisor choices

I’m professionally certified on Citrix / VMware, and do some Nutanix integration work with Citrix. I regularly do VMware project work to stand up new vSAN implementations and help customers migrate from vSphere 6.7 to 7 as the Oct 15, 2022 EOL date approaches. I don’t currently run Nutanix on any of my home hardware. My choice to use vSphere is based on job requirements and my love of their VMUG Advantage program: for $200 USD per year, you get full access to the entire VMware suite. Nutanix only provides older versions of Prism/AHV via their Community Edition program. The CE version is often quite behind the GA versions available to customers, so I’ve had scenarios where I wasn’t able to get newer Windows builds to boot. Until they rectify this, I’ll stick with VMware.

Compute choices

The ongoing debate: AMD vs. Intel

I ran my personal desktop on an HP AMD 5600G based system for about a month, from Aug to Sept 2022. It worked fine for two monitors and Windows 11. However, with the exact model I chose from HP, I wasn’t able to drive three monitors. So, I switched to an HP EliteDesk G4 Core i7 8700. Before selling the HP AMD unit, I did test ESXi on it, and the results weren’t good. I had to disable secure boot to get around the “ESXi pink screen of death” many others have reported when trying to use AMD home hardware with ESXi. As well, the built-in NIC wasn’t detected, and since the HP AMD desktop I bought used only had one full-speed PCI Express slot, my upgrade path was limited.

I’m not alone. There are posts from 2017 all the way to 2020 from home lab fans attempting to use commodity AMD mobo/Ryzen CPUs, noting the ESXi 6.7 / 7.x “pink screen of death”. Some report running months without incident; however, to date, I’ve not had either of my HP EliteDesk 800 G3 SFF (Core i5 6500) units running ESXi 6.7 / 7.0.x crash in about 3 years of 24/7 use. As the years have gone by since I finished college in 2005, my “home lab” is no longer used to practice implementations for clients / learn / research; I host Plex for me and friends, run Active Directory, have file servers for archiving, and more. If/when any of these servers / services go down, I treat it like prod and get it fixed as soon as possible. As such, having any of my ESXi hosts go down randomly due to AMD / ESXi issues isn’t going to work for me. I can’t explain why AMD EPYC processors aren’t impacted by the same issues as their Ryzen 3/5/7 counterparts; maybe it comes down to a lack of QA from VMware on AMD desktop parts? If you have any theories, or have a working AMD mobo/CPU combo, let me know! Also, post your working hardware config to this EPIC thread on William Lam’s blog; I submitted my experience with the HP AMD Pavilion 5600G.

The replacement for my 6th gen Intel based HP EliteDesk G3s will be the HP EliteDesk 800 G4 model, which has an Intel Core i7-8700 (6 cores / 12 threads) chip. To date, I’ve not read of similar PSOD issues on this particular model. This model is easily found on eBay for about $400 CDN per box

I’ll re-use my existing Samsung 970 EVO NVMe / traditional SSDs for storage.

Networking considerations

In 2019, I bought the Mikrotik CRS309-1G-8S+IN Cloud Router Switch (8x SFP+). Mikrotik is a small Latvian networking manufacturer that makes robust, reliable, well-priced gear. The unit has been rock solid, and I see no reason to replace it at this point. However, while assisting a co-worker with some home lab choices recently, he found a suitable unit in the QNAP QSW-M408S 10GbE. It’s well reviewed and priced on Amazon.

For 10 GbE network cards in your hosts, I like the older Intel X520-DA2 model cards. When I was still buying them in 2019, they could be found on eBay for about $75-100, but YMMV as of 2020. These cards aren’t fancy (they don’t support RDMA, for instance), however, I’ve found them reliable and fast. Synthetic benchmarks showed close to the expected line speed, around 9000 Mbit/sec; real-world usage was about 7200 Mbit/sec. The nice thing about this card: you can actually find it on the VMware HCL, good luck finding your other components on there 😜
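To put those two figures in perspective, they can be expressed as a fraction of the nominal 10 Gbit/s line rate. A quick sketch (plain arithmetic on the numbers above; the helper name is my own):

```python
def line_rate_utilisation(measured_mbit: float, line_rate_mbit: float = 10_000) -> float:
    """Fraction of the nominal line rate actually achieved."""
    return measured_mbit / line_rate_mbit

# Figures from the post: ~9000 Mbit/s synthetic, ~7200 Mbit/s real-world
synthetic = line_rate_utilisation(9000)   # 90% of line rate
real_world = line_rate_utilisation(7200)  # 72% of line rate
```

Real-world copies landing around 70-80% of a synthetic benchmark is typical once protocol overhead and disk speed enter the picture.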

NAS

For the longest time, here’s how I’ve provided large-file / long-term storage @ home

  • Step 1: Buy/install a large 3.5″ traditional hard drive into a single physical server (for the past 5 years, an ESXi host)
  • Step 2: 3-4 years later, notice I’m running out of space
  • Step 3: Review Backblaze drive stats reports to ID patterns in reliability for large 3.5″ HDDs from Seagate, WD, Hitachi, etc.
  • Step 4: Buy a new 3.5″ HDD that’s at least 25% larger than the one it’s replacing
  • Step 5: Migrate data from the old drive to the new one (and yes, it takes longer to copy over all my data each time)
  • Step 6: Think about a better way, look at currently available NAS units from Synology/QNAP, curse at the price and the lack of 10 GbE + M.2 NVMe support
  • Step 7: Evaluate TrueNAS (previously FreeNAS), get annoyed with the administrative overhead, and stop using it after a few days
  • Repeat steps 1-7 till 💀
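The sizing rule in Step 4 can be sketched as a small helper. This is purely illustrative; the function and the list of common retail capacities are my own assumptions, not from any real tool:

```python
# Common retail 3.5" drive capacities in TB (an assumption for illustration)
STANDARD_SIZES_TB = [2, 3, 4, 6, 8, 10, 12, 14, 16, 18, 20]

def next_drive_tb(current_tb: float, growth: float = 1.25) -> int:
    """Smallest standard capacity at least `growth` times the current drive."""
    target = current_tb * growth
    for size in STANDARD_SIZES_TB:
        if size >= target:
            return size
    raise ValueError("no standard drive large enough")
```

For example, replacing a 4 TB drive under the 25% rule means shopping for at least 5 TB, which in practice lands on a 6 TB unit.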

However, it’s 2022, it’s time to break the cycle

As I’ve got a 19 inch server rack now, I’m looking into the 19″ rack-mountable QNAP TS-432PXU-2G-US NAS unit. It’s got 3.5″ drive support only, but 4 bays, and built-in 10 GbE support. With a 4-bay unit, I can install one 3.5″ drive today and grow my storage as needs change via RAID 5 or similar, via this process. I can look at adding M.2 NVMe drive support via a PCI Express add-in card later. However, my plan is to re-enable vSAN on my home lab, which would use the SSD/NVMe drives already in my HP ESXi hosts. I’ve used vSAN on and off for years, but as of Aug 30, 2022, I’ve got it disabled, as I had re-purposed my third ESXi host for use with Nutanix CE and didn’t want to have vSAN running as a 2-node cluster with an external vSAN witness appliance.
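Worth keeping in mind when filling those 4 bays: RAID 5 trades one drive’s worth of capacity for parity. A minimal sketch of the math (a hypothetical helper, not QNAP’s actual tooling):

```python
def raid5_usable_tb(drive_tb: float, n_drives: int) -> float:
    """RAID 5 usable capacity: (N - 1) drives of data, one drive's worth of parity."""
    if n_drives < 3:
        raise ValueError("RAID 5 needs at least 3 drives")
    return (n_drives - 1) * drive_tb

# e.g. four 8 TB drives -> 24 TB usable, with 8 TB consumed by parity
```

So a fully populated 4-bay unit gives you three drives’ worth of usable space and survives a single drive failure.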

Monitoring / cooling

I monitor my physical / virtual assets via a script I maintain on GitHub, here
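To give a flavour of the idea, here’s a minimal host-reachability sketch. It is NOT the actual GitHub script, just a hedged stand-in: a plain TCP connect test, with the hosts and port as assumptions:

```python
"""Minimal host-availability sketch: a TCP connect test stands in for
the real monitoring script (which lives on GitHub)."""
import socket

def host_is_up(host: str, port: int = 22, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_all(hosts):
    """Map each host to its up/down status."""
    return {h: host_is_up(h) for h in hosts}
```

A real version would also watch services, disk space, and temperatures, and send an alert rather than just returning a dict.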

I don’t do kW power monitoring for now, but might start now that I’m settled into my house. If you have any suggestions for software/hardware to do so, let me know in the comments.

I’ve installed a basic LCD screen that shows temperature / humidity inside my SysRacks server cage. I’m averaging about 23 degrees Celsius / 73.4 Fahrenheit with two low-CFM 120 mm fans. The fan that came with the SysRacks rack sounded like a jet engine and could not be throttled down via a speed control switch, so I replaced it with two 120 mm adjustable-speed fans from Amazon.

Wrap-up

As with any purchase, do your research as much as possible; finding someone who’s got the exact same unit you want to buy and who’s written a formal review on their blog / YouTube video / Reddit, etc., is always a good idea.

Share what you have in the comments and happy hunting 😀

Owen
