Server rack at home

Does anyone have something like this at home?
https://mega.nz/file/oVAFiYjK#tcBDttyDZbqN9dFhWrRRu9XfYQjSTt0JzIJV2QCi8AM
This is for a Plex Media Server with around 100TB of content.
I heard that @LTniger from LET has servers at home and sells his own hosting plans. Who else? :)


Comments

  • I'm not opening a mega link, but I do have a 14U rack at home. It used to be full of 2U rackmounts crunching numbers for WCG.

    But that was like 20 years ago. The rack is still here but only a single 2U rackmount remains for cold backup.

    ♻ Amitz day is October 21.
    ♻ Join Nigh sect by adopting my avatar. Let us spread the joys of the end.

  • All I saw was terrible cable management... ;)

    I have a 36U rack in my basement with a k8s cluster: five dual-E5v2 nodes plus a couple of SFF desktops, 10/40Gb networking. My storage needs aren't big; it's more about compute.

    My rack is nothing compared with some of the folks in r/homelab.

    Most of the folks in our community at serverbuilds.net are building Unraid whiteboxes for Plex media.

  • edited August 2020

    About 18 months ago I ran about 12U of rackmount gear at home (two 8-bay LFF Supermicros, four Dell R815s). SWMBO thought the heat and noise were too much. Now all of that gear has been replaced by two Mediasonic 4-bay Proboxes (well, one is a Probox, the other is a Proraid I run in JBOD) and three Lenovo Tinys.

    An M72e Tiny (i5-3470T, 16GB, 240GB SSD), an M900 (i5-6500T, 32GB, 240GB SSD), and an M93p (i7-4785T, 8GB, 240GB SSD). In the two Mediasonics I have 2x8TB WD Reds, 2x3TB WD Reds, 2x3TB HGST, 2x2TB HGST. All the pairs of drives are BTRFS RAID1 with transparent filesystem compression turned on.

    I run all the fun homelab/homeserver stuff: Plex, SMB/NFS, Nextcloud, Pi-hole, Tvheadend, Sonarr/Radarr/Deluge, a Win10 VM, etc.

    I'd like to get it all down to running on SBCs like an ODROID-N2 or OrangePi 4.
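    For reference, a two-disk BTRFS RAID1 pool with transparent compression like the one described above takes only a few commands to set up. This is a minimal sketch: `/dev/sdX`, `/dev/sdY`, and `/mnt/pool` are placeholders for your actual drive pair and mount point, and zstd is just one of the supported compression algorithms.

    ```shell
    # Create a mirrored (RAID1) BTRFS filesystem across two drives.
    # Both data (-d) and metadata (-m) are mirrored.
    mkfs.btrfs -m raid1 -d raid1 /dev/sdX /dev/sdY

    # Mount with transparent zstd compression (zlib and lzo also work).
    mount -o compress=zstd /dev/sdX /mnt/pool

    # Sanity checks: list member devices and per-profile space usage.
    btrfs filesystem show /mnt/pool
    btrfs filesystem df /mnt/pool
    ```

    A periodic `btrfs scrub start /mnt/pool` is also worth scheduling on RAID1 pairs, since the mirror copy lets scrub repair silent corruption it finds.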

  • @rajprakash said:
    About 18 months ago I ran about 12U of rackmount gear at home [...] I'd like to get it all down to running on SBCs like an ODROID-N2 or OrangePi 4.

    In India ??

  • Mason (Administrator, OG)

    Here's my home rack (of sorts) out in the shed -

    Just a few servers (a Quanta 1U dual-node box and a 2U Supermicro storage box) and a handful of Pis connected via WiFi bridge (it's surprisingly stable).


    Head Janitor @ LES • About • Rules • Support

  • @deepak_leb said:

    @rajprakash said:
    About 18 months ago I ran about 12U of rackmount gear at home [...]

    In India ??

    Nope. Southeast US.

  • @rajprakash said: Southeast US.

    @Mason said: home rack (of sorts) out in the shed

    Curious - what about a fire hazard to the rest of the house? Any special precautions?

  • Mason (Administrator, OG)

    @nullnothere said:

    @rajprakash said: Southeast US.

    @Mason said: home rack (of sorts) out in the shed

    Curious - what about a fire hazard to the rest of the house? Any special precautions?

    Heh, well my shed is detached from the house, a good 25-30 feet from any structure (other than a fence), so burn baby burn, I guess. I used to have the same setup in my basement at an old house and never really worried. The only component I'd worry about is the AC-to-DC converter, but I currently have that server (the Quanta dual-node) and the converter powered off anyway.



  • Clouvider (Hosting Provider, OG)
    edited August 2020

    Isn't a life partner angered by the noise a more serious hazard though, @Mason :)?

  • Mason (Administrator, OG)

    @Clouvider said:
    Isn't a life partner angered by the noise a more serious hazard though, @Mason :)?

    Lol, absolutely right. The Quanta was bearable, but the SM was a damn wind tunnel haha. That was the primary motivator to move everything out of the house :)



  • A Supermicro 846 or similar 4U with SQ PSUs: swap the case fans for Arctics and use active tower coolers to compensate for the reduced airflow. It's a bit janky but it works. I do something similar in a few cheap Rosewill 4U cases; the loudest noise is from the 7200rpm drives.

    Just avoid anything 1U!

  • The only pieces of network gear I have at home are my WatchGuard Firebox M300 firewall and an Aruba S2500-24P switch. Everything else is desktop form factor, so for now I just use industrial shelving.

    My internet at home is kind of mediocre for hosting anything, so everything here is either too expensive to justify hosting in a datacenter for what it's used for, or is only used for stuff that affects me at home (Pi-hole, PXE server, NAS, VPN access to all of it, etc.). For anything serious I rent a VPS or dedi in a datacenter. Doing the math, a lot of the time it's cheaper to run things in a DC than at home once you factor in proper cooling, power cost, bandwidth caps, internet speeds, and redundancy for all those systems. Maybe if I ever need to move the gear to another location I'll consider redoing it all as rackmount, but for now what I have works well.

    Cheap dedis are my drug, and I'm too far gone to turn back.
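    The DC-vs-home math above is easy to sketch. Here's a toy calculation; every number in it (server wattage, electricity rate, VPS price) is an assumption to replace with your own figures, and it ignores cooling, bandwidth, and hardware depreciation, which only tilt things further toward the DC.

    ```python
    # Rough comparison of 24/7 home-server power cost vs. renting a small VPS.
    # All numbers are illustrative assumptions, not measurements.

    def monthly_power_cost(watts: float, price_per_kwh: float) -> float:
        """Electricity cost of running a device 24/7 for a 30-day month."""
        kwh_per_month = watts * 24 * 30 / 1000
        return kwh_per_month * price_per_kwh

    home_server_watts = 150   # hypothetical rackmount box under light load
    electricity = 0.13        # assumed USD per kWh
    vps_monthly = 5.00        # typical low-end 1-core/1GB VPS price

    home_cost = monthly_power_cost(home_server_watts, electricity)
    print(f"Home server power alone: ${home_cost:.2f}/mo vs VPS at ${vps_monthly:.2f}/mo")
    ```

    With these assumed numbers the power bill alone already exceeds a cheap VPS, which is the point the post is making.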

  • Why use server gear at home? It's not house-friendly.

    My gear is all hand-me-downs or off-lease workstations; it works just as well.

    The all seeing eye sees everything...

  • edited August 2020

    @nullnothere said:

    @rajprakash said: Southeast US.

    @Mason said: home rack (of sorts) out in the shed

    Curious - what about a fire hazard to the rest of the house? Any special precautions?

    To me, rackmount gear is no higher a fire risk than desktop gear. In fact, enterprise gear generally uses better-quality components, so one might say it's less risky than cheaper desktop power supplies and gear.

  • @Mason said:
    Here's my home rack (of sorts) out in the shed -

    Just a few servers (a quanta 1U dual node box and a 2U supermicro storage box) and handful of Pi's connected via WiFi bridge (it's surprisingly pretty stable).

    Nice setup. Down here in metropolitan South Florida, running anything out in the back shed will certainly require additional room cooling.

  • @rajprakash said: no higher fire hazard with using the rackmount gear than there is desktop gear

    Agreed, but depending on the density and (assuming) always-on equipment without any particularly special cooling... kind of keeps me on edge. For desktops/laptops, I assume that they're powered-off/suspended when not actively used but not for rack-mount server class type hardware.

  • @Anon I think you mean @LTniger from LET...

  • @nullnothere said:

    @rajprakash said: no higher fire hazard with using the rackmount gear than there is desktop gear

    Agreed, but depending on the density and (assuming) always-on equipment without any particularly special cooling... kind of keeps me on edge. For desktops/laptops, I assume that they're powered-off/suspended when not actively used but not for rack-mount server class type hardware.

    I haven't powered off or suspended my desktop in years.

  • freerangecloud (Hosting Provider, OG)

    I have a 42U Sun rack at home. Scored it for free when a customer left a local colo and abandoned their racks.

    Currently I just have my router, switch and a single whitebox server used for home automation running in it, but it's a great place to store surplus rack gear!

    https://freerangecloud.com - VPS, shared hosting, LIR services & more!

  • Mason (Administrator, OG)

    @Pwner said:
    @Anon I think you mean @LTniger from LET...

    Big yikes!

    Also... regarding the video in OP, that room would look a thousand times better with a tiny bit of cleaning. Holy shit, stop piling old printers and cables everywhere and organize!


  • MikeA (Hosting Provider, OG)
    edited August 2020

    Anyone recommend any small racks meant for home use? I've been wanting to get a small (6-10U, short depth?) one but don't have much space.

    ExtraVM - High RAM Specials
    Yours truly.

  • havoc (OG, Content Writer)

    @MikeA said:

    Anyone recommend any small racks meant for home use?

    Don't think there's anything "meant for home use" as such. @seanho's point re the reddit sub /r/homelab is good though - those guys know everything there is to know about this.

    I wouldn't even consider any of this unless it's in a location with cheap power and far away from living space, given the noise from server fans.

    Def building a dedicated server room when I have way too much $$$ one day though lol

  • https://www.amazon.com/StarTech-com-Open-Frame-Server-Rack/dp/B00P1RJ9LS

    How much depth do you have to work with? R210ii, CSE-512, and similar are around 16" rack depth, but most full servers need 26-30" for the rails. Network and A/V racks / cabinets are shorter but really constrain your selection of chassis. Also many of them are only 2-post.

  • @Mason said:
    Here's my home rack (of sorts) out in the shed -

    Just a few servers (a quanta 1U dual node box and a 2U supermicro storage box) and handful of Pi's connected via WiFi bridge (it's surprisingly pretty stable).

    Nice setup. You should start selling some VPSes branded “Mason's home rack”. For $7 I’ll take a few to idle.

  • Mason (Administrator, OG)

    @vish said:

    @Mason said:
    Here's my home rack (of sorts) out in the shed - [...]

    Nice setup. You should start selling some VPSes branded “Mason's home rack”. For $7 I’ll take a few to idle.

    Heh, I'll pass on that. I'd rather get a Brazilian wax before letting any of y'all on my home network lol



  • I moved away from that idea a long time ago after looking at power consumption costs. Most things I need hosted are low usage, and a 1-core/1GB VPS with its own dedicated IP is more stable, has a better connection, and is cheaper than the power consumption. My only in-home server is an extremely low-power x86 Atom NUC connected to a RAID1 USB 3.0 enclosure. Until you've compared the cost of running your own server 24/7 against your actual usage requirements, you can't gauge your needs properly.

    Every once in a while, though, I'll need something running 24/7 crunching away. Normally it isn't time-sensitive and can run on ARM. In that case I'll push it to an idle Raspberry Pi connected to a battery backup and let it do its thing for a week or so.

    The only thing that might get me to change my setup would be the cost of 10Gbps networking coming down to reality. Then I would try to build a low-power RAID10 server. But capped at 1Gbps, a USB 3.0 RAID1 enclosure already isn't fully saturated for file sharing.
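    The saturation point above follows from the nominal line rates. A rough sketch, where the overhead factors are assumptions (ballpark protocol/encoding losses), not measurements:

    ```python
    # Back-of-envelope check: is a USB 3.0 RAID1 enclosure the bottleneck
    # on a gigabit LAN? Overheads are rough assumptions for illustration.

    def mbytes_per_sec(gigabits: float, overhead: float = 0.0) -> float:
        """Convert a nominal link rate in Gbit/s to MB/s, minus a fractional overhead."""
        return gigabits * 1000 / 8 * (1 - overhead)

    lan = mbytes_per_sec(1.0, overhead=0.06)   # gigabit Ethernet after framing/TCP
    usb3 = mbytes_per_sec(5.0, overhead=0.20)  # USB 3.0 after 8b/10b encoding etc.
    print(f"Gigabit LAN ~{lan:.0f} MB/s, USB 3.0 ~{usb3:.0f} MB/s")
    ```

    Even with generous overhead assumptions, USB 3.0 has several times the headroom of a gigabit link, so the network stays the bottleneck until 2.5/10Gbps gets cheap.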
