
Dirt-Cheap Livestreaming: How to do professional quality streaming on a budget

A couple of years ago I wrote an article on how I cobbled together livestreaming hardware at the very beginning of the pandemic. Finding AV equipment was very difficult, so I did what I could with what I had. Almost three years have passed since then, and in the meantime I built a multi-camera, simulcast-capable, live-event-oriented livestreaming solution on a shoestring budget. Many compromises were made, many frustrations were had, but it worked.

This is how I built it and made it work.

Why bother changing it?

After my initial stint doing a couple of livestreams for small events, the requests kept coming. For simple one-camera setups my equipment would do, but it quickly started falling short, and as venues slowly filled back up with people, I could no longer build the event around the camera. I had to find a way to do this unobtrusively while still fulfilling my clients’ needs.

Lessons from the previous setup

The setup I described in the previous writeup was not free of complications. On the video side, I was limited by a single HDMI input, and any switching solutions I had were too rough for my geriatric capture card; it would take way too long to reacquire a picture. On the audio side, my trusty USB interface was good but way too finicky and unreliable for me to be comfortable using it for actual work (the drivers were crap, the knobs were crap, and while I got it for free, it just wasn’t worth the hassle). I also had a single camera, very short cable runs, no support for mobile cameras, and an overall jankiness that just would not cut it for bigger events.

A new centerpiece: ATEM Mini

My first gripe was my capture card. I was using an Elgato Game Capture HD, a capture device I bought back in 2014 that could barely manage 1080p30. I still like it for its non-HD inputs (which I have extensively modified, but that's a story for another time), but the single HDMI input, the long acquisition times, and the nearly three-second delay in the video stream made it super janky to use in practice.

After a month and a half on a waiting list, I managed to get my hands on a Blackmagic ATEM Mini, the basic model, and it changed everything. It has four HDMI inputs, an HDMI mix output, a USB-C interface, and two stereo audio inputs, along with fully fledged live video and audio processing: transitions between sources, automatic audio switching, picture-in-picture, chroma and luma keying, still-image display, audio dynamics and EQ control, and so much more. Its rugged buttons and companion app make operating the ATEM Mini an absolute breeze, and its extensive functionality and integration make it the closest thing to an all-in-one solution. Many things that I used to do with lots of devices and maximum jankiness were consolidated into this one box. Anyone who is getting into livestreaming should get one of these.

Dejankifying my audio

Having good audio is almost more important than having good video: a stream with mediocre video but good audio is serviceable; one with good video and bad audio is almost unbearable. Because most events I work on have live audiences on-site, I don't need to handle microphones or other audio equipment directly: most mixers have a secondary output I can tap to get a stereo mix that I pipe into the ATEM Mini. Unbalanced line-level sources connect straight into the ATEM, and if I need something more involved (multiple audio sources, preamplification, or basic analog processing) I keep a small Phonic AM440D mixer in my equipment case, which gives me endless flexibility for audio inputs.

One of the advantages of using common hardware for audio and video is that both streams are synchronized by default, which removes the need to delay the audio stream and, once again, reduces the complexity of setting up livestreams in the field.

New cameras and solving the HDMI distance limitation

For a while, a single Panasonic HDC-TM700 was my only video source, with an additional Sony camera on loan for some events. This was one of my biggest limitations, which I set out to fix.

Most semi-pro/pro cameras are way too expensive for my needs; even standard consumer cameras are out of my budget. A single camera like the one I already have would cost a couple of months' worth of revenue, which, still being at university, I couldn't ramp up. There are ways out, though.

For one, I thought about USB webcams. There are some good ones on the market right now that are more than enough for livestreaming, but they are very much on the expensive side and I have never liked them for this kind of work: poor low-light performance, small low-quality lenses with fixed apertures, and low bitrates are just a few of my gripes. Besides, I now had a better capture card that could take advantage of HDMI cameras. So I looked around AliExpress and found exactly what I was looking for: process cameras.

A process camera is essentially a security camera with an HDMI output. They have no screen, fixed (although decent quality and reasonably large) lenses, and a perfectly usable dynamic range. Since they have no screen or autofocus, they are best used for fixed shots, but most of my streams rarely require movement (and for those that do, I have the other camera). Best of all, they were very cheap: around $100 a piece, tripod included.

Now, we need to talk about HDMI. It's a perfectly good standard for home use, but it has some problems in this use case (forgivable, since this is very much an edge case), the biggest one being maximum distance. HDMI rarely works above 10m, and even 5m is challenging without active cables and devices that can actually drive runs that long. There are optical cables that can go beyond the 10m mark, but they are expensive, bulky, and stiff, which complicates using them at events where they could end up in the way. The solution is somewhat counter-intuitive: just don't use HDMI cables.

See, just because we're using HDMI signals doesn't mean we need to adhere strictly to the electrical specification, as long as we can get the message across by converting it to a physical medium better suited for long distances. There are many ways of doing this: some use coaxial cable and HD-SDI, others use simple fiber-optic patch cables, but I went for old twisted-pair Cat5e. It's cheap, it's available everywhere, and there are ready-made converters with an HDMI connector on one side and an 8P8C plug on the other. Add a 3D-printed bracket to mount the converter on the side of the camera and some short HDMI patch cables, and we're off. With these converters I can do 25m runs no problem, and even 75m in extreme cases, which is enough for most venues.

This was not the only use for the 3D printer: I also made custom power bars that hang from a tripod's center hook to power the cameras and converters.

Better networking and server equipment

In my previous article I used a Raspberry Pi 3B+ running an NGINX server with the appropriate RTMP module and some extra software as a simulcasting server, where a single stream feeds multiple endpoints. This worked great, but Raspberry Pis are a bit anemic and I wanted something with more oomph in case I wanted to do more with it. A portable server is useful to me beyond streaming, so I grabbed a 2011 Mac Mini on Facebook Marketplace, swapped the hard drive for an SSD, and off I went. The additional RAM (4GB instead of just 1GB) lets me run more services without worrying about resources, and the beefier Intel processor gives me more freedom to run concurrent tasks. There is even some QSV work I could do for hardware encoding and decoding, but that's a story for another time.
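
For reference, the heart of that simulcasting setup is just a handful of lines of NGINX RTMP configuration. This is a minimal sketch rather than my exact config, and the endpoint URLs and stream keys are placeholders:

    rtmp {
        server {
            listen 1935;
            application live {
                live on;
                # every stream published to rtmp://server/live gets
                # re-pushed to each endpoint listed here
                push rtmp://a.rtmp.youtube.com/live2/YOUR_STREAM_KEY;
                push rtmp://live.twitch.tv/app/YOUR_STREAM_KEY;
            }
        }
    }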

I also ditched my 16-port rackmount switch in exchange for a cheap Netgear WNDR3400v2 wireless router, which gives me a nice hotspot for connecting my phone, or for anyone else who needs it; the new router is much lighter, too.

A portable camera jank-o-rama

For a couple of scenarios, I really needed a fully untethered portable camera: maybe for showcasing something, or to keep an eye on the action while on the move. There are some wireless HDMI solutions, but dedicating one always felt like losing a good camera for an entire shoot (I usually run a one-man operation, so the portable camera pretty much always did a short stint), and the cost argument kept popping up.

The way I solved it is, to me, as janky as it is genius: just use your phone. Most modern phones have excellent cameras, decent audio, and even optical stabilization. I used Larix, a streaming app, to stream from the phone over WiFi (see why I needed a wireless router?) and picked it up in OBS. Unreliable? A little. Has it ever mattered? Not really; this capability is more of a novelty and a fun addition to my repertoire than a centerpiece. I have even toyed with a GoPro Hero 7 Black streaming to my RTMP server and picking it up from there, which works, albeit with lots of lag. It's a bit of a pain that it can't live on my ATEM switchboard and has to be switched in OBS, but, you know, it'll do.
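
The signal path is dead simple. As a sketch, assuming the phone pushes to the RTMP relay on the server and OBS pulls from it (the addresses below are hypothetical):

    Larix connection URL (phone):  rtmp://192.168.1.10:1935/live/phone
    OBS Media Source URL:          rtmp://192.168.1.10:1935/live/phone
    (in OBS, uncheck "Local File" on the Media Source and lower the
    network buffer if you want to shave off some latency)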

Miscellaneous

Until now I carried everything in a duffel bag, which just wasn't going to work anymore: the weight killed my back every time I went near the thing, and there just wasn't enough space. I needed something like the big wooden cases the pro audio industry uses, without breaking the bank, so I took an old hard-side suitcase and crammed everything into it. It's big enough to house most of my stuff but not so big as to be bulky, and it keeps everything tidy without wasting space.

Because my new cameras don't have a screen, setting up the shot and focusing can be a challenge. I used to resort to my second monitor for this, but it was always janky and time-consuming. To solve it, I bought a CCTV camera tester with an HDMI input: essentially a battery-powered monitor, for way less than a professional field monitor costs.

I needed lots of cables, some of them really long, so I ended up buying rolls of power and Cat5e cable and making them myself. My standard kit includes four 25m Cat5e runs and a 75m one in case the network jack is far away, plus three 20m extension cords so I can place the cameras wherever I want. That's not counting the three power bars for the cameras and a fourth for my computer.

So what comes next?

To be absolutely honest, I think this is as far as this setup goes. Livestreaming jobs have dried up now that the pandemic has quieted down, and pursuing more stable ventures would require lots of investment, which I'm not really in a position to make. I found a niche during the pandemic and I milked it as much as I could; the equipment has paid for itself two or three times over, so I'm not complaining. But until I find the time for that YouTube channel I've always wanted to do, I don't think this rig will see the light of day for a while.

Closing thoughts

I’ve had tremendous fun building up this setup, and for my uses it has proven itself time and again as a dependable, if basic, rig. Many of the lessons learned here apply to other streaming scenarios, and who knows, maybe you'll get some ideas to get creative with this medium yourself.


Building a better Elgato Game Capture HD

Back in 2015 I got myself a brand new Elgato Game Capture HD. At the time, it was one of the best capture cards on the consumer market: HDMI passthrough, standard-definition inputs with very reasonable analog-to-digital converters, and decent support for a range of different setups.

Despite its age, I still find it very handy, especially for non-HDMI inputs, but the original design is saddled with flaws that keep it from reaching its full potential. This is how I built a better one.

Using this card in the field

After a few months of using it to capture PS3 footage and even building some crude streaming setups for small events using a camera with a clean HDMI output, two very big flaws became apparent. First, the plastic case's hermetic design and lack of any thermal management made it run really hot, which after prolonged operation resulted in dropouts that sometimes required disconnecting and reconnecting the device and/or its inputs. Second, the SD inputs are very frustrating: the connectors are non-standard, and the provided dongles are iffy and don't even let you take full advantage of the card's capabilities without tracking down long-discontinued accessories.

My first modification was rather crude: after it failed during a livestream, I took the Dremel to it and made a couple of ventilation holes, coupled with an old PC fan running off USB power (undervolting the fan provided enough cooling without being deafening). This worked, but it introduced new problems: the card now made noise that microphones could pick up, and it had a big gaping hole with rotating blades just waiting to snatch a fingernail. This wouldn't do.

Solving thermal issues

It quickly became clear that the original case for the Elgato Game Capture HD was a thermal design nightmare: it provided no passive cooling whatsoever, neither heatsinks nor vents. The outer design was sleek, but it sacrificed thermal stability along the way.

This device is packed with chips performing different functions: HDMI receivers and transmitters, ADCs, RAM, and plenty of glue logic, which meant power consumption was going to be high. A custom LSI solution, or even an FPGA, could have done better in terms of power consumption, but that is often far more expensive. Among all the ICs, one stood out in terms of heat generation: a Fujitsu MB86H58 H.264 Full HD transcoder. It does all the legwork of picking up the video stream, packaging it into a compressed stream, and piping it through a USB 2.0 connection. It's pretty advanced stuff for the time, and the datasheet even boasts about its low power consumption. I don't know exactly why it runs so hot, but it does, and past a certain threshold it struggles and stutters to keep the video signal moving.

There was nothing worth saving in the original enclosure, so I whipped up a new one in Fusion 360 with plenty of ventilation holes and enough clearance above the chip to add a chipset heatsink from an old motherboard. I stuck it down with double-sided tape, which is not particularly thermally conductive, but together with the improved ventilation it's enough to keep the chip from frying itself into oblivion. I ran another protracted test: none of the chips got hot enough to raise suspicion, and even after three hours of continuous video, the image was still coming through correctly. I initially thought other chips might need heatsinks too, but it appears the heat from the transcoder was what pushed the card over the edge; without it, the other ICs got barely warm.

Since we made a new enclosure, let’s do something about that SD input.

Redesigning the SD video inputs

This card hosts a very healthy non-HDMI feature set: it supports composite video, S-Video, and Y/Pb/Pr component video, along with stereo audio. The signal is clean and the deinterlacing perfectly serviceable, which makes it a good candidate for recording old gaming consoles and analog media like VHS or Video8/Hi8. However, Elgato condensed all of these signals into a single non-standard pseudo-miniDIN plug that mates with the included dongles. Along with a PlayStation AV MULTI connector, it came with a component breakout dongle that allowed any component source to be used; with the included instructions you could even get composite video this way. S-Video, however, was much more of a pain: while it was possible to connect an S-Video signal straight into the plug, that left you without audio, and the official solution was to purchase an additional dongle which, by the time I got the card, no one had.

To solve this, I started by simply desoldering the connector off the board. I saw some tutorials on modifying S-Video plugs to fit the Elgato's 7-pin weirdness, and even considered placing a special order for them, but in the end I realized it was moot: the dongles sat very loosely in the connector, and any expansion I wanted to make would be limited by it, so off it came.

To the now-exposed pads I soldered an array of panel-mount RCA and S-Video connectors pulled from an old projector, so I could use whichever standard I pleased: three jacks for Y/Pb/Pr component video, one for S-Video, one for composite video, and two for stereo audio, complete with their proper colors, too. The SD input combines the different standards onto a single three-wire bus: Pb (component blue) is also S-Video chroma (C), Pr (component red) is also composite video, and Y (component green) is also S-Video luma (Y), so some of the new connectors are electrically connected to each other. Still, I much prefer this to having to remember which one is which, or keeping track of adapters for S-Video (which I use a lot for old camcorders).
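
Summed up, the shared three-wire bus works out like this:

    Green jack = component Y  = S-Video luma (Y)
    Blue jack  = component Pb = S-Video chroma (C)
    Red jack   = component Pr = composite video (CVBS)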

Final assembly and finished product

After printing the new enclosure I slotted in the board (it was designed for a press fit with the case, to avoid additional fasteners) and soldered the new plugs to the bare pads of the old connector using thin wire from an old IDE cable. The connectors attach to the case with small screws, and the design puts all of them on the bottom side of the case, which means no loose wires. The top stays in place with small pieces of double-sided tape and some locating pins, which makes disassembly easy; great for future work, or just showing off.

I wish this was the product Elgato had shipped. It lets the hardware work to its true potential and makes it infinitely more useful in daily usage: no more faffing around with dongles, no more moving parts, no dropouts on a hot day. It feels like what the engineers at Elgato envisioned when they came up with this thing. The Game Capture HD is now my main non-HD capture device, and it still sees some HDMI use when I can't be bothered to set up the ATEM switcher.

Finishing thoughts

I love the Elgato Game Capture HD, both for what it is capable of and for what it did for the nascent streaming and video creation scene back in its day. I love its feature set and I'm even fond of its quirks, but with this mod I feel like I finally have its true potential available without compromises. It went from a thing I kinda knew how to use, living at the bottom of a drawer, to a proven and reliable piece of equipment. If you have one of these devices and feel unsatisfied with its performance, I urge you to give this a try; you will no doubt notice the difference, and maybe you'll keep it from going into the bin.


A server at home: childhood fantasy or genuinely useful?

Ever since I was a child, I dreamed of having the sort of high-speed, low-drag, enterprise-grade equipment in my home network. For me it was like getting the best toy in the store, or having the meanest roller shoes in class (a reference which, I guess, dates me). It was as if getting these devices would open my world (and my Internet connection) to one known only to IT professionals and system administrators; some hidden knowledge (or class cred, maybe) that would take my computing experience to the next level.

Anyone who has ever worked in IT knows the reality behind my dreams: while the sense of wonder is not entirely erased, the reality of maintaining such systems is, at best, dull. But, you know, I'm weird like that. If I weren't, you wouldn't be here.

Almost a decade ago, I embarked on the journey of building and maintaining a home server. It has solved many of my problems, challenged me to solve new ones, and taught me immensely about the nuances of running networks, maintaining machines, building solutions, and creating new things. This is my love letter to the home server.

This article is somewhat different from most of my other content; it's more of a story than a tutorial. It's not really meant as a guide to building or maintaining a home server, but rather a look at the rationale behind having one.

Baby steps

As the designated “IT guy” of my friend group, I often found myself helping friends and family with their computing needs: installing software, customizing their computers, teaching basic programs like Office, and so on. We also gamed some, whatever we could pirate and get to run on crappy laptops. Our friend group was big into Minecraft at the time, as most kids were, and we loved to show off our worlds, approaches, and exploits to each other, sharing our experiences in the game. Inevitably, one day the question came: what if we made a Minecraft server for all of us to play on?

The writing was on the wall, so I set out to make one. At the time I was rocking the family computer, a respectable 21.5″ 2013 iMac. It was beefy enough to run the server and a client at the same time, and paired with LogMeIn's Hamachi (which I hated, but I didn't know better), a highlight of my childhood was born. It barely worked, many ticks were skipped and many crashes were had, but it was enough for me and my group of friends to bond over.

Around the same time my parents bought a NAS, an Iomega StorCenter sporting a pair of striped 500GB hard drives for a whopping 1TB of total storage. That sounds quaint today, but at the time it was a huge amount of space. For many years we kept the family photos, our music library, and limited backups on it. It opened my eyes to the possibility of networked storage, and after an experiment with a USB printer and the NAS's onboard ports, I even toyed with providing basic services. An idea was forming in my head, but I was just starting high school and there was pretty much no budget to go around, so I kept working around the limitations.

That worked, up to a point. Within a couple of months, both the striped array in the Iomega NAS (striping has no redundancy: one dead drive takes out the whole volume) and my iMac's hard drive failed, with no backups to recover from. It was my first experience with real data loss, and many memories were wiped forever, including that very first Minecraft server world. Most of our family stuff was backed up, but not my files. It sucked. It was time for something new.

Building reliable servers from scrap

I was still in high school; there was some money to go around, but nowhere near enough for me to get a second computer to keep online forever. So I went looking for scraps, picking up whatever people were willing to give me and building whatever I could with it. My first experiments were carried out on a geriatric early-2000s AMD Athlon machine from a dump behind my school, which wasn't really up to the task of doing anything, but it taught me plenty about building computers and what that entails. My first real breakthrough came around 2015, when I found a five-year-old, underpowered Core i3 tower with 4GB of RAM sitting in a dumpster outside an office block near my house. After a clean install of Windows and some minor cleaning I had, at last, a second computer I could use as a server.

I didn’t know much about servers at the time, so my first incursion was basically an extension of what I'd seen before: using SMB to share a 1TB drive I'd added by removing the optical drive and rescuing the hard drive from a dead first-gen Apple TV. I added a VPN (ZeroTier One; it's like Hamachi but good), printer sharing, and VNC access, and pretty soon I was running a decent NAS.
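
Part of what makes ZeroTier “like Hamachi but good” is how little setup it needs. On the CLI side, joining a network is a one-liner (the 16-digit network ID below is a placeholder, and the node still has to be authorized from the ZeroTier console):

    zerotier-cli join 1234567890abcdef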

I added a second 1TB drive a few months later (which involved modding the PSU for more SATA power connectors) and some extra software: qBitTorrent's web interface for downloading and managing torrents from restricted networks (like my school's), automatic backups using FreeFileSync, and a few other tidbits. I even managed to configure SMB so I could play PS2 games directly from the server, using the console's NIC and some homebrew software.

This was my setup for around four years, and it did its job beautifully. Over time I added a Plex server to keep tabs on my media, and I even ran Minecraft and Unturned servers to play with my friends. Around 2019, though, I started hitting bottlenecks. Using a hard drive as the server's boot medium was dog slow, and I had run out of SATA ports to expand my drive roster. I had been toying with the idea of RAID arrays for a while, especially after losing data to faulty drives. Mirroring was too expensive for me, so my method of choice was level 5: single parity distributed across all drives, tolerating a single drive failure (see the sketch below). I just needed a machine capable of doing it. For a while I considered buying HBAs, tacking the drives onto the old hardware, and calling it a day. I ended up doing something completely different.
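
The appeal of level 5 is easy to see on paper: parity is a plain XOR across each stripe, so any single lost block can be recomputed from the survivors, and with n drives you keep (n-1) drives' worth of capacity. A toy stripe with three data blocks:

    P  = D1 xor D2 xor D3    (parity block, rotated across the drives)
    D2 = D1 xor P  xor D3    (any one lost block rebuilt from the rest)
    usable capacity = (n - 1) x drive size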

At last, something that looks like a server

In the end I decided the better idea was to upgrade the motherboard, processor, power supply, and a few other things. I added a USB 3.0 card for external drive access, upgraded the processor from the Core i3 240 to a Core i5 650, and got a motherboard similar to the old one but with six SATA ports, plus four 2TB video-surveillance drives for dirt cheap and a beefier power supply to tie it all together. Around this time I also got a Gigabit Ethernet switch, which vastly increased throughput for backups. It was mostly used, bottom-of-the-barrel stuff, but it let me create a RAID-5 array with 6TB of usable storage (four 2TB drives minus one drive's worth of parity) and gave me slightly more room to expand my activities. Lastly, I replaced the boot drive with a cheap SATA SSD.

With it came an actual Plex library, a deep-storage repository for old video project files, daily backups, a software library for repairs, MySQL for remote database work, and even a Home Assistant VM for home automation. I kept running game servers and experimenting on it. This second iteration lasted me around three more years, which was more than I expected from what was essentially decade-old hardware running in a dusty cupboard next to my desk.

Soon enough, however, new bottlenecks started appearing. I was getting more and more into video work and needed a server that could transcode video in at least real time. Most CPUs can't do that even today, so I started looking into GPU acceleration. I was also starting to struggle with Windows: it works fine for beginners, and I even toyed with Windows Server for a while, but it's just way behind Linux distros for server work. It took up lots of resources doing essentially nothing, the server software I needed was clunky, and I missed the truly network-oriented logic of a UNIX-like OS.

Enterprise-grade hardware

Once again, I looked to the used market. Businesses replace hardware all the time, and it's not difficult to find amazing deals on used server gear. You're not getting the absolute latest and greatest, but most devices are not really that old, and they're more than capable for most home uses.

From the second-generation server I could still salvage the USB 3.0 card, the power supply, the boot drive, and the four RAID drives; all of those were bought new and in excellent condition, so there was no need to replace them. I wanted something that would last me at least the next five years, accommodate all my existing hardware, and have plenty of room for expansion: PCIe slots for GPUs and other devices, proper mounting hardware for everything, and a case to keep it all neat.

I went for a tower server instead of a rackmount, mainly because I don't have a place for a rack at home and the long-and-thin package of most racked servers made no sense in my case. After a bit of searching I came upon the HP ProLiant ML310e Gen8 v2: an ATX-like server with a normal power supply, four drive bays with caddies, an integrated LOM, and even an optical drive (which, given that my server now hosts the last optical drive in the house, was a must). It was perfect. I also managed to score an NVIDIA GTX 1060 6GB for cheap; it's more than a couple of generations behind at this point, but most importantly for me, it has NVENC support, which means transcoding HD video at a few hundred FPS with ease.
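
This is the sort of job NVENC unlocks; a minimal sketch, assuming an ffmpeg build with NVENC support (file names and bitrate are placeholders, and preset names vary between ffmpeg versions):

    ffmpeg -hwaccel cuda -i input.mkv \
        -c:v h264_nvenc -preset fast -b:v 8M \
        -c:a copy output.mp4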

Building the current server

My third-generation server was built around the aforementioned HP tower, but many modifications had to be made to achieve the desired functionality. After receiving it, I swapped the PSU and modified one of the accessory headers to fit the server's proprietary drive-backplane connector, so I could power the drives from the new supply. Apart from raising the maximum load from 350W to 500W, the new PSU also gave me the PCIe power connectors my GPU needs, which the old one lacked.

Then I installed the GPU and hit my first problem: servers like these use only a couple of fans and a big plastic baffle to make sure air reaches every component on the board. This is a great idea in the server world (it reduces power consumption, decreases noise, and improves cooling), but it interferes with the GPU: mine is not a server model, so it stands taller than a 2U card and the baffle cannot close over it. Not to worry: a bit of Dremel work later I had a nice GPU-shaped hole, which I kept as small as possible so as not to disturb the airflow too much. The GPU's shape worked in my favor too, as it redirects air perfectly onto the fanless CPU cooler.

Other than the four drive bays (which hold the 2TB drives) there isn't much room for a boot drive, so I used the empty second optical-drive bay to screw in the boot SSD. A lot of cable management and some testing later, I was ready to go.

For the OS, I went with Ubuntu Server 20.04 LTS. I'm very familiar with this family of Linux distros, so it made sense to use it here; my servers at work run it too, so I had some server experience with it as well. Once the OS was in, I installed the drivers for the LOM and the GPU and rebuilt the RAID-5 array using mdadm. The previous generation used Microsoft Storage Spaces for the array, so it had to be rebuilt cleanly under Linux: after dumping the data onto some spare drives and building the array, I was back in business.
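
Rebuilding the array boils down to a couple of mdadm commands; the device names below are examples, and note that --create wipes the member drives, hence the dump-and-restore dance:

    # create a four-drive RAID-5 array (destroys existing data on members)
    sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 \
        /dev/sdb /dev/sdc /dev/sdd /dev/sde
    sudo mkfs.ext4 /dev/md0
    # persist the array definition so it assembles on boot
    sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
    sudo update-initramfs -u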

For the software, I installed Jellyfin (I was getting sick of Plex's pay-to-win model and wanted something with IPTV support), Samba for the shared folders, VirtualBox with the latest Home Assistant VM (don't @ me about Docker; that version is crap and the supervised install is a pain in the ass, so I'm done with Docker for now), qBitTorrent, MySQL, ZeroTier One, OctoPrint, and of course, a Minecraft server. I also installed a few quality-of-life tools like btop, and thus my server was complete, at least for now.
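
Most of these services need very little configuration. Samba, for instance, takes little more than a stanza per share in /etc/samba/smb.conf (the path and user below are examples):

    [storage]
        path = /mnt/raid/storage
        read only = no
        browseable = yes
        valid users = myuser

followed by smbpasswd -a myuser and a restart of the smbd service.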

The realities of living with a server at home

Despite my childhood aspirations, enterprise-grade hardware has one major flaw compared with home equipment: noise. I can't blame the manufacturers too much; these devices are made for production environments where no one really minds, or vast datacenters where noise is just part of the deal. I, on the other hand, like sleeping. The first time I turned the machine on I was greeted with the cacophonous roar of two high-RPM server fans. It dawned on me pretty quickly that this simply would not fly in my house, so I set about fixing it.

Unlike desktop boards, the OS has no control over the fans; to the sensor package in Linux, it's as if they didn't exist. I did get some temperature readings, and there might be a way to address the fans via IPMI, but it just didn't work right. The job of handling them falls to the iLO, HP's name for a LOM: a tiny computer inside the server that allows low-level remote management, including remote consoles and power cycling. The homelabbing community figured out years ago how to tune the fans down, and a bit of unofficial firmware later, mine calmed down to a reasonable 20%, and I could sleep again. I took the opportunity to put a piece of tape over the POST buzzer, which had no right to be as loud as it was.

Closing thoughts

This has been a wild ride of old hardware, nasty hacks, ugly solutions, and wasted time, but in the end the outcome is so much more than what I ever envisioned: a machine that automates the boring bits of my daily routine, keeps backups and storage of all my stuff safe, and gives me access to all my services wherever I happen to be. If you're a nerd like me and willing to spend some time faffing around in the Linux CLI, I highly recommend you build yourself a home server, no matter the budget, device, or services.