
Personal computers are done, and using a 15-year-old computer for a week made me realize it

When I started writing the draft for this article back in March, I was spinning up a narrative that old laptops still have uses as writing machines: devices for distraction-free text composition, especially if you can get a higher-end one with a good keyboard and a decent screen. This was a mostly uncontroversial write-up on my experience using a fifteen-year-old MacBook Pro for writing this blog, among a couple of other tasks.

But then I saw Cathode Ray Dude’s video on HP’s QuickLook, and my head was thrown in a flat spiral straight into madness. Despite the A plot being about an absolutely heinous abuse of UEFI and an eldritch nightmare of stopping Windows’ boot process in order to get to your email slightly faster, it was the B plot that put my undies in a twist: the second half of the video is this wonderful opinion piece hinging on the fact that, for most people, computers are pretty much at their endgame; for most normal applications, a computer from a decade ago is indistinguishable from one that came out last year. I highly recommend you watch that video as well.

While I could conceptually wrap my head around it, as the resident turbo-nerd in my group of friends I’ve spent years chasing the bleeding edge, if slightly hampered by budget constraints. The idea of an “office-use” computer from fifteen years ago still being perfectly cromulent seemed absolutely insane: after all, office work is pretty much all I do on a computer these days outside of my spare-time tinkering.

So I set out to prove it, and along the way I stumbled across new perspectives, a fresh dread regarding late capitalism, and maybe some lessons for the less tech-savvy.

The setup

To put this thesis to the test, the experiment was simple: use an old computer for daily tasks and see how we fare. For this, I chose an Apple MacBook Pro from mid-2009. It sports a dual-core Intel Core 2 Duo P8700, a crisp 1280×800 display with amazing colors, and a surprisingly well-kept exterior. Inside, I made some modifications to better my chances:

  • The RAM was upgraded from the factory 2GB of 1066MHz DDR3 to 4GB.
  • The battery was replaced, as the old one had died.
  • The original 250GB 5400RPM hard drive was replaced by a 250GB SATA SSD. It’s DRAM-less and I got it for cheap, but it turned out to be more than enough.

This wasn’t some lucky find either: it had sat in my junk bin for a while, and you can find many (usually better) choices of Intel Macs for cheap pretty much anywhere that sells used goods, but it made a good starting point for this experiment. The new parts also came cheap; they were mostly what I could scrounge from other devices, only the battery pack was bought new, and overall I spent around 100 USD total.

Obviously, this particular computer is officially obsolete: no new builds of macOS exist for it, and haven’t for a while. Unwilling to induce a blind rage by wrestling with deprecated software, expired SSL root certificates, and poor performance, I decided to load it with Ubuntu 22.04 LTS as its operating system; there are perhaps better choices performance-wise, but this will do for testing out this theory.

The realization

The idea behind all of this was to use this machine as I would my main laptop (which I bought in 2022) for office-related tasks. This means, essentially:

  • Writing for this blog.
  • Watching YouTube videos.
  • Writing university assignments.
  • Watching media for sorting my library.

All of this had to be done without significant sacrifices to performance and/or time spent, and while using the laptop to its fullest: on the go, on battery, and while listening to music or videos in the background so my Gen Z brainrot wouldn’t get to me.

And yeah, it just works.

Sure, it’s not blazing fast, but it’s perfectly serviceable. Ubuntu offers many applications for productivity, and with most services relegated to the cloud, pretty much everything worked no problem using just a web browser. Connecting to my NAS, wireless networking (something old-time Linux users will remember with absolute hate), even the infamous display drivers: everything came preinstalled or worked out of the box, with minimal CLI nonsense, so even the standard consumer could get this experience without much hassle.

A lesson for nerds

Look, I get it. Computers are fun for us. We like to take them apart, put them through hell and back, create abominations for shits and giggles, and sometimes even turn ourselves into bona fide data center administrators of our little kingdoms of silicon.

But for most people, a computer is no more interesting than a pen, or a saw: it’s a tool.

No matter how much we complain about obscure CLI procedures, or endlessly pontificate about the inevitability of Linux on the desktop, let’s not deceive ourselves: we enjoy doing this, and we do it because it’s fun.

So great, most people just want to turn on their computers, use them to do their job, turn them off, and move on with their lives. But that still leaves a question: why is a 15-year-old computer still enough to do this? With the relentless push of technology, one would expect a continual state of progress, as has been the case for many years in the electronics sector.

But it isn’t: we have just demonstrated that you don’t need it. A Core 2 Duo turns out to be more than enough for office-related tasks, and that just doesn’t jibe with our collective idea of “who knows what the future holds?”

The assumptions of capitalism

No matter what your opinions are regarding capitalism, it is undeniable that our modern society is fundamentally shaped by the forces that govern supply and demand. Yet most of us seem to ignore that its axioms, the postulates we take as a given in order for capital to do its thing, do not always apply to all industries at all times, especially when they fail to properly account for human nature.

One of these core tenets is perpetual innovation: the idea that as humanity progresses, so does the economy; all market products are bound to get better over time, and the people who can adequately harness new technologies and techniques will be rewarded with capital.

But what if that idea is wrong? What if there is nothing to add to a product? What if we have made something that is so good that there is no market pressure to innovate?

In classical economics, we would call these products commodities: goods whose value doesn’t depend on their origin or manufacturer. Things like steel, wheat, and gold are all commodities; steel is steel, and no matter how much you revolutionize the steel industry, people will still want steel plates and beams. You can’t really innovate your way around that.

If you use your computer for writing essays, working on spreadsheets, creating presentations, and the odd YouTube video here and there, computers peaked for you around 2010. At that point, your computer did absolutely everything you wanted it to do and then some. Don’t believe me? Get a copy of Word 2010 and you’ll be amazed at what it can do. Wanted the full web experience? Fully HTML5- and JavaScript-powered pages run fine on computers from 20 years ago. Spreadsheets? Have you seen Excel? And of course, if you needed something more, the Linux CLI was very much mature by that point.

The “Office PC” has been a commodity for more than a decade.

Implications and a call to action

This ubiquity of raw computing power gives us turbo-nerds an opportunity: there is pretty much no computer from the last 15 years that cannot be put to some use. Webserver? No problem. Minecraft server? Sure, my first one was on an old VAIO laptop from 2011. NAS? Yeah, especially if it has USB 3.0.

We live in a world where everything is absolutely disposable. Things are meant to be used and then absorbed into the void of uselessness. The idea that our trash goes somewhere is alien to pretty much anyone in the western world. These computers, however, show us that they don’t have to end up there; new life can be breathed into these devices.

So please, if you can, rescue these devices from the landfill. Get some SSDs and some extra RAM and fix them up. Get them current OSes and software, do goofy things with them, give them out as gifts or sell them for a profit on eBay. Give a Linux machine to your little brother, sister, or cousin; use them as embedded devices; the sky’s the limit.

We look to devices like Raspberry Pis as the be-all end-all of tinkering computers, but compared to any x86_64 computer from the last 15 years, Pis are exceedingly anemic. The form factor is compelling, but I would argue that unless you have exceedingly stringent size constraints, any laptop motherboard can give you better results. If you get some of the better ones, you can even toy with graphics acceleration, PCIe peripherals, and so much more.

Conclusions

Computers have gotten insanely powerful in the last decade, but it seems that every new generation of processors, graphics units, and RAM feels less like a quantum leap and more like an incremental improvement, a new stepping stone even closer to the last than the one before.

This is not because of stagnation, however; it’s just that computers have gotten too good for our own good, and capitalism sort of fails when innovation is not desirable. More powerful computers just don’t make sense anymore for the average user.

I’m not falling for the Thomas Watson trap here: there will be a time when more powerful computers are a necessity once again, but that time is not now. The time is now, however, to give new life to those old computers; they are not dead, and they deserve new chances at life as long as we can keep them running. There is just too much power on tap to leave it rotting in a landfill for the rest of eternity.

So, if only for a moment, forget about Raspberry Pis, Chromebooks, and NUCs, and go get a laptop from 5-10 years ago: you’ll probably get the same performance for less than half the price, and you’ll save some silicon from hitting the trash before its time truly comes. Computers have become a fundamental tool for communication and for interacting with humanity at large; if we can get them to more people, maybe a different world is possible.


A server at home: childhood fantasy or genuinely useful?

Ever since I was a child, I always dreamed of having high-speed, low-drag, enterprise-grade equipment in my home network. For me it was like getting the best toy in the store, or having the meanest roller shoes in class (a reference which I guess dates me). It was as if getting these devices would open my world (and my internet connection) to one known only to IT professionals and system administrators; something that would grant me some hidden knowledge (or class cred, maybe) and bring my computing experience to the next level.

Anyone who has ever worked in the IT field knows the reality of my dreams: while the sense of wonder is not entirely erased, the reality of maintaining such systems is at best dull. But, you know, I’m weird like that. If I weren’t, you wouldn’t be here.

Almost a decade ago, I embarked on a journey of building and maintaining a home server. It has solved many of my problems, challenged me to solve new ones, and taught me immensely about the nuances of running networks, maintaining machines, building solutions, and creating new stuff. This is my love letter to the home server.

This article is somewhat different from most of my other content; it’s more of a story than a tutorialized narrative. It’s not really meant to be a guide to building or maintaining servers at home; it’s more of a tool for understanding the rationale behind having one.

Baby steps

As the designated “IT guy” in my friend group, I often found myself helping my friends and family with their computing-related needs. I started out helping others install software, customize their computers, and use basic software like Office. We also gamed some: whatever we could pirate and get to run on crappy laptops. Our friend group was big into Minecraft at the time, as most kids were, and we loved to show off our worlds, approaches, and exploits to each other, sharing our experiences in the game. One day, the inevitable question came: what if we made a Minecraft server for all of us to play in?

The writing was on the wall, so I set out to make one. At the time I was rocking the family computer, a respectable 21.5″ 2013 iMac. It was beefy enough for me to run the server and a client at the same time, and paired with LogMeIn’s Hamachi (which I hated, but didn’t know better), a highlight of my childhood was born. It barely worked: many ticks were skipped and many crashes were had, but it was enough for me and my group of friends to bond over.

Around the same time my parents bought a NAS, an Iomega StorCenter sporting a pair of striped 500GB hard drives for a whopping 1TB of total storage. Today that sounds quaint, but at the time it was a huge amount of space. For many years we kept the family photos, our music library, and limited backups on it. It opened my eyes to the possibility of networked storage, and after an experiment with a USB printer and the onboard ports on the NAS, I even toyed with providing basic services. An idea was forming in my head, but I was just starting high school, and there was pretty much no budget to go around, so I just kept working around the limitations.

At least, up to a point. Within a couple of months, both the RAID array in the Iomega NAS and my iMac’s hard drive failed, with no backups to recover from. It was my first experience with real data loss, and many memories were forever wiped, including that very first Minecraft server world. Most of our family stuff was backed up, but not my files; it sucked. It was time for something new.

Building reliable servers from scrap

I was still in high school; there was some money to go around, but nowhere near enough for me to get a second computer to keep online forever. So I went around looking for scraps, picking up whatever people were willing to give me and building whatever I could with it. My first experiments were carried out on a geriatric AMD Athlon from the early 2000s, pulled from a dump behind my school, which wasn’t really up to the task of doing anything, but it gave me valuable information about building computers and what that entailed. My first real breakthrough came around 2015, when I managed to grab a five-year-old, underpowered Core i3 tower with 4GB of RAM sitting in a dumpster outside an office block near my house. After a clean install of Windows and some minor cleaning I had, at last, a second computer I could use as a server.

I didn’t know much about servers at the time, which meant that my first foray was basically an extension of what I’d seen before: using SMB to share a 1TB drive I’d added by removing the optical drive and rescuing a hard drive from a dead first-gen Apple TV. I added a VPN (ZeroTier One; it’s like Hamachi, but good), printer sharing, and VNC access, and pretty soon I was running a decent NAS.

I added a second 1TB drive a few months later (which involved modding the PSU for more SATA power connectors) and some extra software: qBittorrent’s web interface for downloading and managing torrents from restricted networks (like my school’s), automatic backups using FreeFileSync, and a few extra tidbits. I even managed to configure SMB so I could play PS2 games directly from it, using the console’s NIC and some homebrew software.

This was my setup for around four years, and it did its job beautifully. Over time I added a Plex server to keep tabs on my media, and I even played around with Minecraft and Unturned servers to play with my friends. Around 2019, though, I was starting to hit a bottleneck. Using a hard drive as boot media for the server was dog slow, and I had run out of SATA ports for expanding my drive roster. I had been toying with the idea of RAID arrays for a while, especially after losing data to faulty drives. Mirroring was too expensive for me, so my method of choice was level 5: single parity distributed among all drives, tolerating a single drive failure. I just needed a machine capable of doing it. For a while I wondered about buying HBAs, tacking the drives onto the old hardware, and calling it a day. I ended up doing something completely different.

At last, something that looks like a server

In the end I decided that a better idea was to upgrade the motherboard, processor, power supply, and a few other things. I added a USB 3.0 card for external drive access, upgraded the processor from a Core i3 240 to a Core i5 650, got a motherboard similar to the old one but with six SATA ports, and picked up four 2TB video-surveillance drives for dirt cheap, along with a beefier power supply to tie it all together. Around this time I also got a Gigabit Ethernet switch, which vastly increased throughput for backups. It was mostly used, bottom-of-the-barrel stuff, but it allowed me to create a RAID-5 array with 6TB of total storage and gave me slightly more room to expand my activities. Lastly, I replaced the boot drive with a cheap SATA SSD.
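
For those wondering where the 6TB figure comes from, the arithmetic behind picking RAID-5 over mirroring fits in a few lines of Python; the helper names are mine, purely for illustration, and filesystem overhead is ignored:

```python
# Usable capacity for the two RAID levels I was weighing, ignoring
# filesystem overhead and the base-10 vs base-2 marketing difference.

def usable_mirror(drives: int, size_tb: float) -> float:
    # RAID-1/10: half of the raw capacity goes to the mirror copies.
    return drives * size_tb / 2

def usable_raid5(drives: int, size_tb: float) -> float:
    # RAID-5: one drive's worth of space goes to distributed parity.
    return (drives - 1) * size_tb

drives, size_tb = 4, 2.0  # four 2TB surveillance drives
print(f"Mirrored: {usable_mirror(drives, size_tb):.0f} TB usable")  # 4 TB
print(f"RAID-5:   {usable_raid5(drives, size_tb):.0f} TB usable")   # 6 TB
```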

With it came an actual Plex library, a deep-storage repository for old video project files, daily backups, a software library for repairs, MySQL for remote database work, and even a Home Assistant VM for home automation. I kept running servers and making experiments on it. This second iteration lasted me around three more years, which was more than I expected from what was essentially decade-old hardware running in a dusty cupboard next to my desk.

Soon enough, however, new bottlenecks started appearing. I was getting more and more into video work, and I needed a server that could transcode video in at least real time. Most CPUs cannot do that even today, so I was looking into GPU acceleration. I had also started to suffer with Windows: it works fine for beginners, and I even toyed with Windows Server for a while, but it’s just way behind Linux distros for server work. It took up lots of resources doing essentially nothing, the server software I needed was clunky, and I lacked the truly networked logic of a UNIX-like OS.

Enterprise-grade hardware

Once again, I looked to the used market. Businesses are replacing hardware all the time, and it’s not difficult to find amazing deals on used server hardware. You’re not getting the absolute latest and greatest on the market, but most devices are not really that old, and they are more than capable for most home uses.

From the second-generation server I could still salvage the USB 3.0 card, the power supply, the boot drive, and the four RAID drives. All of those had been bought new and were in excellent condition, so there was no need to replace them. I wanted something that would last me at least the next five years, could accommodate all my existing hardware, and had plenty of room for expansion: PCIe slots for GPUs and other devices, proper mounting hardware for everything, and a case to keep it all neat.

I went for a tower server instead of a rackmount, mainly because I don’t have a place for a rack in my home, and the long-and-thin package of most racked servers made no sense in my case. After a bit of searching I came upon the HP ProLiant ML310e Gen8 v2: an ATX-like server with a normal power supply, four drive bays with caddies, an integrated LOM, and even an optical drive (which, given that my server now hosts the last optical drive in the house, was a must). It was perfect. I also managed to score an NVIDIA GTX 1060 6GB for cheap, which is more than a couple of generations behind at this point, but most importantly for me, it has NVENC support, which meant transcoding HD video at a few hundred FPS with ease.
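
If you want to sanity-check that kind of throughput on your own card, a quick benchmark run through ffmpeg is enough. Here’s a minimal sketch, wrapped in Python; it assumes your ffmpeg build has NVENC enabled, and test.mkv stands in for whatever HD clip you have lying around:

```python
import subprocess

# Transcode a test clip to H.264 on the GPU and discard the output;
# the "speed=" figure ffmpeg prints while encoding tells the story.
cmd = [
    "ffmpeg",
    "-benchmark",          # print timing stats at the end
    "-hwaccel", "cuda",    # decode on the GPU too, when possible
    "-i", "test.mkv",      # placeholder: any HD file you have on hand
    "-c:v", "h264_nvenc",  # NVENC hardware encoder
    "-f", "null", "-",     # throw the result away
]
subprocess.run(cmd, check=True)
```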

Building the current server

My third-generation server was built around the aforementioned HP tower, but many modifications had to be made in order to achieve the desired functionality. After receiving it, I swapped the PSU and modified one of the accessory headers to fit the server’s proprietary drive-backplane connector, so I could power the drives from the new power supply. Apart from increasing the maximum load from 350W to 500W, it also gave me PCIe power connectors to drive my GPU, which the previous PSU lacked.

Then, I installed the GPU and encountered my first problem: servers like these use only a couple of fans and a big plastic baffle to make sure the air makes it to all the components on the board. This is a great idea in the server world, as it reduces power consumption, decreases noise, and allows for better cooling, but it also interferes with the GPU: it’s not a server model, so it sits taller than a 2U chassis would allow, and the baffle cannot close. Not to worry though: a bit of Dremel work later I had a nice GPU-shaped hole, which I made sure to keep as small as possible so as not to disturb the airflow too much. The GPU’s shape worked in my favor too, as it redirects air perfectly onto the fanless CPU cooler.

Other than the four drive bays (in which I installed the 2TB drives), there isn’t much room for another drive, so I used the empty second optical drive bay to screw in the boot SSD. A lot of cable management and some testing later, I was ready to go.

For the OS, I went with Ubuntu Server 20.04 LTS. I’m very familiar with this family of Linux distros, so it made sense to use it here; my servers at work also run it, so I had some server experience with it as well. Once the OS was in, I installed the drivers for the LOM and the GPU, and rebuilt the RAID-5 array using mdadm. I had been using Microsoft Storage Spaces for the array in the previous generation, so I had to rebuild it cleanly in Linux. After dumping the data onto some spare drives and building the array on Linux, I was in business.
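
For reference, the rebuild itself boils down to a handful of mdadm calls. The sketch below (wrapped in Python so each step is explicit and fails loudly) shows the general sequence rather than a transcript of the exact commands I ran; the device names are placeholders for whatever your drives enumerate as, and the whole thing needs root:

```python
import subprocess

def run(cmd):
    # Print each command before running it; abort on the first failure.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# WARNING: this wipes whatever is on these disks. Placeholder device names.
disks = ["/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde"]

# Create a four-disk RAID-5 array from the surveillance drives.
run(["mdadm", "--create", "/dev/md0", "--level=5", "--raid-devices=4", *disks])

# Put a filesystem on the new array.
run(["mkfs.ext4", "/dev/md0"])

# Persist the array definition so it reassembles on boot.
scan = subprocess.run(["mdadm", "--detail", "--scan"],
                      capture_output=True, text=True, check=True)
with open("/etc/mdadm/mdadm.conf", "a") as conf:
    conf.write(scan.stdout)
run(["update-initramfs", "-u"])
```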

For the software, I installed Jellyfin (I was getting sick of the pay-to-win model of Plex, and I wanted something with IPTV support), Samba for the shared folders, VirtualBox with the latest Home Assistant VM (don’t @ me about Docker; that version is crap and the supervised install is a pain in the ass, so I’m done with Docker for now), qBittorrent, MySQL, ZeroTier One, OctoPrint, and of course, a Minecraft server. I also installed a few quality-of-life tools like btop, and thus my server was complete, at least for now.

The realities of living with a server at home

Despite my childhood aspirations, enterprise-grade hardware has a major flaw compared to home equipment: noise. I can’t blame it too much; after all, these devices are made for production environments where no one really minds, or vast datacenters where noise is just part of the deal. I, on the other hand, like sleeping. The first time I turned the machine on I was greeted with the cacophonous roar of two high-RPM server fans. It dawned on me pretty quickly that this noise simply would not fly in my house, so I quickly set about fixing it.

Unlike on desktop boards, the OS doesn’t have control over the fans; to the sensor package in Linux, it’s as if they didn’t even exist. I did get some temperature readings, and there might be a way to address the fans via IPMI, but I couldn’t get it to work right. The job of handling the fans falls to the iLO, HP’s name for a LOM: a tiny computer inside the server that allows for low-level remote management, including remote consoles and power cycling. The homelabbing community figured out years ago how to tune the fans down, and a bit of unofficial firmware later, mine calmed down to a reasonable 20%, and I could sleep again. I took the opportunity to put a piece of tape over the POST buzzer, which had no right to be as loud as it was.

Closing thoughts

This has been a wild ride of old hardware, nasty hacks, ugly solutions, and wasted time, but in the end, the outcome is so much more than what I ever envisioned: a device that automates the boring bits of my daily routine, keeps backups and storage of all my stuff safe, and gives me access to all my services from wherever I happen to be. If you’re a nerd like me and willing to spend some time faffing around in the Linux CLI, I highly recommend you build yourself a home server, no matter the budget, device, or services.