
Personal computers are done, and using a 15-year-old computer for a week made me realize it

When I started writing the draft for this article back in March, I was spinning up a narrative that old laptops still have uses as writing machines: devices for distraction-free text composition, especially if you can get a higher-end one with a good keyboard and a decent screen. It was a mostly uncontroversial write-up on my experience using a fifteen-year-old MacBook Pro to write this blog, among a couple of other tasks.

But then I saw Cathode Ray Dude’s video on HP’s QuickLook, and my head was thrown in a flat spiral straight into madness. The A plot is about an absolutely heinous abuse of UEFI, an eldritch nightmare that stops Windows’ boot process to get you to your email slightly faster, but it was the B plot that put my undies in a twist: the second half of the video is a wonderful opinion piece arguing that for most people, computers are pretty much at their endgame. For everyday applications, a computer from a decade ago is indistinguishable from one that came out last year. I highly recommend you watch the whole video.

While I could conceptually wrap my head around it, as the resident turbo-nerd in my group of friends I have been chasing the bleeding edge for years, if slightly hampered by budget constraints. The idea of an “office-use” computer from fifteen years ago still being perfectly cromulent seemed absolutely insane: after all, office work is pretty much all I do with a computer these days outside of my hobbies.

So I set out to prove it, and along the way I stumbled across many new perspectives, a fresh dread regarding late capitalism, and maybe some lessons for the less tech-savvy.

The setup

To put this thesis to the test, the experiment was simple: use an old computer for daily tasks and see how it fares. For this, I chose an Apple MacBook Pro from mid-2009. It sports a dual-core Intel Core 2 Duo P8700, a crisp 1280×800 display with amazing colors, and a surprisingly well-kept exterior. Inside, I made some modifications to improve my chances:

  • The RAM was upgraded from the factory 2GB of 1066MHz DDR3 to 4GB.
  • The battery was replaced, as the old one had died.
  • The original 250GB 5400RPM hard drive was replaced by a 250GB SATA SSD. It’s DRAM-less and I got it for cheap, but it turned out to be more than enough.

This wasn’t some lucky find either; it had sat in my junk bin for a while, and you can find many (usually better) Intel Macs for cheap pretty much anywhere that sells used goods, but it made a good starting point for this experiment. The new parts came cheap too: most were scrounged from other devices, and only the battery pack was bought new. Overall I spent around 100 USD total.

Obviously, this particular computer is officially obsolete; no new builds of macOS exist for it and haven’t for a while. Unwilling to induce a blind rage by wrestling with deprecated software, expired SSL root certificates, and poor performance, I decided to load it with Ubuntu 22.04 LTS as its operating system. There are perhaps better choices performance-wise, but it will do for testing this theory.

The realization

The idea behind all of this was to use this machine as I would my main laptop (which I bought in 2022) for office-related tasks. That essentially means:

  • Writing for this blog.
  • Watching YouTube videos.
  • Writing university assignments.
  • Watching media for sorting my library.

All of this had to happen without significant sacrifices in performance and/or time spent, and while using the laptop to its fullest: on the go, on battery, and with music or videos playing in the background so my Gen Z brainrot wouldn’t get to me.

And yeah, it just works.

Sure, it’s not blazing fast, but it’s perfectly serviceable. Ubuntu offers plenty of productivity applications, and with most services relegated to the cloud, pretty much everything worked without a problem in a web browser. Connecting to my NAS, wireless networking (something old-time Linux users will remember with absolute hate), even the infamous display drivers: everything came preinstalled with the OS and worked out of the box, with minimal CLI nonsense, so even a regular consumer could get this experience without much hassle.

A lesson for nerds

Look, I get it. Computers are fun for us. We like to take them apart, put them through hell and back, create abominations for shits and giggles, and sometimes even turn ourselves into bona-fide data center administrators of our little kingdoms of silicon.

But for most people, a computer is no more interesting than a pen or a saw: it’s a tool.

No matter how much we complain about obscure CLI procedures, or endlessly pontificate about the inevitability of Linux on the desktop, let’s not deceive ourselves: we do this because it’s fun.

So great, most people just want to turn on their computers, use them to do their job, turn them off, and move on with their lives. But that still leaves a question: why is a 15-year-old computer still enough for this? With the relentless push of technology, one would expect a continual state of progress, as has been the case in the electronics sector for many years.

But it isn’t, and we have just demonstrated that you don’t need it. A Core 2 Duo turns out to be more than enough for office-related tasks, and that just doesn’t jibe with our collective idea of “who knows what the future holds?”

The assumptions of capitalism

No matter what your opinions on capitalism are, it is undeniable that our modern society is fundamentally shaped by the forces of supply and demand. Yet most of us seem to ignore that its axioms, the postulates we take as a given in order for capital to do its thing, do not always apply to every industry at all times, especially when they fail to properly account for human nature.

One of these core tenets is perpetual innovation: the idea that as humanity progresses, so does the economy. All market products are bound to get better over time, and the people who can adequately harness new technologies and techniques will be rewarded with capital.

But what if that idea is wrong? What if there is nothing to add to a product? What if we have made something that is so good, that there is no market pressure to innovate?

In classical economics, we would call such products commodities: goods whose value does not depend on their origin or manufacturer. Steel, wheat, and gold are all commodities; steel is steel, and no matter how much you revolutionize the steel industry, people will still want steel plates and beams. You can’t really innovate your way around that.

If you use your computer for writing essays, working on spreadsheets, creating presentations, and the odd YouTube video here and there, computers peaked for you around 2010. At that point, your computer did absolutely everything you wanted it to do and then some. Don’t believe me? Get a copy of Word 2010 and you’ll be amazed at what it can do. Wanted the full web experience? Fully HTML5- and JavaScript-powered pages run fine on computers from that era. Spreadsheets? Have you seen Excel? And of course, if you needed something more, the Linux CLI was very much mature by that point.

The “Office PC” has been a commodity for more than a decade.

Implications and a call to action

This ubiquity of raw computing power gives us turbo-nerds an opportunity: there is pretty much no computer from the last 15 years that cannot be put to some use. Web server? No problem. Minecraft server? Sure, my first one ran on an old Vaio laptop from 2011. NAS? Yeah, especially if it has USB 3.0.

We live in a world where everything is absolutely disposable. Things are meant to be used and then absorbed into the void of uselessness. The idea that our trash goes somewhere is alien to pretty much anyone in the Western world. These computers, however, show that they don’t have to end up there: new life can be breathed into these devices.

So please, if you can, rescue these devices from the landfill. Get some SSDs and some extra RAM and fix them up. Load them with current OSes and software, do goofy things with them, give them out as gifts or sell them for a profit on eBay. Give a Linux machine to your little brother, sister, or cousin. Use them as embedded devices; the sky’s the limit.

We treat devices like the Raspberry Pi as the be-all and end-all of tinkering computers, but compared to any x86_64 computer from the last 15 years, Pis are exceedingly anemic. The form factor is compelling, but I would argue that unless you have truly stringent size constraints, any laptop motherboard can give you better results. If you get one of the better ones, you can even toy with graphics acceleration, PCIe peripherals, and so much more.

Conclusions

Computers have gotten insanely powerful in the last decade, but every generation of new processors, graphics units, and RAM feels less like a quantum leap and more like an incremental improvement: each new stepping stone sits closer to the last one than the previous did.

This is not stagnation, however; it’s just that computers have gotten too good for our own good, and capitalism sort of fails when innovation is not desirable. More powerful computers simply no longer make sense for the average user.

I’m not falling for the Thomas Watson trap here: there will come a time when more powerful computers are a necessity once again, but that time is not now. Now is, however, the time to give life to those old computers. They are not dead, and they deserve new chances at life as long as we can keep them running; there is just too much power on tap to leave it rotting in a landfill for the rest of eternity.

So, if only for a moment, forget about Raspberry Pis, Chromebooks, and NUCs, and go get a laptop from 5-10 years ago: you’ll probably get the same performance for less than half the price, and you’ll save some silicon from hitting the trash before its time truly comes. Computers have become a fundamental tool for communicating and interacting with humanity at large; if we can get them to more people, maybe a different world is possible.


Fast Track C600: Faults and Fixes

A few years ago, one of my high school music teachers came to me with a deal that was too good to pass up. He had just replaced his audio interface and wanted to get rid of the old one, which was, of course, faulty. Having known each other for a while, he knew that I was into that sort of thing and had a decent chance of making it work. The device in question was an M-Audio Fast Track C600, a fantastic USB audio interface featuring 4 mic or line inputs with gain control, 6 balanced audio outputs, crystal-clear 24-bit/96kHz audio, low latency, and S/PDIF and MIDI I/O, along with many other tidbits and little details that make it a joy to use. It was way out of my price range; there was no way I could afford such a high-end device, and yet it was now mine. That was, of course, provided I could make it work in the first place. Today, we’ll delve into the adventure that was fixing it. Unfortunately, I didn’t take any pictures of the process, so you’ll have to take my word for it.

When I got home, I decided to plug it in and give it a shot, not expecting much. Instead of the usual greeting of flashing lights, I was met with darkness. It was completely dead. My computer didn’t detect anything either, so clearly there was a hardware issue lurking inside. After opening it up, I was greeted by a myriad of cables routing lines back and forth between the two printed circuit boards inside, which looked pristine. No charring, no blown capacitors, no components rattling inside the case. The C600, or as the PCB ominously shouted at me in all caps (what I can only assume was the internal project name), GOLDFINGER, looked as neat and tidy as the day it left the factory. A bummer, it seemed. This wasn’t going to be an easy fix.

A breakthrough

And so it sat on my desk, half disassembled, for months. For one thing, I was still learning the basics of electronics, so there wasn’t much I could do at that point. For another, I was just getting into the world of digital sound, and my little Behringer Xenyx 302USB was more than enough for what I was playing with back then.

Then one day, I decided to remove the lower board entirely (this is the one that holds all the important electronics; the upper one just has the display elements, knobs, and buttons, along with the preamps for the inputs, which weren’t really necessary at this point), plugged in the AC adapter (which I didn’t have, but an old 5V wall wart from an old Iomega Zip drive matched the jack and voltage perfectly) and the USB cable, and started looking around the board.

At first, nothing really seemed to stand out, until after a while a smell of flux and solder caught my nose. For those who have never worked on electronics, it’s a very pungent and characteristic smell, usually indicating a component that is running way too hot. I started feeling around with my finger until I found the culprit: a tiny 10-lead MSOP package only slightly bigger than a grain of rice. I didn’t know what it was at first, but it had some big capacitors around it, so I assumed it was some sort of voltage regulator. The writing was tiny, though, and I could barely read the markings on the chip. After much squinting, I came to the conclusion that they read “LTABA”, which didn’t sound like a part name to me. A preliminary Google search came back inconclusive, as expected, even after adding keywords and switching things around.

But then it dawned on me. A few weeks earlier, while hunting components on AliExpress, I had noticed that most sellers write the complete markings of the chip on the listing, unlike vendors like Mouser who stick to the official part name. So I searched our magic word, and lo and behold, there was my answer. Our mystery chip was, as expected, a regulator: the LTC3407, a 600mA dual synchronous switching voltage regulator from Analog Devices. The mystery wasn’t fully solved, however, as the regulator is of the adjustable type, and as such, I had absolutely no idea what voltages I was looking for.

But Goldfinger had me covered. Etched on the silkscreen just a few mm away from the regulator were three test pads, labeled “5V, 3V3, 1V8”. I assumed that the 5V was coming from either the USB socket or the AC adapter, while the 3.3V and 1.8V (voltages very common for powering digital microelectronics) were handled by the dual-output regulator, stepped down from the 5V rail. After a quick continuity check, my assumptions were confirmed. The pieces were starting to come together.

A (not so) temporary fix

For a regulator to get that hot, usually one of two things needs to happen: either a short circuit on the output rail, or an internal fault that requires replacing the chip. I discarded the short theory fairly quickly just by measuring the voltages. When a short occurs, the regulator usually switches off its output automatically and gives a voltage at or very close to 0V. In our case, the output voltages were jumping around erratically, nowhere near the values stated on the board. While this was a relief in the sense that the rest of the board seemed fine, it posed an even tougher question: what was causing this?

For a while I pored over the datasheet looking for an answer. At first I thought it was a problem in the feedback circuitry (the network that sets the output voltage and lets the regulator correct it as the load changes), but that would only affect one of the regulator’s two outputs, as each leg has its own feedback circuit. I also suspected the regulator’s external components (mostly capacitors and inductors), but again, that didn’t explain why both rails were bust.
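For the curious, the feedback circuitry on an adjustable regulator like this is just a resistor divider: per the LTC3407 datasheet, each output regulates so its feedback pin sits at 0.6V, giving Vout = 0.6 × (1 + R1/R2). Here’s a minimal sketch of that math in Python; the resistor values are illustrative round numbers, not ones measured from Goldfinger’s board:

```python
# Feedback divider math for an adjustable buck regulator (LTC3407-style).
# The chip adjusts its output until the divider midpoint equals VREF.
VREF = 0.6  # LTC3407 feedback reference voltage, in volts (per datasheet)

def output_voltage(r_top: float, r_bottom: float) -> float:
    """Regulated output voltage for a divider of r_top/r_bottom ohms."""
    return VREF * (1 + r_top / r_bottom)

# Illustrative dividers hitting Goldfinger's two marked rails:
v_3v3 = output_voltage(450e3, 100e3)  # 0.6 * (1 + 4.5) = 3.3 V
v_1v8 = output_voltage(200e3, 100e3)  # 0.6 * (1 + 2.0) = 1.8 V
```

This is also why a bad divider was an unsatisfying theory: each output has its own independent divider, so one faulty resistor could only ever take down one rail, not both.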

So I decided to quit. I’m not an electrical engineer (yet), and without a proper schematic of the board there was no way I could troubleshoot this PCB with the tools available in my house’s washing room. So I ripped the regulator out (slightly brutally, as this package has a massive solder pad beneath it to dissipate heat, which is pretty much impossible to desolder amicably without a hot air rework station, which I don’t have), went to my local electronics store, and bought a 10-pack of LM317 linear adjustable voltage regulators. This million-year-old component, while trivially simple to install, has a massive disadvantage: unlike the original regulator, which relied on switching the input voltage on and off really quickly, a linear regulator lowers the voltage by straight up dissipating the excess power as heat, which in turn means greater power consumption. That meant both hoping the USB port didn’t trip its overcurrent protection and adding heat sinks (salvaged from an old TV) inside the case with duct tape, then wishing for the best. At least in my mind, this was all temporary. After soldering wires onto the board, adding the passives for setting the voltage, and admiring my horrendous creation, we were ready for a test run.
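For a sense of the trade-off, here’s a back-of-the-envelope sketch. The LM317 datasheet gives Vout = 1.25 × (1 + R2/R1) + Iadj × R2, and a linear regulator dumps the entire input-to-output voltage drop times the load current as heat. The resistor values and the 300mA load below are illustrative guesses, not measurements from this board:

```python
# LM317 output voltage and heat dissipation, straight from the datasheet
# equations. Component values here are examples, not Goldfinger's actual ones.
VREF = 1.25    # LM317 reference voltage between OUT and ADJ pins, volts
I_ADJ = 50e-6  # typical adjust-pin current, amps

def lm317_vout(r1: float, r2: float) -> float:
    """Output voltage for the standard two-resistor divider (ohms)."""
    return VREF * (1 + r2 / r1) + I_ADJ * r2

def linear_dissipation(v_in: float, v_out: float, i_load: float) -> float:
    """Watts a linear regulator must dump: the whole drop times the load."""
    return (v_in - v_out) * i_load

v33 = lm317_vout(240, 390)                 # ~3.30 V with common E24 values
heat = linear_dissipation(5.0, v33, 0.3)   # ~0.5 W at a guessed 300 mA load
```

Half a watt in a fingernail-sized TO-220 with no airflow is exactly why the salvaged heat sinks went in; a switching regulator at the same load would have wasted only a small fraction of that.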

First light, second problem

As I plugged it in, I saw das blinkenlights flashing at me for the first time. I was overjoyed when my computer recognized a new USB device. It was alive at last, but the battle was only halfway through.

For one, it turns out that Goldfinger doesn’t take kindly to USB hubs or USB 3.0 ports. Both official and unofficial documentation warns the user to stay away from both of these apparent evils and stick strictly to USB 2.0. Luckily, my workhorse laptop still includes a USB 2.0 port, which has given me no issues so far.

I had installed the “latest” drivers (version 1.17, dated mid-2014) available officially from the manufacturer’s website, and they gave me issues from the beginning. Instability with the Windows sound API, clicks and cutouts on ASIO, bluescreens if unplugged, bluescreens for no reason at all, poor hardware detection, you name it. After digging through what’s left of the M-Audio forums, I found a post suggesting rolling back to a previous version of the driver, which unfortunately went unanswered. So I gave it a try, downloading version 1.15 (also available from the drivers site) and installing it instead. And at last, it worked.

A quick review, finally

So I’ve been using this interface for about a year now, give or take, and it has been a dream to work with. I’ve used it to record both live gigs and snippets and experiments of my own creation, and even used it a few times for livestreaming.

For me, it’s a perfectly adequate device for the kind of work I do, especially for free ninety-nine. The user experience could use a tweak or two, especially the squishy knobs and the weirdly sensitive gain pots, but the build quality is solid, the connectors are a joy to use, and the included software is finicky but powerful, if you’re willing to respect its quirks.

Closing thoughts

While this turned out to be a massive project, both in time and scope, I learned many important things. First and foremost, never turn down free stuff, even if it’s broken: most people throw things out even when the fix is simple, and repairing them is good for the environment and usually cheaper than buying new. Second, just because the device you’re trying to fix uses some high-speed component doesn’t mean a 50-year-old part can’t stand in for it.