It's Only A Game: Lara Croft Won't Save Enterprise Tech – But Jet Set Willy Could

'Member the Apple II? Hobbyists' kit is where business IT begins


Holy heat sink, Jet Set Willy – could you be our saviour?

Column The twin planets of business and consumer technologies have been locked in a game of Pong for decades. The Apple II was aimed at hobbyists, but catalysed the revolution that put a PC on every office desk.

The GUI needed hardware so expensive it could only come in boxes with corporate-sized price tags, until the Atari ST and Amiga brought it home. And GPUs grew out of gaming – what other use could they have? – until first supercomputers then more modest devices found them excellent for analytics.

As with any cross-species DNA swap, it's hard to predict when the next one will happen, let alone foretell its revolutionary effects. We are so blasé now that developments in the most powerful consumer toy, the games console, seem unremarkable even when the raw figures would make the most cerebral computer scientist from the 1980s weep with desire.

Last week, Microsoft said its next generation of Xbox, the Series X, will have a 12-teraFLOPS GPU, news which merited a few paragraphs of gamer copy. This is as fast as ASCI White, the fastest supercomputer in the world at the turn of the millennium, but frankly as dull as ditchwater otherwise. Nobody really cares about Lara Croft's framerate, or her latency at 8k resolution. It matters as much as RGB LED lighting effects on RAM chips. (Which, my god, are still a thing. They're called Vengeance. Can we just not?)


Yet quietly and with a fraction of the budget Microsoft spent deciding to name the Mark Four Xbox the Xbox Series X to differentiate it from the Mark Three Xbox's Xbox One X – expensive people spend a long time on such acts of genius – a tiny consortium of retro-geeks have pulled off a trick that has eluded business IT for generations. Let's back up and ask what really ails the real world of biztech.

You've done your time in the trenches, so you know the score. You know the real battles in the backrooms of corporate IT, the real lines drawn on the dry-wipe whiteboards over endless cups of indifferent coffee as last night's hangover dissolves under the strip lights of endless meetings. Nobody's asking where the next 12 teraFLOPS are coming from, more what to do about the old stuff. The legacy hardware and software that's churning away behind layers of cruft, the business-critical databases and frameworks that sit on soon-to-be-end-of-lifed boxes and versions of Windows even your gran has given up on.

In serious cases, the heart of the beast is a tangled chunk of code many, many years old, written in a language indistinguishable from Sanskrit and somewhat less logical. Refactoring is like defusing the contents of Beaufort's Dyke (PDF), the last four CIOs having successively signed off on just enough expensive bodgery to keep it going until they retired. Legacy hardware. Legacy software. "The Migration." These are the true horrors of our world.

So what if there were a consumer toy costing a couple of hundred quid max that was a living, breathing example of legacy system design not just kept alive, but thoroughly revived in a new form, revitalised with new features, new integrations and a future roadmap? Ladies and gentlemen, I present the ZX Spectrum Next.

What can a 1982 revival have to do with 2020's IT woes? It's easy to dismiss it as a passion project, or as yet another of the recent crop of boxes that cram a zillion retro games of questionable licensing onto a Chinese blob chip, but it has two features that make it much more interesting.

Array with that sort of thing

The first is the seamless, or at least seamless-ish, integration of many badly documented generations of successive software and hardware into a single platform, by combining the specialist knowledge of a small team of precisely the right people. You can move along the timeline from the monochrome block-capitals tape-loading world of 1980 (and before – it has CP/M) to a networked wireless HDMI accelerated system with proper support for all manner of contemporary tech. It invites experimentation and further development.

And it does this through superb use of the second feature, its FPGA chip. Field Programmable Gate Arrays aren't new, and they have had an important role in niche markets. But their full potential, although recognised and often promoted, has never been fully realised. Because they are at heart a sea of digital logic hardware that can be reconfigured in software, they seem to offer the perfect performance solution for any hard task.
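That "sea of logic" idea can be made concrete with a toy model. An FPGA's basic cell is a lookup table (LUT) whose contents are loaded at configuration time, so "reprogramming the hardware" really just means writing different truth-table bits. The sketch below is purely illustrative Python – no vendor toolchain, no real bitstream format – but it shows how the same cell becomes different logic depending on its configuration:

```python
# Toy model of an FPGA lookup table (LUT): a truth table held in a tiny
# memory. "Reconfiguring the chip" means loading different truth-table bits.

def make_lut(truth_table):
    """Return a 'hardware' function defined entirely by its config bits."""
    def lut(*inputs):
        # Treat the input signals as a binary index into the truth table.
        index = 0
        for bit in inputs:
            index = (index << 1) | int(bool(bit))
        return truth_table[index]
    return lut

# Configure the same kind of cell as two different gates.
and2 = make_lut([0, 0, 0, 1])          # 2-input AND
xor2 = make_lut([0, 1, 1, 0])          # 2-input XOR

# "Routing": wire configured cells together into a half adder.
def half_adder(a, b):
    return xor2(a, b), and2(a, b)      # (sum, carry)

print(half_adder(1, 1))                # -> (0, 1)
```

A real FPGA is millions of such cells plus programmable routing between them, but the principle is the same: the function is data.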

Running software on general-purpose CPUs always involves inefficiencies. Tasks have to be recast in general terms through layers of software abstraction, which takes longer than dedicated hardware would. But dedicated hardware is very expensive because, well, it's dedicated: no large market means very high per-unit costs compared to general-purpose CPUs.

FPGAs promise software that becomes hardware. FPGA supercomputers exist and work very well – sometimes. It turns out that programming FPGAs efficiently is hideously difficult and time-consuming; there is no general-purpose optimising compiler technology. Every new generation of processor-intensive task looks hungrily to FPGAs for salvation, the latest being convolutional neural networks – but with scant reward. Outside digital signal processing, where the basic mathematical blocks can be highly refined and reused, FPGAs haven't caught fire.
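The digital signal processing exception is easy to see. A fixed-coefficient FIR filter is nothing but the same multiply-accumulate block repeated in a regular pattern – exactly the refined, reusable structure that drops neatly onto FPGA DSP slices. A minimal software sketch of that structure (plain Python, for illustration only):

```python
# A fixed-coefficient FIR filter: the kind of regular multiply-accumulate
# (MAC) structure that maps well onto FPGA DSP blocks, because every tap
# is the same small building block repeated in a fixed pattern.

def fir_filter(coefficients, samples):
    """Convolve samples with fixed taps, as a chain of MAC operations."""
    taps = len(coefficients)
    output = []
    for n in range(len(samples)):
        acc = 0
        for k in range(taps):
            if n - k >= 0:
                acc += coefficients[k] * samples[n - k]  # one MAC per tap
        output.append(acc)
    return output

# A 3-tap boxcar (sum) filter over a short pulse.
print(fir_filter([1, 1, 1], [0, 3, 3, 3, 0]))  # -> [0, 3, 6, 9, 6]
```

In hardware the inner loop unrolls into parallel multipliers feeding an adder tree, one sample per clock; in general-purpose software it stays a loop. That gap is precisely what FPGAs exploit when the workload is this regular – and what they struggle to exploit when it isn't.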

But what they can do, as evidenced by the ZX Spectrum Next, is build on the mantra that "anything you can do in software you can do in hardware and vice-versa" and provide a custom mix of hardware stack support and flexibility for a specific target. Not only that, they can do it with a very small team over a reasonable timeframe; the Next did take longer than people would have liked, but the team was two core people plus five associates doing it on the back of geek funding. As a proof of concept, it's entrancing.

Can FPGAs help solve your most intractable IT miseries? That's a question only you can answer, but it's one you might not even have asked until now. It's a new spectrum of possibility – and the Next wouldn't be the first hobbyist toy to reboot a whole technology. ®

