I used to make a pretty decent living installing the Linux operating system on bare metal servers.
Nobody does that anymore, or at least nobody brags about it on social media, which is basically the same thing.
It was a good gig. I traveled the world, spent a lot of time in rooms with -really- good HVAC, and became so familiar with the timing of the various hardware and firmware processes that I could reliably run five to ten concurrent installations off a single Keyboard / Video / Mouse (KVM) switch shakily balanced on a utility cart – swapping cables and hitting keys without having to look at, or even connect, the screen – like some sort of strange cyberpunk organist.
Eventually BOOTP and IPMI made the details of the hardware timings mostly irrelevant. Servers were equipped to wake up and look for guidance over the network rather than waiting for some junior engineer to plug in a keyboard or a mouse. I pivoted to making a decent living tuning and massaging the moody and persnickety software that hands out IP addresses, routing configurations, hostnames, and instructions about disk partitioning and default boot orders.
Over time, technology ate that too – so I kept moving up the stack – using tools like Rocks, and eventually Chef and Ansible, to make sure that the servers, now contentedly installing their own operating systems, would also automatically pull down updated code for software services. The very best of these systems developed a certain dynamism – checking themselves out of work from time to time to automatically pull down updates and potentially entire new software stacks.
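The pattern is simple enough to sketch. Below is a minimal, hypothetical Python version of that "check yourself out of work" loop: an agent that periodically asks its own git checkout whether upstream has new commits, pulls them, and re-executes itself. The repository path and polling interval are invented for illustration – the real systems did this with Chef and Ansible runs rather than a hand-rolled loop.

```python
# A hypothetical self-updating agent, sketched in Python for illustration.
# It checks its own git checkout for new upstream commits, pulls them,
# and then re-executes itself so the new code takes over.
import os
import subprocess
import sys
import time

REPO_DIR = "/opt/my-service"      # hypothetical path to the service's git checkout
CHECK_INTERVAL_SECONDS = 300      # how often to "check out of work" and look for updates


def upstream_has_changes(repo_dir: str) -> bool:
    """Fetch from origin and report whether the local branch is behind its upstream."""
    subprocess.run(["git", "fetch", "origin"], cwd=repo_dir, check=True)
    behind = subprocess.run(
        ["git", "rev-list", "--count", "HEAD..@{u}"],
        cwd=repo_dir, check=True, capture_output=True, text=True,
    )
    return int(behind.stdout.strip()) > 0


def main() -> None:
    while True:
        if upstream_has_changes(REPO_DIR):
            # Fast-forward to the new code, then replace this process with a
            # fresh copy of ourselves so the updated stack is what actually runs.
            subprocess.run(["git", "pull", "--ff-only"], cwd=REPO_DIR, check=True)
            os.execv(sys.executable, [sys.executable] + sys.argv)
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    main()
```

Tools like Chef and Ansible wrap the same idea in far more robust machinery (idempotent runs, inventories, convergence reporting), but the heartbeat is the same: wake up, compare yourself to the desired state, converge.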
Eventually, around 2006, Amazon launched EC2, S3, and SQS. At that point it became completely ordinary to have computer programs defining both infrastructure and software. Over time, containerization (Docker), markdown-based notebooks (RStudio and Jupyter), and Terraform further blurred the lines. Then Kubernetes came along and strung it all together in a way that made my cyberpunk-organist heart sing.
Rewind:
Around the same time that I was installing Linux on servers, I maintained an HTML file – housed on a particular server in a particular building in Ann Arbor, Michigan – that linked out to a few dozen or maybe even a few hundred other files on other servers in other (mostly) college towns all over the world. A few dozen, or maybe a few hundred, or maybe even a few thousand people used that file as a stepping stone in their very own personal quest to find information about a thing.
The thing was college a cappella music like the kind in Pitch Perfect. Forgive me.
Then two guys named Larry and Sergey started a company that used computer programs to crawl and index the internet … dynamically generating HTML in response to arbitrary queries. These days the very idea of a -manually- curated list of URLs seems utterly quaint and bespoke. Wikipedia came along as a way to crowdsource a curated reference for a world full of topics.
Now we have Large Language Models (LLMs) like ChatGPT and the rest, which are frankly brilliant tools that provide not just links into documents but dynamically generated, context-sensitive summaries in exquisitely accurate language.
The Future
Here’s my optimism: At no point in this whole story was I “replaced.”
Jobs are certainly going to change. People (and algorithms) who previously brought value by reading, summarizing, and regurgitating pre-existing text … well … that particular job is going the way of my cyberpunk-organist gig from the late '90s. It will be uncomfortable to realize that many things we thought were important and unique – like writing summaries for our managers – are actually fairly mechanistic.
I intend to continue writing my artisanal blog posts for the same reason I always have – regardless of whether anybody else reads these words, I get value from the act of writing. As the old quote (usually attributed to Admiral Hyman Rickover) goes: “Nothing so sharpens the thought process as writing down one’s arguments. Weaknesses overlooked in oral discussion become painfully obvious on the written page.”
I remain optimistic: No matter how clever I ever got about automating computers to install their own operating systems, I was always left with the question of what we were going to use those computers to do. No matter how effortlessly we manage to summarize what has come before, the question of where we are going will remain.
LLMs and other generative AI technologies are absolutely amazing at recapitulating the past but – so far at least – they have yet to be able to direct us into the future.
And that’s cause for optimism.