Jon Brookes
2025-09-18
In Code, Chips and Control - The Architecture of Digital Sovereignty by Sal Kimmich there is an introduction to where Sal started out in computing. It got me thinking about how I also started out, and what technologies were available to me back then.
For children of the late 60s and into the 70s it was a pretty dark place technologically, until Mr Sinclair and others did what they did to make general computing accessible outside of corporate giants and universities.
So strong was this state of affairs that only a few could get their hands on the technology, or claim to really know what it was or how to use it. It was to a university that I ultimately went, to learn about and be a part of computing technology.
My earliest experiences were of the Z80 chip, in a store room behind a classroom at my school, a comprehensive as it is known in the UK and a former grammar school - a relic of the bygone era of the ‘11 plus’, in which children were graded at age 11 as good enough, or not, to proceed to an academically advanced education.
We were only allowed to observe a teacher ‘programming’ the Z80, and a handful of us were allowed to press a few keys whilst under supervision.
The Z80 was tiny, more of a calculator than a computer. It was pathetic by today’s standards.
A few months after this lacklustre event I was given a 16K ZX Spectrum and, for me, this was the turning point at which I knew what I wanted to do.
Pressing keys on the Spectrum, like on the Z80, printed commands to the screen. Each key on the keyboard literally had programming keywords printed on it, like ‘GOTO’ or ‘THEN’. This was a new concept then and has rarely been repeated since. Perhaps just as well, but it gave the machine a fighting chance, trading a little usability for memory efficiency: a single key press entered a whole keyword, which was stored as a single token rather than spelled out letter by letter, so a program’s keywords took up only a small amount of memory.
The Spectrum keyboard was constructed from a membrane matrix in order to reduce production costs, but the result was keys that felt more like chewing gum than a genuine keyboard, and it was common for heavy use to wear the printed keywords off some of the keys. Nightmare. We of course type words in natural language, in computer programming today as elsewhere, but this was one of Clive Sinclair’s many innovations that set his work apart from others and made it affordable to us mortals.
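To make the memory saving concrete, here is a minimal sketch in Python rather than anything that ran on the Spectrum; the token values are illustrative rather than the real Sinclair BASIC token table:

```python
# A toy illustration of keyword tokenisation: a whole keyword is stored as a
# single byte instead of its individual letters. Token values are invented
# for illustration, not taken from the real Sinclair BASIC character set.
KEYWORDS = {"PRINT": 0xF5, "GOTO": 0xEC, "THEN": 0xCB}

def tokenise(line: str) -> bytes:
    """Replace known keywords with one-byte tokens; keep everything else as text."""
    out = bytearray()
    for word in line.split():
        if word in KEYWORDS:
            out.append(KEYWORDS[word])        # whole keyword -> one byte
        else:
            out.extend(word.encode("ascii"))  # other text stored as-is
            out.append(ord(" "))
    return bytes(out)

line = '10 PRINT "HELLO" : GOTO 10'
print(len(line), "characters typed,", len(tokenise(line)), "bytes stored")
```

On a 16K machine, shaving a few bytes off every line of a program added up quickly.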
For me to relate what I thought and felt about the Spectrum is to place myself in the mind of a young teenager who had little understanding of computing beyond what was available to him in films and science fiction. I had read 2001: A Space Odyssey of course (everyone should) and had gone on to other works by Arthur C Clarke that explored the ‘lost worlds of ..’ and further explained the trans-dimensional journey David Bowman took with HAL, the ship’s computer of the Discovery One.
So the strictures of 16K of memory, with backing storage on an analogue cassette recorder, were not altogether impressive to me. But I tried, very hard, to master what I could.
An entirely unresolved ambition of mine was to write a ‘game’. I now realize it was to be a simulation of space flight, in the style of the final approach in the attack on the Death Star in Star Wars, which for me, as a child watching it in a cinema, was mind altering.
I never did write that simulation and for various reasons have no desire to do so right now. That may change, but I have bigger fish to fry. So it can wait.
If you could afford it, a printer for the Spectrum printed onto a kludgey roll of silvered paper, similar to a till roll on a cash register (though some might say a toilet roll). The characters were basic and often difficult to read, and printouts stored for any length of time would fade, but most of us could not afford the luxury anyway, so that did not matter.
I read most code for the Spectrum from magazines and transcribed it to my own system by painstakingly copying, correcting, failing and sometimes succeeding in writing games and simple programs to count to a hundred. I got as far as games like ‘Frogger’, but had to rely on others to help get that to work. A theme that continues: we all need other people’s help, now and then, to get things done.
Code came as a flavour of BASIC, with machine code in-lined as lists of numbers that a short loop would POKE straight into memory for the 3.5 MHz Z80 CPU to run.
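As a rough sketch of what those type-in loaders did, expressed in Python rather than Sinclair BASIC, and with an address and byte values chosen purely for illustration rather than taken from any real listing:

```python
# The shape of a magazine 'type-in' loader: a list of numbers is written into
# memory one byte at a time, then jumped to. The address and opcode bytes here
# are illustrative, not a real listing.
memory = bytearray(64 * 1024)                  # stand-in for the machine's address space
machine_code = [0x3E, 0x07, 0xD3, 0xFE, 0xC9]  # a handful of example Z80 opcode bytes
start = 30000                                  # an address picked for illustration

for offset, value in enumerate(machine_code):
    memory[start + offset] = value             # the equivalent of POKE start+offset, value

print("loaded", len(machine_code), "bytes at address", start)
# On the real machine this would be followed by something like RANDOMIZE USR start
```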
Naturally I knew nothing of what was going on and had to take vast leaps of faith to have any hope of success. Invariably I was disappointed but sometimes things just worked.
Mostly, I realize now, I would have been let down by bugs in hardware, bugs in software and the sheer instability of a world-first budget home computer. Switching it off and on again was part of any development life cycle, and re-loading hundreds, perhaps thousands, of times from audio tape was also part of the day job.
When I loaded a program I became aware of the sound it made and would have a sense of it working or not. Much the same happened years later when dialling the internet with modems. It is strange how we can hear sounds as patterns and make sense of them. Similar I suppose to reading code or scanning logs. The brain detects patterns and we can sometimes visualise what is going on behind the scenes.
Saving work was fraught with problems. A backing store of audio cassettes was not ‘enterprise ready’ by 2025 standards. Writing things down on a piece of paper would have been the best backup strategy, if I were giving winsome words of wisdom to my younger self back then. Data corruption of tape backups was not uncommon.
My first real computer was a 286 IBM clone, bought when I finally got some income from my first stable job, at a family-owned heavy engineering company making tipper bodies for trucks. I bought it with a loan from my employers. Financially it nearly broke me and it took several years to pay off the loan. The PC chassis stayed with me for some years; I upgraded from the motherboard up, replacing almost every part of the box until only its shell remained. I finally consigned the case to landfill after the Pentium P5, or i586, era and thereafter used tower cases. Laptops are mostly what we use now of course, but I still use tower cases to run virtualisation rigs. Small form factors are the future, I believe; lower power consumption will become the norm and a necessity.
I had started at night school, having left the normal education track several years before as I lacked the funding and any other support to continue. Entering work was the only way for me to gain independence.
At work I was left to use the Telex machine to send and receive messages, only to find that mostly nobody sent messages by Telex. Later, fax machines took off, so thankfully I could forget about Telex. I think today I would enjoy Telex more, as the console is my happy place.
An Amstrad computer with a 5 1/4 inch floppy drive, running Lotus 1-2-3, was the first ‘killer app’ I was introduced to - we kept basic spreadsheets for accounting and business information. Primitive, but functional.
I graduated from my local university, Staffordshire, with a degree in ‘Technology Management’ - a four-year sandwich BSc with Hons. - only to find no work locally besides that of a computer lab technician at the same university. So I did another four years, but as an employee.
We were each given a set of rooms or ‘labs’ in which we were tasked with installing and maintaining PCs, supporting servers, file shares and printers and, above all, internet access via a proxy.
I remember the first time the students came into the labs after, over the summer, I had installed everything on dozens of PCs. It was a miracle anything worked - but it did, and I even had some happy customers. Some were not, and some were horrible, but that is the same with life and we all have our off days. I’ve learnt to understand that things can be going on in folks’ lives that simply mean they won’t be nice, and that’s just that. Sometimes folks are just not nice and never will be. Others are genuinely a joy to be around. Life’s like a box of chocolates, after all.
The first and worst assumption I made at this time was that computer security was something I could look into in the future; right then I needed to concentrate on getting things done and learning computing.
This thinking did not survive much more than a few weeks, as waves of virus infestations swept into the labs on every level of the building.
Mostly these were propagated and transferred by floppy disk, this being the affordable storage medium of the day for impoverished students.
Just some of the common exploits that washed up in the labs I looked after in the 90s were …
Brain (1986): While technically originating in the mid-80s, it continued to be a significant presence well into the 90s. It was one of the very first PC viruses.
Stoned (1987): Another early and widespread boot sector virus that would display “Your PC is now Stoned!” upon infection.
Anti-EXE (1990): This was a stealth boot sector virus that could infect executable files as well.
Form (1990): A common boot sector virus that caused the keyboard to click audibly every 8th time a key was pressed.
Michelangelo (1991): This virus gained significant media attention due to its payload, which would overwrite the first 256 sectors of a hard drive on March 6th (Michelangelo’s birthday), causing data loss.
Concept (1995): This is widely considered the first macro virus for Microsoft Word and quickly became very widespread. It simply displayed the number “1” in a message box, but its existence proved the viability of this new attack vector.
Laroux (1997): One of the earliest macro viruses for Microsoft Excel.
Sharefun (1999): Another Word macro virus that could also spread via email.
Melissa (1999): This was an extremely fast-spreading macro virus that used Microsoft Outlook to email itself to the first 50 entries in a user’s address book, often causing mail servers to crash. It would also insert quotes from “The Simpsons” into documents.
For the most part, people who hacked computers (I include myself in this category to this day) did so with a view to making something better for everyone. Indeed, this is the definition of the activity ‘hacking’, at least in my mind and memory. However, the term was hijacked by the media and became synonymous with a persona that is up to no good and out, ultimately, to do harm to others, often for illegal gain of one kind or another.
If you take time to reflect, remember the common phrase ‘body hack’, to lose weight for example, which is a good thing. So too the terms ‘mind-hack’ or ‘brain-hack’, for remembering a thing or learning a thing for personal betterment.
We tried to differentiate with phrases like ‘white hat hack’ as opposed to ‘black hat’ and so on, but perception is the only reality these days and my understanding or viewpoint on this is likely one for the refuse pile of the past.
Boot sector infections meant someone had just “kicked in your front door”. How you react to that tells you, and anyone observing, a lot about your mindset.
Some did, and continue to this day to, attempt a kind of patch-up operation - a ‘clean up’, if you will, to make good, make do and mend.
Your mileage may vary in this regard, depending on the nature of the exploit and the damage that has been done.
But to be in any way sure, and to sleep at night knowing you have done all you can, the only recourse in IT after a severe security breach is, in my view, to rebuild.
By rebuild, I mean wiping the affected machines and re-installing the operating system, software and data from known-good sources.
This is often a labour intensive, time critical activity which, if left in the hands of unskilled operatives, may mean failure of the service and could pose an existential threat to the organisation, business or venture.
Back in the 90s I think I did resort to the odd quick fix whereby you could apply a script or hack to try and reverse the damage that had been done, but any rational examination and threat analysis would soon convince you to go the whole rebuild route.
I had learnt to program a while back, so it was to code I turned, and code I must.
Using code I could automate, and by automating the installation of software and operating systems, I and the others who worked with me were able to reduce the impact of having to manually re-install everything every time we experienced a security breach.
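What we actually ran back then would have been batch files and disk images rather than anything like this, but a minimal sketch of the idea, the rebuild encoded once as a repeatable script instead of a checklist in someone's head, might look something like the following (the steps are placeholders, not a real lab build):

```python
# A minimal sketch of 'the rebuild as a script': each step is recorded once,
# runs the same way every time, and stops loudly if anything fails.
# The commands below are placeholders, not a real lab build from the time.
import subprocess
import sys

REBUILD_STEPS = [
    ("partition and format the disk",        [sys.executable, "-c", "print('formatting...')"]),
    ("lay down the operating system image",  [sys.executable, "-c", "print('imaging...')"]),
    ("install the standard application set", [sys.executable, "-c", "print('installing apps...')"]),
    ("apply patches and virus protection",   [sys.executable, "-c", "print('patching...')"]),
]

def rebuild() -> None:
    for description, command in REBUILD_STEPS:
        print("step:", description)
        result = subprocess.run(command)
        if result.returncode != 0:            # fail fast rather than limp on
            sys.exit(f"step failed: {description}")
    print("rebuild complete")

if __name__ == "__main__":
    rebuild()
```

The value was never in any one step; it was that the whole procedure could be run again, identically, on the next infected machine.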
We of course implemented virus protection as it was available at the time, poor as it was, but inevitably new variants would come along and drive a bus through our defences.
It is this single paradigm of automation that changed the way I saw computing and support roles of any kind.
The terms devops, secops, SRE and so on are often used, I believe, with the idea that their thinking is in some way new, or a revelation. Yet for us, they had to become a way of life pretty much from the get-go. And this was certainly so if I was to stand a chance of succeeding in anything tech related.
A short list of some of the worms and exploits I had direct contact with in the 90s and 00s …
Code Red (2001): This was a major computer worm that specifically targeted Microsoft’s Internet Information Services (IIS) web servers starting on July 15, 2001. It exploited a buffer overflow vulnerability, allowing it to execute arbitrary code. The worm gained notoriety for defacing websites, often displaying the message “Hacked by Chinese!”.
Nimda (2001): This was another highly impactful worm, released on September 18, 2001, which also targeted IIS servers by exploiting known weaknesses. Its name, “Nimda,” is “admin” spelled backward.
IIS Unicode Exploit (pre-2005): This exploit used malformed URLs with Unicode characters to bypass security checks, allowing attackers to traverse directories and execute arbitrary commands on vulnerable IIS web servers. This “Dot Dot” attack could allow commands, such as listing directories (dir c:), to be run directly from a web browser’s address bar, with the output displayed in the browser window.
The above list may seem a bit one-sided. Perhaps you think I am anti-Microsoft in some way, but I can assure you it is entirely the opposite.
It is because I worked in the Microsoft-dominated world of the 90s and 00s that I knew how important making MS software safe was to economic success.
You can like it or loathe it, but Windows has been around for a long while and is the backbone of most businesses and organisations.
The Mac has its place, and I’ve had the misfortune of having to use Apple’s products in the workplace, but I have mostly disliked their strange approach to the keyboard and their walled garden of distrust for Bash and other GPL-licensed open source software.
I am mostly irritated by Windows, Mac and Linux annoyances but ultimately we just have to get on with it and get the job done.
If you’ve enough money to burn on the latest status symbol and that makes you feel good, go for it. I’m not one to criticise.
But if I’ve learned anything, it is that to make a cup of tea, I do not need to boil the ocean.
Efficiency is key to most things.
The weapons in the armoury we frequently turn to are automation and the centralisation of auditing, reporting and alerting.
Worms, exploits and viruses are let in by us, not the attackers. They just find a way. We often give them that path and they take it.
If we do not prepare for boarders, the ships we sail will be taken. Pirates and privateers have been with us for centuries and that is likely not going to change for a time at least.
So whatever tooling we can use to encode our own and others’ expertise and knowledge, whatever automation we can employ to reduce operating costs, so be it.
After all, the attackers are often using that same approach to break our doors in, so we must adopt that same mindset if we are to stand a chance of surviving.
We all need to learn to code.
Some say that code is poetry. Does a poet ever stop learning or writing poetry? To do so would be to no longer be a poet.
Coding is a craft, not a passing interest or something we can do for a time and then retire to management and so-called ‘higher responsibility’ and ‘richer rewards’. It is a fundamental building block of modern society.
It must become the vernacular, colloquial language that is native to the general population.