Make the most of bad tech while it still makes you laugh
Imminent robot overlords will stop the fun soon enough
The golden age of computing? It’s right now. Enjoy it. Better still, take some screenshots to give future generations something to feel wistful about.
For some, the best era in the history of IT would have been when Charles Babbage thought up his difference engine. Not so. All he did was inadvertently establish the fundamental ground rule in the IT industry-to-come that being the smartest man in the room (as opposed to the smartest woman, who would have been Ada Lovelace) gets you nowhere. Nobody likes a smart-arse.
The most important skill in computing is, of course, being able to persuade very stupid people with lots of money to give it to you.
The second most important is to find ways of cutting corners so that, even if the result is a bit shit and doesn’t work properly, you just tell everyone that it does, pay yourself a performance bonus and move on to the next project.
For example, I feel quite sure that if Mr Babbage had agreed to build his calculation machine from yoghurt pots rather than custom precision-turned brass parts, he’d have been on to a winner. And if it didn’t work at the end? Well, users could just address a letter to him via Penny Post requesting technical support, and he might answer the most frequently asked question via the new-fangled telegram:
HAVE YOU TRIED UNCRANKING AND RECRANKING IT STOP
For other people, the golden age of computing was just after Federico Faggin chiselled the first 4004 from the Intel rockface. Nah, boring. Or that brief moment in the late 1980s when nobody was quite sure whether the personal computing market would be dominated by IBM, Apple or ha ha hah Acorn.
As we all know now, Acorn won that battle even though it took nearly 40 years.
Yawn, whatever. Surely the best time is right now. Why? Because computing today is funny. What makes it so? Because, for all its advancement, it’s still as shonky as a 19th century yoghurt-pot calculator… but won’t be that way for long. Argue against it all you like until you are blue in the face, but AI is improving rapidly and very soon won’t be funny any more. Cheer up, Smurf dudes, let’s make the most of what we have today!
It occurred to me that we were living in this golden age of craputing while in the middle of a Zoom* call this week. It was one of those hybrid events at an in-person venue with remote participants joining us via a giant screen. One of those remoting in – an important-looking businessman in an actual office and wearing a suit – raised his yellow hand to ask something. He was called upon to speak and, several minutes later, once he had found the Unmute button, posed the following question:
“Squeak squeakity squeak-squeak.”
That’s what it sounded like to us in the conference room, anyway. The audio quality was perfect but his voice sounded like Pinky. Or possibly Perky.
Now, my younger readers, and non-Brits generally, will have no idea who Pinky and Perky are, so let me explain. They were puppet pigs dressed in clothes, and stars of their own children’s TV show in the black-and-white 1960s. Their prerecorded voices were speeded up to give them that now-familiar pitch-squeak. They would babble with a human host, sometimes recognisably, often incoherently, and even released 45rpm singles covering popular songs of the day.
Somewhere in my cellar I have a Pinky and Perky EP in which they sing The Beatles. Just imagine. The first version of Twist and Shout that I ever heard was neither that by The Beatles, nor by the Isley Brothers, nor even the original by the Top Notes, but the one by Pinky and fucking Perky.
As with all legacy children’s TV properties, the characters have been given a 3D reboot in recent years but I prefer my puppet pigs to be, well, on the tatty side if you know what I mean. Their well-intentioned, unironic crapness makes them funnier. Here’s how gloriously awful they were:
They are not to be confused with Pinky and Perky in 3D:
Despite attempts by the Zoom host to talk the stony-faced businessman through the process of checking his audio inputs, nothing could be done. So the latter just continued with his Pinky and Perky voice, much to our immense satisfaction in the conference room. By the time he had asked his fourth follow-up question, we were pissing ourselves.
By the way, at this same meeting there was another memorable moment – hardly the fault of IT, mind – when during a particularly dull slide deck presentation one of the Zoom participants could be seen to lean forward really close to her webcam. We thought she was going to interject with a comment, whereupon she raised her spectacles, leant in even closer and began picking at her eyelashes.
We were transfixed – as indeed you would be too if faced with gruesome eyelash grinding on a two-storey-high video screen at a conference. It was like watching Eight Legged Freaks in IMAX.
This raises another aspect of contemporary computing that makes it a golden age of ridiculousness: IT has advanced far beyond human capacity to know what it’s for. For you and me, perhaps, that webcam is for relaying live video to another location; for others, apparently, it’s a convenient makeup mirror.
But isn’t this just another example of bad IT after all? I mean, you can send private messages to individual participants in a video call via Chat; so why can’t you speak or webcam privately to them too? Where’s the consistency?
Soon enough, artificial intelligence will sort this kind of thing out with, I dunno, realistic personal avatars nodding thoughtfully on-screen as the PowerPoint drones on, leaving us to use webcam self-view to reapply our eyeliner in peace. That’s not funny at all. There are no laughs to be had when computing starts to make sense.
With this in mind, I thought I would make the most of generative AI tools to have a bit of photo fun while the tech is still rubbish. I promise you, it will be no fun once it gets good at it. Seize the day, my blue-faced pals. And that’s what I did, and I hope you enjoy the results shown below.
The idea was triggered when I received a promotional message from Adobe’s education team inviting me to download a teacher’s guide to using gen-AI tools engagingly as part of a classroom history project. Here’s a screenshot of one section:
I know the image example is small but that bearded fellow on a skateboard doesn’t much look like Shakespeare to me, ‘modern’ or otherwise. I’m guessing it’s Victor Hugo, Charles Darwin or Father Christmas, albeit the last-named is not so well known for his literary output.
Anyway, I thought I could do better than that. So I fired up Photoshop and asked it to generate my own ‘Modern Shakespeare’ using AI. Here’s what I got:
OK, I’m not sure what’s going on with his collar and lapels (is it ‘modern’ fashion?) or why he’s wearing a hat, or indeed why he’s blessing us as if he were the pope. He has a beard, though; Shakespeare had a beard, so that’s a tick. He’s got a lot of hair, which seems odd. Hmm… just to remind you, here’s what we think Shakespeare looked like:
Fair enough, I don’t know what’s going on with his collar either but the man himself was considerably less hirsute. And noticeably hatless. No worries, let’s try again. Here’s generative AI’s second attempt at ‘Modern Shakespeare’:
Until now, I hadn’t thought of Shakespeare as a man with no ears punching himself in the face. Adobe’s correct: this is quite an education. Still, maybe we could try again? Here goes:
Shakespeare definitely had a problem doing up his collar, it seems. And AI is pretty insistent that he should wear a hat although I can’t imagine why. To hide his baldness? Let’s give it another try:
I suspect I am beginning to learn more about Shakespeare than I wanted to. Feel free to use this image for your party games over the holiday break: just show it to your friends and challenge them to guess who it is. The first one to cry “Oh, it’s modern Shakespeare!” wins an extra slice of Christmas cake and six appointments with a psychiatrist.
On a roll, I thought I would let Photoshop’s AI modernise another colossus of English literature. Show me, I commanded, ‘Modern Dickens’.
If there is one thing I have learnt from this educational experiment, it’s that great authors have trouble getting dressed in the morning. OK, as with Shakespeare, this guy has a beard, Dickens had a beard… but is that enough? Again, let’s have a quick reminder of what the real Charles Dickens looked like:
Shall we give AI another go at ‘Modern Dickens’? Why not? I told you this was fun!
Wow! Stupendous beard – and wrists like a gorilla’s, which is what great literary authors need for all that writing they do. In fact, modern Dickens has been writing so much that he’s had to bandage his right forefinger and appears to be holding up the document in front of him with his dick.
Oh go on, let’s have another!
What the (modern) dickens? I get the impression that if you keep asking AI to regenerate the same photo over and over again, eventually it gets annoyed and starts feeding you Gay Shakespeares and Gay Dickenses to encourage you to amend the query. Still, it confirms my suspicion that a Gay Dickens would know how to dress himself.
So hark my words! Stop all this blue-in-the-face fretting! Artificial intelligence is on the cusp of sweeping human intelligence – and creativity, and our jobs – into the dustbin. Before it does, let’s squeeze a bit of fun out of it!
Come on, humes, let’s have a good laugh at shit IT in these final few months we have together before it all goes Skynet!
Alistair Dabbs is a freelance technology tart, juggling IT journalism, editorial training and digital publishing. If he is not taken over by robot overlords by next week, be assured that his column will continue throughout December and the festive season – plus an extra cracker on Christmas Day.
* Other remote conferencing services are available.