Mark my words, GenAI critters… if you know what I mean
“It looks like you’re typing an obscenity. Would you like help?”
“Speak the language of your audience” the traditional saying goes “and they won’t have to run it through DeepL.”
The words we use are important because, in general, people tend not to remember more than a few key phrases here and there – “takeaways” in modern parlance. Summarise, for example, the advice given by Deborah Graham, psychic (it says here) and author of Guide to Attracting and Keeping Your True Love.
“You have to pull the weeds away from your heart and soul, and keep pulling,” she urges readers. “If you can create positive, vibrant energy within yourself, you will emit positive vibrations. But if you believe that life sucks, it will suck, and you’ll never find what you’re looking for.”
So the key takeaways are: keep pulling, emit vibrations and it will suck. Got it, thanks.
To be fair on Ms Graham, she was plugging her book long in advance of St Valentine’s Day when I learnt this. I remembered what she wrote just now only because of the paucity of April Fool articles in the media this week. This in turn reminded me how proxy browser service company and serial publicity bullshitter Geonode released its own April Fool story into the press in early February too, some six weeks before the competition, or indeed anyone else with a calendar, might deem appropriate.
Geonode’s joke concerned a set of (obviously non-existent) augmented reality spectacles that boasted “85% accuracy in predicting love lives”. You wear the glasses as you live your day-to-day life and they remember everything you say and do, while an algorithm mashes this into a profile of your ideal romantic partner. The specs then flash messages before your eyes, suggesting topics of conversation and ideas for what to do on your dates, thereby hopefully postponing the moment when your would-be love interest comes to realise how dull you really are.
Some websites tragically republished the article as-is, and who can blame them given that 1 April was still two months away? The rest of us just looked at the email header and thought “Oh God, it’s Geonode again.” And it would have occurred to us that a pair of algorithmically enhanced AV glasses that can determine what kind of partner you fancy is overblown. What’s wrong with beer goggles? Put a pair of these on and everyone looks like your perfect partner.
Besides, anyone back at the Geonode office watching a feed from the webcam attached to your AV specs as you chug over “latina hamster” vids could probably work out your romantic preferences without the aid of an algorithm.
Annoyingly, Elon Musk’s Neuralink didn’t bother waiting for 1 April either when it announced in February that it had successfully implanted a chip in someone’s brain to allow the patient to move an on-screen cursor with their mind. Darin Parker of adult entertainment platform CamSoda immediately issued an open letter inviting Musk to collaborate on a “hands-free self-pleasure” product.
Maybe Neuralink, CamSoda and Geonode should get together and make it a threesome. Or even a foursome, bringing some more (real) tech into the orgy, courtesy of researchers at the University of Utah who think they’ve found a way for contact lenses to record telemetry powered purely by a flexible on-board silicon solar cell and, oh yes, human tears.
If only you could see what I’ve seen with your eyes, eh?
You may have noticed my use of the expression “to chug” earlier. I’ve only heard this word used back in Scotland but perhaps it’s in use elsewhere. If you need me to explain what it means, I’m afraid I’ll have to use further onanistic euphemisms, such as “manual override” or “shucking the corn”. There are plenty more where those came from; feel free to consult Roger’s Profanisaurus.
Using everyday slang, rather than just dry, official words, is part of what “speaking the language of your audience” is all about. It’s something potentially useful that programmers designing chatbots have begun to pick up on, and heaven knows they need to. If you have ever tried to “clearly state your support problem” in one of those pop-up chat windows at the bottom-right corner of a website, you will know what I mean. Unless you accidentally happen to type in the kind of jargon that the systems team itself uses, the chatbot will think you’re typing in Martian.
Worse, language is a moving target as people insist on using words and expressions incorrectly, repeatedly, until eventually they become accepted as the normal way of saying things that in the past used to be much simpler to express.
The first proper Londoner I ever met was a student living on the floor below while I was at university in the Midlands, and he had a habit of talking in what I can only describe as London Arse. Like Nick Hornby in Fever Pitch, this guy became increasingly Londonish as time went on because it was expected of him. We’d go food shopping, pick up some apples, then collect some pears, and ask him to tell us what they were just to hear his accent. So he played to that by devising fantasy cockneyisms to entertain us, such as “Aw-wight, do us a lemon, John!” (we never worked out what that meant) and “Ah’m buffing me old bamboo” (i.e. chugging).
Chitty Chitty Bang Bang was never the same again. I knew I wouldn’t be able to look Dick Van Dyke in the eye after that.
Accent, slang and euphemism detection is going to be even more important in the age of GenAI. As things stand, the various so-called “AI-enhanced” features that I see added to the apps I frequently use are hopeless because of their inability to understand what people are actually saying. If I send the transcript of a meeting through such AI routines to organise it into searchable sections under themed headings, I get back a mountainous heap of irrelevantly reorganised bollocks. The AI will always pick up on the tiniest tangential comment and turn it into a Mighty Chapter of Significance in my meeting report, so I end up having to do it myself anyway.
As for such AIs understanding Scottish slang, forget it. If they’d made Spinal Tap in Aberdeen using AI actors, Nigel Tufnel’s amps would never have made it to eleven.
Just imagine when AI is let loose on video recordings of long, boring meetings or training sessions, automatically editing the footage down to just an essential 10 minutes of highlights. Sounds fantastic, eh? Except you can be sure that seven minutes of those highlights will be zoomed-in footage of the attendees picking their noses, as that’s what the AI thinks the meeting was about.
And it’ll be our fault for not enunciating clearly “And now let’s move on to the next item on our agenda, which is…” rather than muttering “Oh for fuck’s sake, is there anything else?” while the grid of deadpan poker faces in front of us is interrupted only by the occasional nostril-bound finger.
On the other hand, imagine what it will be like when AI does catch up with the way we speak. It’ll be a riot. Personally I cannot wait to tell my computer to go chug itself… and have that computer understand.
Alistair Dabbs is a freelance technology tart, juggling tech journalism, training and digital publishing. He would like to try out those Utah Uni smart contact lenses but worries that if the weather turns wet, everything he has ever seen – things you people would never believe – might be lost, like tears in rain.