July 13, 2022
by Frank Macdonald
We have a thing in the house that reminds us twice a day to take our drugs.
It’s probably the kind of thing drug pushers would like to provide to their street-trade clients, except those clients would probably pawn the thing for the price of a fix.
Our thing can do other things besides keep us drugged for our own good, including give weather reports or find an Irish music radio station.
The thing is called a Google Assistant.
Unlike Amazon’s Alexa or Apple’s Siri, a Google Assistant doesn’t pretend to have a personality. It’s just a stupid digital machine that happens to know more than I do. That’s why I chose it over those other corporate things with people’s names.
Calling it Google Assistant lessens the chances of my becoming attached to a thing the way one might to a thing called Alexa or a thing called Siri, in the same way one might grow fond of a dog named Fido or a cat named Pussyfoot. My personal detachment allows me to abuse my Google Assistant when it doesn’t deliver the answer I want to a question I asked. It’s just a thing.
One would think that having an artificially intelligent thing infiltrating the living quarters of real people would be a profitable enough intrusion.
Yet according to rumour, or artificially intelligent sources, these things can also eavesdrop on household chatter. When questioned, the corporations turn elusive: the explanation offered isn’t a flat-out denial but a claim that the listening helps give our digital things a more convincing conversational tone. Apparently, that would make us more comfortable engaging with our artificially intelligent things. Which makes them the corporate version of Canada’s CSIS or the US’s NSA.
The unfortunate thing about people who tinker with things like this is that they can never sense when enough is enough.
In the latest development of these things, people may soon find themselves being spooked by the ability of Alexa or Siri or even my Google Assistant to mimic voices familiar to the family members of any given house. Your late grandmother’s voice, for example, or a family member you buried yesterday.
If the corporation in question, Amazon, for example, is given access to less than one minute of a late relative’s voice, Alexa’s voice can become that voice. Such a talent for mimicry can have far-reaching consequences as the artificially intelligent world reaches out from the grave.
Instead of a few musical notes reminding me that it is pill time, I could suddenly awaken to the sound of my father’s voice at the foot of the stairs screaming that if I don’t get a move on, I’ll be late for school. It’s enough to make the hairs stand up on a head that hasn’t had a hair on it since about 1968.
I assume that the developers, who have all the sensitivity of an artificially intelligent machine, think this is a great idea, that people will thrill to hear the voice of a dead mother, brother, or grandparent summoning them to the phone to speak with a bill collector, or reminding them of yesterday’s undone chores, including the need to cut more kindling for the coal stove that hasn’t been in the kitchen since about the time my hair stopped standing on end even with the assistance of Brylcreem.
There may be a massive sentimental market that these corporate states have tapped into, sending their researchers out to graveyards where some headstones have an embedded tape recording or CD installed so visitors to the grave can listen to their Master’s Voice. There will be people rooting through their attics for ancient reel-to-reel tapes of Mom and Dad singing an almost intelligible Gaelic song.
Mostly though, I foresee family gatherings, usually in a summer backyard around the BBQ, eating and drinking and drinking and drinking their fill, then returning to the house to turn on all the artificial things containing the ancestors’ voices. The result: they scare the bejesus out of themselves with a séance summoning all the dead relatives to all the digital things around the house, listening to them continue fighting beyond the grave over the fact that someone forgot to leave a will, or to someone else calling his dear dead wife by another name: “And who exactly is Siri?!!!”
Late for school or not, I think I will leave my father to rest in peace, for both our sakes, and listen instead to the musical notes that tell us it is pill time in our (still) un-haunted home.