Computers and the older generation
Figure 1: "We are made wise not by the recollection of our past, but by the responsibility for our future." – George Bernard Shaw
Soon we'll be starting research and interviews for a Cybershow mini-series featuring older people who are technology experts in their 70s and 80s.
One of the enduring lies we are told about digital technology is that "old people have trouble with it". Even if partially true, it's not for the reasons you probably think. At a time when the demographic centre of Facebook is passing middle-age, and Gen-Z are throwing aside their smartphones to go to church or the football park, we should examine this myth closely.
That claim, circa 1990, that your "granny doesn't understand computers" was as much a cultural invention as a reality. It suited a broader social project to cast those who were critical, unwilling, or simply uninterested, as being "unable". Such large-scale social engineering was driven by a national need to automate, then to up-skill. It remains highly relevant today in the current wave of political skulduggery around managing diminished employment and living-standard expectations as "AI" is foisted onto the populace.
It brings up many difficult questions about our relation to technology around:
- Work and employment
- Pastimes and interests
- Ability, learnable skills and time
In the 80s I attended "computer clubs" where we wrote BASIC code and soldered chips together. I was not yet a teenager, but the people running the clubs were distinctly greying fellas in their 50s, 60s and 70s who delighted in seeing younger people take an interest. There was an overlap between electronics and computing and other "geeky" interests such as mechanics and model making, amateur radio and so on.
Most enthusiasts came from backgrounds in computing and technology already. Some readers may be surprised to discover that computers did not magically appear with the "home/micro-computer revolution" of the 1980s. My mother had worked at IBM in the 1960s, and it was she, not my father, who bought the Sinclair ZX for my 1980 Christmas present. The first computer game I played was designed and written by a 50-year-old air-force captain who was my neighbour, using a fridge-sized terminal he had at home for "government work".
A striking aspect of early computer culture was that it appealed to all ages, and this age agnosticism fostered teaching and knowledge sharing that didn't happen in schools. As kids already immersed in a world of science-fiction we were naturally interested in computing. The Cold War and changing global economies of the 1980s set governments upon ambitious projects of digital literacy. In Britain the BBC ran nightly educational programmes and we could download apps via Teletext.
Suddenly, almost from nowhere, came a message that older people would need to "catch up" or be "left behind". As far as I was concerned, as an attuned, sociable and observant 12 year old, this had no basis in reality. It was older people who understood technology, and if you paid attention to them you could learn a great deal.
Soon a bizarre, patronising prejudice took root. Because of some intrinsic deficiency in their DNA, older people, we were told, needed help with computers. Interestingly, many adults happily played up to this act. At school my precocious talents were lionised. My friends and I, who could operate pocket calculators, were "whizz kids".
I was called into other classes to "fix" the computers that teachers claimed they could not operate. I was scandalously left in charge of the first computer systems that held school records, back when almost everything was still paper. I was the teenage "IT department".
What was happening during those strange times in the 1980s was "youth-worship". There is a natural tendency of tribes to periodically ascribe undue power to youth, usually as a sacrificial prelude to collapse or war. It's there in the new-broom politics of Pol Pot, Mao and other revolutionary progressive (or at least radically anti-conservative) movements, to eradicate established values, usually in service of an ascendant ruling class.
However, I noticed then as now, that the reluctance of adults was never rooted in aptitude or ability. Many of the older people I knew as a pre-teen were simply sceptical or uneasy about the "digitalisation" of society. Instead of offloading responsibility to young people with glib remarks like "you are our future", they aired concerns clearly and distinctly.
One was my woodwork teacher. Knowing I was an unusually bright kid, he attempted to caution me. There were, he said, "enduring and timeless skills, and ephemeral fancies". He suggested my love of mathematics was something I could "rely on", but perhaps the idea that computers would "be the future" required scepticism. I mocked him ungratefully. Half a century later, with my educated friends having sent their children to study plumbing, brick-laying and cookery instead of to university to indulge in increasingly precarious subjects, I concede: you were right, Mr Brown, and I was wrong. Computing has in many ways been a mis-step and an empty promise.
Adolescence requires healthy individuation. That process of separation, to make a fresh psychological space for oneself, changes with each generation, but it is a wave Capitalism has learned to ride since at least the 1950s with the invention of Rock'n'Roll and "the Teenager". Looking back it is embarrassing how easily we were recruited and how easily our young minds, hungry for praise, were turned.
The claim that "The kids are all doing it" is both a predictable and tiresome appeal to the avoidant vanity of proto-adults still to face maturity. The truth is, when it comes to "AI", young people neither know nor care what technology is behind their convenience, which is merely a way to cheat at a life that is stacked to cheat them. It is a matter of survival, no more. If anything they are dimly aware of the hypocrisy of chastising an oil-guzzling boomer generation for handing them a broken climate while their own use of smartphones, datacentres and "AI" similarly constitutes a planet-killing energy drain and source of pollution. They've been sold the same "efficiency" lie as the rest of us.
But this youth-worship is not innocent or without harm. It creates an unhealthy relation of inverted responsibilities. It's something we see in dysfunctional situations such as immigrant families where older members do not speak the language of their naturalised home, relying instead on their kids to translate official and legal affairs. These children are forced to grow up too soon, dealing with the police, health and social workers on behalf of their parents. The phrase "digital natives" must be seen as particularly pernicious in this regard. The outcome of the Netflix series "Adolescence" is worth holding in mind here.
A key takeaway here is that painting "inability" to use technology as related to older age is a way of ignoring or dismissing the huge difficulties young people also have with it. Romanticising the young as "whizz kids" should not be acceptable. Indeed, singling out any "special disability" groups and labelling them as defective is a great way to dodge the unpleasant truth that everybody now faces great difficulties with technology. Much of that difficulty is coping with the moral injuries it inflicts.
What we have discovered during research and interviews with The Cybershow is that young or old, people are equally curious about, adept with, or disaffected by technology. However, they are treated differently, by custom, culture and the law, in bizarre and incongruent ways, many of which feel distinctly Victorian.
Children, it turns out, are far more terrified and disturbed by creepy technological advances than adults, who have built up some fortitude and understanding of the world. Conversely, many of the reasons for which we rightly want to ban smartphone use for children are also perfectly applicable to adults.
Tearing down these dangerous ageist assumptions is now an urgent task because we face a new problem. "AI", which marks the politicisation of neophyte cultism, is plunging us into a "leaderless society". The 2023 Bletchley Park Summit was a sham because it was a nakedly corrupt convention of "AI" investors.
The monster now on the horizon is not post-industrial unemployment, but a "post-political" era following from an "electronic coup". Standing against this prospect we have an out-of-touch cadre of politicians who are dodging responsibility, even thought itself, in the face of complexity.
Older people have seen more technological change in their lives and have a better idea about the effects of that change.
It is not because she is a "Luddite" that my mother passionately hates the "cashless society" that means she cannot park her car, or book a doctor's appointment while standing right in front of the doctor… it's because these are patently stupid and regressive ideas. Our inability to reflect on and challenge the failures of technological society is one of many modern 'deep' problems.
What we face now is not so much that bad decisions are made, but that decisions are failing to be made at all. Moreover, our technology culture is obstinate and intransigent. We seem to lack the capacity for self-examination, for to do so and discover failings would upset too many profit interests or egos entangled with a one-dimensional conceit of progress.
Based on what I've seen of the youngest, "Gen-Alpha", radical revision and re-examination of the technological project is alive and vibrant. It is conspicuous how they are addressed by propaganda/influence and how their voices are being muted. But it is clear that reluctance is not coming from just the "older generation". We are united across ages in a shared interest in technology.