Digital Inclusion Coffee Mornings
Figure 1: "Chat over a nice cuppa"
We speak to many and varied people at The Cybershow and keep a close eye and ear on what regular folks say about technology.
This month we are joining a Digital Inclusion Coffee Morning in Portsmouth. For this event most of the audience will be older people. We're keen to get some new input. This thread focuses on older people's observations, complaints and wishes. Based on earlier encounters we've identified issues around:
- condescension
- complexity
- fear and crime
- trust
- being heard
- education
- effort required
- isolation
- coercion
- ethics
Fact finding and advice
We use Digital Inclusion meetings to listen carefully, analyse and discuss what people find difficult and distressing, and hear their ideas on how to improve things. We can also use some time for questions and answers, offering practical advice and education on:
- sources of independent help and information
- device settings to change
- alternative software, behaviour and technologies
- issues to take a stand on
- questions to ask about technology
Condescension
"Don't make this just about age"
Many older people are very unhappy to be treated as a special group at all, and feel patronised and talked down to about technology. But that is only because everybody is talked down to about technology (the young don't notice because everyone talks down to them about everything all the time). This is a huge cultural obstacle to face.
A popular but mischievous contemporary frame is that "old people need help" with technology. It's addressed in these two blog posts on age and tech and digital maturity.
As a broad claim, we find that older people:
- are more reflective
- have seen more technology change in the world
- have more experience of the meaning of change
- are (and feel) responsible for making the world we're in
- feel a greater duty to shape a positive future
People use technology in different ways and have different needs which change through different stages of life, career, wealth and health. Inappropriate and even dangerous technology results when it is not thoughtfully chosen, and when one-size-fits-all solutions are forced upon the types of people "expected to use" them. As an obvious example, consider the use of money:
Older people use financial technology for more serious things.
- significant savings
- health and insurance
- wills, home sales, pensions
Younger people attend social events and spend casually in smaller amounts.
We therefore expect that in a trade-off between security and convenience, younger people would choose convenience whereas older people would favour security. No one fixed system can offer both. In a natural market we would expect banks to offer a diverse range of product choices. In reality they try to funnel every customer toward the same industrial, "efficient" solution.
This is just one example, but it shows how we need to make technologies that are more:
- flexible
- diverse
- inclusive
- interoperable
Yet we still talk of a "banking industry" - that is to say, commercial banking is industrialised. Given modern technology it could be, and if you can afford it occasionally is, highly personalised. Banking is just one example amongst many life activities affected by our unsatisfactory approach to designing technology with the breadth of human life and politics in mind.
Complexity
"Tech is too complex."
It's another way of saying technology has "bad usability". This is often heard in the context of security: over-complex security fails. Complexity arises from wannabe "features" (things we don't really want, but some 'bright spark' thinks are cool). Engineers call this "feature creep". It's often driven by out-of-control creative marketing and "innovation"; for example, operating system makers force "AI" on people whether they ask for it or not.
In 1980 my radio and TV had three or four knobs. A five-year-old and a ninety-five-year-old could work them with equal ease. My washing machine had four controls. A kettle or toaster needs a single switch.
Device usability, in the sense of immediate affordances, has regressed over the past 40 years. Finding domestic appliances that aren't over-complex internet-connected spying devices which incidentally do something useful is a challenge today. It takes my mum 10 minutes and 4 different remotes to switch on her TV.
Fixing complexity
None of this happened because engineers got more stupid.
Whereas it once took a button press and less than a tenth of a second for the TV to switch on, it now takes seven steps and as much as half a minute to "log on" to the BBC iPlayer. Those steps add no functionality. They add no security for the user. They are there to make sure that viewers have paid for a TV license and to stop viewers in other countries watching BBC content. They are security for the service provider. Which is pretty much an indirect way of saying they are security against the user. Almost every other service on the internet now comes with similar baggage, additional cost and labour that is offloaded and imposed on the user, and against the users' interests.
As a rule, engineers still design things to be as simple and efficient as they can. But then UX gurus, marketing and PR, publishers, optimisers, engagement experts, middle-men, financial schemers, "intellectual property" lawyers… all come along, seize control and ruin digital products.
Should the law stipulate basic simplicity requirements, like those for accessibility? Probably not. Would that kill innovation and dumb down all technology to the lowest common denominator? Whatever you think, it's clear markets have badly failed to balance the rights and needs of customers to "have stuff that does what they want".
The central crux of utility has reversed. Things are designed for the utility of the supplier, and since we now have monopoly suppliers, the end user is told "take it or leave it". Increasingly the wise choice for your online security, peace of mind and mental health is to leave it.
Apple led with advances in user interfaces (UI). But Apple took a domineering and sometimes disdainful approach to users, where "stuff just works" - but at the price of only working the way Apple says it must. Apple offers a simplified "walled garden" but then exploits its closed ecosystem so that almost every aspect of computing is leveraged to extract more money.
From a security point of view I often recommend Apple to much older users who have plenty of money and very few technical needs.
The problem is not actually too much control, but the lack of it. Tech follows the whim or ambition of vendors, who foist inappropriate, seemingly arbitrary technology on users and make it very rigid on purpose.
An overlooked aspect of technology is fidelity, in other words "Do what I want".
Computers and robots are characterised in culture as faithful servants. If they are not tools to do our bidding then they are servants of someone else who seeks to control us. So-called "smart" devices notoriously do what they want, not what the user wants, and as "AI" is added to more products we will experience more unexpected and treacherous behaviour.
All kinds of UI, including "AI", sit between the user and the actual computer, interpreting simplified commands and hiding complexity. This will always trade off against fidelity. The ability to instruct a computer to do exactly what you want is more or less gone from major-brand products like Microsoft, Apple and Google. Such products are reduced to pre-set buttons (apps) that do things like "take a photo", "pay a bill", "send a message".
A question is, "Is a device that does what it wants secure and trustworthy?"
What happens when you decide you don't want it to do some thing any more, or to change how it does it? Do you have control, as you would expect, over your own computing device?
The ability to disable features is highly desirable if not essential. The problem with many products isn't that they don't do enough. It's that they don't do the things people want. They lack fidelity.
However, manufacturers ensure inflexibility by imposing restrictive software, often under a false pretext of security. In a nutshell, computer security is more about the things that you don't do, so being able to pare down system functionality has a high positive impact on security.
Some manufacturers and tyrannical governments want devices that do not allow user choice and configuration. Their features are not for the benefit of the end user, but in service of deals they have cut with "partners" in advertising, data harvesting and surveillance. Therefore software and hardware vendors lobby against Software Freedom.
Software Freedom also guarantees you can add features to your own devices. Your rights to run whatever software you like on devices you own is as fundamental as freedoms of your body, mind and enjoyment of your home. Some nefarious people think otherwise and would take that right from you if you let them. They want to control the only shop that you can buy from.
People use freedom-respecting software because it allows them to opt-out of imposed functions. Democratic governments must support these sorts of basic digital rights by supporting Software Freedom if they value long-term political ideals.
Moreover, freedom permits security. The freedom to inspect and fully control your devices is what makes your security possible. Anyone who claims they need to 'lock-down' your devices to make you safe is a serpent.
Impermanence
"It's never the same from one day to the next"
Tech is always changing. Way, way too much! Whereas tech makers claim this as "progress", in reality other factors operate, such as competitive pressure to launch products hastily, and reluctance to maintain and support goods in order to create new sales.
In technology, like the theatre, there are only a handful of basic stories. Computers can send messages, edit text and pictures, make calculations, store data and a few other things. All of the applications ("apps") you see are ever-changing combinations and variations on these themes. A lot of functionality is contrived: converting media from one format to another, elaborate authentication and log-in schemes.
Companies make money by endlessly rearranging the chairs. The appearance of things changes, the colour schemes, the layouts, the menu structures. Poor security becomes a feature if you need to convince the user that you're supplying "value" through endless updates.
Solving impermanence
There is no cure for change.
However, mandatory long-term support, open standards and inclusivity of older tech are a massive step toward solving this. The real lifespan of most technology is 40 or 50 years if treated well; only the physical decay of electronics is inevitable.
But our obsession with re-inventing the same thing every 18 months, and designing systems to break deliberately, or even sabotaging them out of commercial greed, must stop. If only for the sake of the planet.
We cannot endure a technological society in which half of the people are forever "left behind and catching-up". It cannot be remedied by forcing people to adapt more quickly, or by forcing ever changing regimes of updates and security policy on people. It must be fixed at the root by making creators of technology responsible for its full lifecycle (potentially many decades).
The only practical ways to do this are:
- revolutionising software engineering so that companies must build products right before they're allowed to release
- ensuring end-users have the means to take responsibility and fix things themselves if they choose
Much of this is already a solved problem if governments adopt and enshrine as laws the principles of Free Open Source Software, giving:
- proper, full documentation of devices
- modular, easy-repair designs
- spare parts availability
- full source code and ability to re-program devices
- complete control over access permissions (no backdoors)
These issues are addressed under various "Right To Repair" movements which are now spreading across the globe and yielding concrete regulation and legislation.
Governments can help by regulating interoperable standards, modular design with repairability in mind, standardised power and communications interfaces. Organisations like ISO, NATO, EU play a central role here.
The days when a company like Apple can remotely sabotage a device must be consigned to the scrapheap of history. "Updates", even with tacit consent, are not a good enough reason to void the basis of anti-hacking laws like the Computer Misuse Act. Apple or Google hacking my device is no more acceptable than me hacking into those companies to inflict my will, criminal or otherwise.
Trust
"You can't trust anyone nowadays"
Example: Banks authenticate customers. Customers don't authenticate banks. They should.
We all get very confused about trust - especially managing degrees of trust, and changing our minds about it. Identity is not the same as trust, but it helps. Trust and anonymity are quite compatible if the incentives are right. Trust is not transitive: Alice may not trust Charlie just because Bob trusts Charlie and she trusts Bob. There is so much to say about trust, but we never get a formal education to guide us in these things.
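The non-transitivity point can be made concrete with a toy sketch. This models trust as a plain directed relation between parties and shows that assuming transitivity quietly manufactures trust that nobody actually granted. The names and the model are purely illustrative, not a real trust framework:

```python
# Toy model: trust as a directed relation between named parties.
# Purely illustrative - real trust systems are far richer than this.

trust = {("Alice", "Bob"), ("Bob", "Charlie")}

def trusts(relation, a, b):
    """Direct trust only: a trusts b iff the pair is in the relation."""
    return (a, b) in relation

def transitive_closure(relation):
    """What the relation would look like IF trust were transitive."""
    closure = set(relation)
    while True:
        extra = {(a, d) for (a, b) in closure
                        for (c, d) in closure if b == c}
        if extra <= closure:
            return closure
        closure |= extra

# Alice trusts Bob, and Bob trusts Charlie...
assert trusts(trust, "Alice", "Bob")
assert trusts(trust, "Bob", "Charlie")
# ...but Alice has granted no trust to Charlie:
assert not trusts(trust, "Alice", "Charlie")
# Wrongly assuming transitivity invents that trust out of thin air:
assert ("Alice", "Charlie") in transitive_closure(trust)
```

Real reputation and certification systems have to decide explicitly how far trust may propagate; treating propagation as automatic is precisely the error.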
Trust applies to groups and whole nations as much as to individuals. Norway is considered a "high trust" society, where people can leave their shops unattended with an honesty box. A low trust society is weak, hostile and costly for everyone.
In Britain we have become a low-trust society. People are not taught how to evaluate and establish trust. They should be, and they are not well advised. Almost nobody wants to step into that debate, because to comment on trust is a quietly but highly political act. The closest we get is the occasional advice given by the Metropolitan Police (Fraud Squad), the NCA or the NCSC, and all of these organisations make mistakes despite doing their best.
A hindrance to reliable and secure computing is suppression of information. Security professionals publish and share knowledge of vulnerabilities to fix things quickly. But manufacturers and bad governments don't like that. For many and complex reasons they like "consumers" to be kept in the dark so as not to "lose confidence" in the technological programme. They don't want the public to get "too clever", because as the "Arab Spring" in the golden era of democratic technology showed, an empowered population can take power from the established order in rapid and unexpected ways. A few walled-in social media monopolies suit governments better than distributed and devolved technologies like email and the web, because bad governments can impose collective punishments, simplify bulk surveillance and enforce blanket policing of speech.
All major security revelations follow a similar lifecycle. To start with, those who raise the alarm are attacked and labelled as mischievous or paranoid. Yet almost every "paranoid delusion" turns out to be true, and worse than imagined. Manufacturers apologise (doing damage limitation and burying the story quickly as old news) and the cycle begins again. This "patch and pray" culture has existed for over 50 years and seems unlikely to be replaced by any revolution in software quality.
Sadly this means not only can we not trust our devices, we cannot trust the information and advice we're given about them. We need to shop around for alternative views and experts.
As a rule our technological society makes individuals highly legible, through platform accounts, location tracking, KYC and payment tracking, face-recognition, tying people to devices, constant surveillance… This makes people very vulnerable. At the same time large companies keep themselves and their officers invisible and immune, whether by a stonewall media perimeter against the public, shell companies in offshore tax havens, systems of deflection and PR, unfair (and often illegal) terms of service.
This balance of power needs readjustment. It's a process we call Civic Cybersecurity.
It's becoming common sense that you "just can't trust people in tech". Everything about it is becoming seedy and scuzzy. For those of us who've devoted our lives to working in tech that's a painful mark to carry. Every time a BigTech company is in the news for its latest blunder, it's a slur and an embarrassment to regular engineers and computer scientists.
In Europe, Apple, Google, Microsoft, Amazon, Facebook… are hardly celebrated as amazing companies that bring value. That was 2005. Twenty years on they are tolerated as unfortunate necessities at best; at worst they are quite despised. Our advice from a cybersecurity standpoint is to avoid using any of these US-based companies so far as is possible. But it is no use if they are simply replaced by European or British versions of digital bandits and warlords. We need to establish fundamentally different laws on this side of the Atlantic for companies that provide digital services to society.
Fixing trust
One does not fix trust, one builds it.
In most cases, trusted entities that have defected (lost trust) cannot ever recover it. New systems need building around the ruins. It is vital for governments to provide stimulus for competition in these areas and to promote new technologies that disrupt the status quo in favour of smaller players. We need:
- new and secure software distribution platforms that are independent of big vendors.
- a revolution in the "trusted software" industry to diversify and stimulate new authentication schemes, distribution of trust, new models of authority and certification that take power away from BigTech.
- restoring control and choice around technology to the citizen and ensuring charters and rights protect personal devices and personal data.
Receptivity
"Nobody listens to us. It's all about profit"
Companies and organisations, including banks and politicians, don't listen.
- They already have an agenda (and they want tech to fit it)
- They lack critical knowledge but pretend otherwise
- They lack respect for people's struggles with technology
Dismissive attitudes are common. You are "an outlier". You are told "It works for everybody else".
Various sorts of tech "gaslighting" and marginalisation are well documented. For example, the UK Post Office "Horizon" scandal was marked by a consistent disinformation and propaganda campaign against the victims (sub-postmasters): "Nobody else has this problem".
Companies like Google and Facebook are notorious for abysmal customer support, ghosting and stonewalling customers.
In general there are huge problems of unaccountability around almost all areas of digital tech.
Fixing Receptivity
If people don't listen to you then maybe:
- you're not important to them
- your language is not strong enough
- you're talking to the wrong people
It is said that you can never convince someone whose salary depends on believing otherwise. It is not in the profit interests of BigTech to listen to people. They "know better than you".
The idea that if we pay for a product like software or devices then its suppliers should listen to us is history. Belief that "the customer is always right" vanished at least 30 years ago. The customer of digital technology is caught in a pincer movement between necessity and crumbling quality, while remaining largely ignorant of what drives either. The only practical ways out are:
- move and find a new space to build fresh technology
- do it yourself, take control
- remove necessity/dependency
- use force/coercion (government) to increase quality
The answer then is a mix of all these strategies: choosing different entities, different kinds of people with whom to do business about technology; using code and systems that restore self-control and give you a bigger say in how things work; widening or bypassing the choke points used to force use of defective or treacherous technologies; and lobbying, recruiting and shaping the political process to bring BigTech to heel.
The alternative is community. We forget that the political philosopher Alexis de Tocqueville identified three distinct aspects to society: not just government and business, which were minority powers, but the huge body of civic society that originally made up American and British culture. Today Free Open Source Software accounts for the majority of all software working in the world.
It's amazing that the software we all use is mostly created by citizens, your peers, communities of unpaid people around the world. The software revolution that put it in the hands of the people already happened.
It's just that:
- The message didn't get out, and as much as it can BigTech suppresses it.
- BigTech companies steal it anyway and then control what they put their name to.
- Governments and other industries help keep BigTech in power.
Commercial software got between the makers and end users. For example, Apple's operating system is great, but it's derived from a completely free system called BSD. Likewise Google's operating system is based on Linux, started and distributed for free by Finnish programmer Linus Torvalds. You can still obtain and install both of those systems (BSD and Linux) on your devices for free. They are arguably better than the branded alternatives. In doing so you get to talk directly to the developers. You can sponsor them with money. You can ask for features, or if you're skilled, contribute directly to making the software people need.
Confusion
"I'm not really sure what I want"
- Nobody has ever asked
- Nobody has ever really explained
In the 1980s "Digital Literacy" was a core policy of governments launching the "information age". Today digital literacy and education is an abandoned project. Markets do not work to educate but rather thrive on ignorance. People are expected to take up whatever is put in front of them and adapt to it "intuitively". There is no political mechanism for expressing technological demand and preference. There is very little in the way of technological planning or policy at all!
We have rather lost our way with technology. Push people a bit and they'll say it's there "to cure cancer" and "make things more efficient". But we don't really have a game-plan about technology. We let tech shape our values rather than designing tech to serve our values.
There is no realism in tech. We are caught between a world of wishful thinking and grim truths. There is little space for grey areas, moderate thinking, subtle argument. Zealotry and secular religiosity are the norm.
There is no mechanism for checks and balances on extreme claims made about technology. We suffer all manner of specious claims about the security of products, green-washing about replacing paper receipts (around which there is no scientific consensus), the endless claims about "AI". These issues are too subtle to be dealt with by Advertising Standards, but too politically important to be ignored.
Fixing confusion
The age-old antidote to ignorance is education. All democratic political thinkers fall back on "education, education!" eventually. However, it is no longer enough.
We must simultaneously attack censorship and disinformation. Tech giants increasingly control the message platforms for news and communication. They suppress criticism and spread distortion. They increasingly control the technology used in our schools. Ridding schools of Microsoft, Google, and other "edutech" powers with political interests is as urgent as getting smartphones out of schools and should be a priority of any democratic government with a long-term plan about technology.
Really we need a new age of digital literacy. One that is tailored for the hazards and opportunities of this century. We should teach cybersecurity to kids as young as 5 in order to shape their trust expectations and critical faculties.
Technological labour
"It's all so much faff and work"
For lots of people it's just "all too much bother". Technology is supposed to make tasks easier, but once layers of authentication and security ritual are added, key-cards, multi-factor, biometrics, secure portals… many people just put off interaction.
The technological society imposes labour on citizens, but offers no support or remuneration. You are "expected" to know how this or that thing works. You are held responsible for its faults, and punished by exclusion and belittlement. Burdens include:
- purchasing and maintaining devices
- installing software you've no reason to trust
- relying on services and platforms that are:
- expensive and for-profit
- privacy violating
- insecure and untrustworthy themselves
- unreliable
- opaque
- remembering too much (cognitive overload)
- exposed to indignity, intrusion, coercive control
Some important questions:
- Does the imposition of digital technologies now outweigh their utility?
- How are users empowered to fight back and define technology on their own terms? ("Markets" are not a viable solution.)
- What can governments do to mandate fairness, choice and flexibility? (And what happens when governments are the problem?)
Fixing Labour
It's not clear that we ought to "fix" the effort needed for technology. While one should attempt to minimise bureaucracy, and the UK, like Finland and Estonia, has world-leading e-governance, a certain amount of effort, attention and responsibility goes with the benefits of a technological society.
The commercial "cult of convenience" (which actually leads to more work and anxiety) has a lot to answer for here. Technology allows work to expand to meet its possibilities, and wherever that is profitable it will happen. The problem is with what David Graeber called "bullshit jobs". There's an enormous amount of makework, unnecessary digital toil that consumes billions of human-hours of labour. Much of this is ostensibly created to foil fraud and crime but has lost its meaning and become arcane ritual. As explained above, it has turned inward, against the users themselves. This worsens as companies are allowed to unload work onto end users instead of helping them eliminate it. We all now work for Apple or Google, against our own interests of freedom.
It's our opinion that users should be encouraged to take responsibility for their own systems, including (especially) their own security, with help from knowledgeable and benign authority. Software Freedom in concert with high interoperability and benevolent advice allows people to complete their work and use devices to live their lives in their own way. It also stimulates markets and new tech innovation.
So long as civic software is created by large for-profit companies it will always tend toward drudge and toil, because there is no profit in setting people free from digital labour. "AI" will only mutate that labour, intensify it and bring ordinary people ever more into the service of BigTech.
Isolation (and support)
"Don't know who to ask for help"
Knowing who to ask for help is important in life. It's part of genuine community. Not feeling alone and isolated is knowing that other people care and share similar problems.
Many people essentially "lack the capacity" to deal with the complexities of modern tech. Disability, reduced faculties, low IQ, learning difficulties, language barriers and age all account for an enormous spread of capacity.
Once, every neighbourhood had a "computer guy". About a third of all people rely on a close relative or friend to administer their digital affairs. They do so in an informal way, without any legal deed of attorney. The idea that people "never share credentials" is, in reality, quite laughable.
This breaks the "individualist" ideal model maintained by organisations, governments and banks.
Real technology is a network (parents, children, brothers, neighbours), but:
- Old networks have disappeared (phones, post-office).
- Older people have smaller networks (distance, people die).
- Social media companies deliberately disrupt alternative social networks (sabotaging email, hyperlinks and interoperability)
In the real-world, to the extent it still exists, police and civic leaders are:
- underfunded, over-stretched
- not trained/educated in cyber
- taken in by the dominant techno-Utopian narrative
Who can help?
- Charities?
- NCSC
- Boudica (civic cybersecurity)
Bullying and tech-shaming
"That's obsolete!"
We've never had so many words to use against adequate, working, serviceable technology. Today everything is:
- deprecated
- unsupported
- not best practice
- obsolete
- end of life
All of these terms mean nothing but someone's subjective point of view, or more likely the profitable policy of a tech company. To make something obsolete is to force the user to buy something new. Almost all "obsolete" software and hardware just keeps working. To thwart people using old technology, many companies now actively sabotage your property, which they can do by malicious updates or over a network connection.
Another tactic they use is relentless shaming and advertising that appeals to fears of "being left behind" or "being left out". As a computer scientist and engineer I take a different point of view. On the contrary, software that is old is reliable, well tested, and has fewer bugs. The operating system and text editor I use are 40 years old.
Our use of technology is not purely "rational" but is very much shaped by how we feel and how others make us feel about it.
Blaming a person who uses technology for the shortcomings of its design is a very old tradition in computer technology, going back to the nerdy arrogance of early "Help Desk" culture. Mark Zuckerberg famously called the users of Facebook "dumb fucks".
Our society strongly rejects racism, sexism, body-shaming, homophobia, mocking disability and so on. Yet cruel, dismissive and smugly superior attitudes to peoples' ability and appetite for technology are commonplace and even officially sanctioned.
Technological bullying is a complex and common harm in itself.
No technology is simply a neutral "tool" which a person may take or leave. Technologies represent distinct viewpoints on reality. They are inherently political. Each configuration of tech touches on one or another subtle agenda or hidden belief.
As an example: even though it looks after your money, your bank does not trust you, the customer. It expects you to authenticate in a one-sided trust model. It makes you do the work. It's assumed that you trust the bank. That made sense when banks were actual buildings (branches) with real people. Today your "bank" may be nothing more than a computer program running on a highly insecure device (all smartphones are untrustworthy). Yet you probably have no idea how to authenticate your bank; your bank probably has no method for doing this, and naturally resists any request to do so. A highly skewed power relation is built into the dominant technology.
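At the transport level, machinery for authenticating the other side does already exist: every TLS connection lets client software check that a server presents a certificate chaining to a trusted authority. Here is a minimal sketch using Python's standard ssl module (the hostname below is illustrative, not a real bank):

```python
import socket
import ssl

def check_server_identity(hostname: str, port: int = 443) -> dict:
    """Open a TLS connection and verify the server's certificate.

    create_default_context() loads the system's trusted CA store and
    enables hostname checking; wrap_socket() raises
    ssl.SSLCertVerificationError if verification fails.
    """
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            # Parsed certificate fields (subject, issuer, validity dates)
            return tls.getpeercert()

# Illustrative usage (hypothetical hostname):
# cert = check_server_identity("online.example-bank.co.uk")
# print(cert["subject"], cert["notAfter"])
```

The gap described above sits higher up: there is no comparably simple, human-usable ritual by which a customer can verify that the organisation behind the certificate is actually behaving as their bank.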
Therefore we cannot simply entreat technologists to be more compassionate or sensitive, or admonish techno-bullies, because their motives are not the simple "hate" or "identity" that lie behind common prejudice. The ad hominem (blame the person) response is not an error of perception; it's deliberate. Its aim is to undermine the user's confidence in their own mastery and judgement, in order to then step in and provide the "correct solution".
Technology is so divisive that, far beyond the common commercial tussles for dominance and profit, there is a religious quality at play. The philosopher John Gray sees digital technology as being like any historic religion, such as Judeo-Christianity or Islam: it is our "secular humanist religion". This leaves us actual computer scientists in a very uncomfortable position.
Accusations you'll commonly face: you are:
- Old
- Obsolete
- Out of touch
- In need of help
- Left behind
- A "Luddite"
Nobody but a fool says they fully trust any technology or system because they completely understand it. Even the greatest experts lack such breadth of knowledge.
The capacity to be critical and sceptical about technology is a rare superpower. Only the best security people have it. Those civilians who do are quickly marginalised by an almost instinctive group-think.
"Luddite" is a word commonly applied to critical thinkers about technology. The "Luddite" is the intelligent and reflective person who is cautious and concerned with a technology because she understands its power and consequences. Others simply go along, exhibiting trust without thought, attention or verification. The "Luddite" is invariably the more technologically intellectual of the two.
Pejorative use of the word Luddite frequently implies dissent from some "widely accepted norm". There is none. For a glimpse of why communications security is not a solved problem with any single "best practice", this paper on secure messages should convince you that there are dozens of trade-offs, each satisfying individual needs, personalities, circumstances and trust models. What applies to communications is typical of all other areas of computer technology. All companies want you to believe that their system alone is the proper one and that you'd be crazy to disagree or exercise another choice.
Fixing bullying
So long as there is power in technology there will be abuses of that power.
Anything that levels the power of the customer (user) against the power of the supplier or manufacturer is welcome at present. Of course that may one day change to be the other way around. Software Freedom goes a long way towards redressing this balance, however it is skewed.
We must establish the basic right of citizens to use technologies of their choosing, and to make new choices where no acceptable ones exist.
This requires a sea-change in how commercial companies and governments view technology as a civic and political force.
Governments, local authorities, schools and hospitals can lead by example by procuring only Free Software and ensuring that rights to interoperability, repair and digital self-determination are respected.
Morality
" I don't think it's right that…"
Digital technology is an ethical battleground.
People have cares beyond utility. Vendors like to bracket these off as "moral" concerns and apply the solution of "Public Relations" (PR). However, these concerns are often more important than operational facts, and so people are not assuaged. They include:
- corruption and greed
- monopoly, massive profits and inequality
- ties between politics and tech
- government snooping and "surveillance capitalism"
- health effects
- cognitive (positive and negative)
- emotional (lifts and stresses)
- physical (posture, exercise, eyesight)
- social effects
- dehumanisation
- local community erosion, division
- loneliness, separation
- cultural
- impact on arts and sciences
- bro culture, technofascism
- anomie, alienation
- disruption, insecurity
- environment
- obsolescence, e-waste, recycling, reuse
- energy consumption
- sustainability, resource depletion
- human rights
- child labour, slave labour
- inequality imposed by technology
- privacy and peace
We are only just starting to see public realisation of this, through issues like mental health and smartphone use, alienation and unemployment, new kinds of terrorism, crime and threats to civic peace. The public conscience lags the curve by about two decades, and it has few if any guiding principles.
Fixing tech ethics
Some years ago The Economist identified a phenomenon it dubbed the "Techlash": a subtle but widespread reversal of attitudes toward digital technology as an unquestioned benefit. I wrote a short book, Digital Vegan, observing the many harms I saw technology bringing to our lives and suggesting that better selectivity of software and hardware should be a priority.
Technology is politics. A future in which a vanishingly few billionaires decide the lives of everyone is neither desirable nor sustainable. It is likely a path to new and horrible forms of war (including sectarian civil war and gang crime) rather than any sort of Utopia.
In a sense we are forced to be free, in that we are forced to retake control over common tools of life. Failing to take political control of technology may be the surest way of dooming us to have none, of regressing to a post-technological world.
It is therefore those who love technology the most and still cling to an optimistic vision (like Sir Tim Berners-Lee and myself) who should wish for the downfall of Microsoft, Google, Facebook and the rest of "Big Tech". The techno-fascist coup in the United States should be taken as a heads-up. It shows us the trajectory we are on, and the future we do not want but still have a chance to avoid.
As I wrote in Digital Vegan, the 'Arab Spring' was an anomaly: we cannot use the tools of social media and the internet for social change, because they are owned and controlled by the enemies we seek to remove.
Key to any progress is turning up the heat in traditional public affairs, in real forums and politics. Politicians (Members of Parliament) are among the few remaining folks who see the value of in-person visits, making house-calls, holding surgeries and consultations, paper communications, phone conversations, billboards, leaflets, and public speaking. They do so out of a belief in "inclusivity".
Most could just as easily, and cynically, focus their efforts on the 66 percent in the middle of the bell-curve. There are enough votes among normative digital dependants who never look up from their phones.
Politicians can abandon the young, old, disabled, different, dissident and neurodiverse, but the wise ones know they would be abandoning the future, wisdom, innovation, and the corrective checks and balances needed to keep society stable and on track. For that we must follow Crick in a defence of politics, and create no less than a vibrant new era of technological politics.
Join us
We'll next be at Anchorage Lodge Community Centre in Portsmouth on Wednesday the 24th of September 2025 from 10am to 12pm.