Bum education


Figure 1: "Got a bum education, double-digit inflation, can't take the train to the job, there's a strike at the station" – Grandmaster Flash And The Furious Five (Sumner, Barnes, Jones, Olivier, Miller)

The pitiful situation at Staffordshire University upset me a bit this week. Can a university sink lower than replacing cybersecurity professors with "AI"? I'm sure they're working on that right now. The results will be a disaster for us all.

I've not written much on higher education in a while, partly because I've served my time in the trenches defending education from "professional managers" and "policy makers" bent on sabotage. Despite the kindness and patience of my editor Paul at the Times Higher I think we both realised "Outraged of Academia" was a stuck record.

And that was before "AI".

The narrative of "progress"

I've been writing about this sort of thing for some time. But words have limits. Even when people can absorb depressing news we have scant capacity for effecting change, even self-change.

Negativity about technology is a recent arrival in my life. My generation are natural born tech optimists. When we were children, men walked on the Moon. The Internet was created. We are a generation filled with big ideas and big hopes.

Half a century passed, and we arrived in 'the future', a shabby and miserable version of what everyone expected. Like unboxing a defective toy at Christmas, it is not the same toy that smiling kid was playing with in the TV advert, just a disappointingly smaller plastic version with some of the bits missing. Then comes the sinking realisation. We didn't really want a Captain Magneto Danger Rocket anyway. We were only worried the brat next door might get one.

Don't we always frame technology this way? As an "arms race". To be "competitive"? A competition for what? To be the first to drown in our own waste? Doesn't this disappointment serve us right, for being greedy, envious, unoriginal imitators? Or does some blame lie with sly manipulators who sold us fluff and empty dreams? The face of a disappointed child is the mental image I have every time I hear the word "AI", spoken in that cocksure, square-jawed American twang. But even today's manipulative evil geniuses are less than expected. Less interesting, flamboyant, and good-looking. The dull-witted "tech bros" we got are unexceptional and ugly specimens in "Ming The Merciless" cosplay.

Technology as spiralling mass hysteria has the unsettling potential to draw even rational sceptics like myself into disaffection. One must not merely reject duff technology, but also those other people who have been hypnotised and now see anyone reluctant to join them as a threat. Nothing angers the pathological neophyte more than a rejection of his "solutions". So "AI" is splitting society. The typical non-thinker sees only two options, those who are "racing to embrace the future", and sour Luddites living in woodland shacks who will be "left behind".

This causes the most reasonable voices to be shut down, or simply drowned out in the noise. Voices like those of psychologists and education experts who point to the massive harms already emerging from the use of "AI". The problem with the "narrative" of progress is that there is none, just an undignified clamour like a Black Friday feeding frenzy.

Thus I am compelled to write more on this topic, because if anything we experts are not nearly vocal enough at this very serious juncture.

Looking back on the past ten years in print I think THE and other publications have been as critically honest as they could be, and all of my own bugbears have proven valid. Each day I read something that amplifies things I wrote years ago. The myth that "every generation feels this way as they grow older" has served to silence alert minds. On the contrary, when I was a child older people were extremely enthusiastic about technology, and that is why I eagerly learned from them. The phenomenon of stubbornly wilful ignorance is a very recent thing.

The Atlantic piled on today, describing how universities are "Preparing to Self-Lobotomise". The US Ivy League is in a delusional frenzy to shove "AI" into everything, a "dangerously hasty and uninformed response to technology". Those "making decisions" (they aren't, they're doing what they're told) seem so detached from reality.

Even though I sniffed at reviewing Gerry McGovern's latest book 99th Day: A Warning About Technology - because I don't want to hear more about the end-of-everything - it's hard not to think this portends a genuine risk of civilisation collapse within the next decades. Yet the end will come simply enough, not because of climate, disease, hackers or cyberwar, but because we'll all be too stupid to operate or fix any of our technology. The fruits of two centuries since the Industrial Revolution will just die with a whimper in our stupefied, mildly amused hands.

What do we tell the kids?

The new programme of digital literacy we so desperately need is quite the opposite of what's being pushed in schools and universities now. Across the Western world young people are being misled by idiots who have no idea how to use technology. In urging restraint, Justin Reich, director of the MIT Teaching Systems Lab, seems to be the last sane person in academia. Instead of brainwashing young people into "embracing" (shudder) "AI", we need to be teaching proper scepticism and critical thinking.

So this is not really about "AI". It's about what we teach. Frequently I meet ex-academics who are incensed at what happened to our profession. Many like me devoted their lives to teaching, only to find themselves in a world that no longer values it. The extraordinary descent of academia is absolutely real but not yet understood by parents who might still believe that sending their children to university would be a good thing.

Perhaps because we set our standards too high, alienation seems to disproportionately affect the best teachers. As with John Taylor Gatto or Sir Ken Robinson, it is the passionate, innovative teachers who are most exposed to hostility. Those capable of maintaining quiet mediocrity, mechanically deferring to Turnitin or ChatGPT, or giving lip-service to some unhinged government policy are safe.

Surviving professors are those who can keep their mouths shut through online PowerPoint "meetings" about obscene and inhuman policies to disenfranchise students. For those of us naive enough to care about education the rewards are intolerable workplace conditions, bullying, lack of respect, and zero prospect of career advancement.

Worryingly, those who remain in academia show clear signs of the ill effects. They have a flinching air of victims of violence, their tone is deflated, browbeaten, mumbling, a prevaricating mess as they umm and err about any "uncomfortable" (important) topic their so-called "bosses" might disapprove of. These experts should be lighting the way - not cowering in fear that an "inappropriate" comment about Google or Meta might hurt their research grant money. I see a profound and deeply troubling implosion of self-confidence amongst those who we rely on as "thought leaders".

Who would it serve to destroy the universities?

Vocal critics of disintegrating institutions are not enemies or outsiders, but those who most love and identify with what is being lost. Edward Snowden did not walk out of the NSA with his Rubik's Cube because he wanted to be a celebrity, but because he loved the intelligence services and saw them descending into unconstitutional criminality. Some might argue that was naive - "of course the intelligence services act criminally".

We're losing the universities, it's that simple. To a good approximation we've already lost the spirit of the university, as a place of free-thinking enquiry, disputation and intellectual honesty. What remains is buildings, brands and "digitalized" slop. Would anyone argue that it's naive to suppose the universities value knowledge? Would anyone say "of course the universities would throw teaching and research under a bus to flatter the faux 'progress' of the tech oligarchs"? There are no "whistleblowers" in academia because the criminal disassembly is happening in public view.

When all the articles in what are effectively mainstream professional forums denounce higher education, complaining that teaching is in precipitous decline, that students no longer value college education, that research is fraudulent, and when the only upbeat prose comes from paid puff-pieces by Edutech companies, everyone knows something's gone wonky.

The smart ones got out long before me. I held out like the loyal soldier, desperately clinging to hope that higher education would turn around and my patient defence of human values would be vindicated. In the end it is only the students who hold any power.

I tried to mobilise my students by giving them all "A grades" to point out that the assessment regime, amidst inhuman automation, cheapskate cuts to student support, widespread cheating and abject hypocrisy of the faculty, made the whole course a sham. It made them happy, but bless them, they were all too timid and disorganised to really use that cue to mount an effective protest. The students at Staffs seem to have a bit more chutzpah.

So I say with a torn heart but great confidence that the problem at Staffordshire is quite typical, and has nothing to do with "AI". It's that higher education has become intellectually and morally bankrupt. It would be unfair to single out Staffordshire, a mere symptom of deep systemic decay. UK university education is now little more than an extractive racket, having abandoned its social principles. Capitalists took the engine of Enlightenment and turned it into a cheap jingle-jangle money machine.

We can't teach cyber

Cybersecurity is a vast and complex subject. To cover it in less than five years is a real challenge. Where do you even begin when the basic entry requirement is already a good bachelor's degree in computer science and mathematics?

Let's get one thing clear - "AI" and cybersecurity are in direct conflict. Everything about (language model) "AI" is making computer security worse by subtracting quality, clarity and accountability. It favours attacks against humans in phishing scams while offering little on the defensive side. Even commercial developers of automated security products I've spoken to are retreating from using "AI" as a marketing term. In reality they use tried and tested deterministic and statistical algorithms like decision-trees, clustering and multi-dimensional analysis, but feel bullied into adding the word "AI" to their sales pitch. It is therefore an affront to the foundational values of cybersecurity students to push this pop nonsense on them.
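
To make that concrete, here is a minimal, hypothetical sketch of the sort of plain statistical anomaly detection such products actually rely on. Nothing in it comes from any particular vendor; the login features and numbers are invented, and scikit-learn's IsolationForest stands in for whatever tree or clustering method a real tool might use.

  # A hypothetical sketch of classical anomaly detection on login telemetry,
  # the kind of well-understood statistical method often rebadged as "AI".
  import numpy as np
  from sklearn.ensemble import IsolationForest

  rng = np.random.default_rng(seed=42)

  # Synthetic "normal" logins: [hour_of_day, kilobytes_transferred, failed_attempts]
  normal = np.column_stack([
      rng.normal(13, 3, 500),    # sessions cluster around office hours
      rng.normal(200, 50, 500),  # typical transfer sizes
      rng.poisson(0.2, 500),     # the occasional mistyped password
  ])

  # A few contrived suspicious events: 3 a.m. logins, huge transfers, many failures
  suspicious = np.array([
      [3, 5000, 0],
      [2, 150, 12],
      [4, 8000, 6],
  ])

  model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

  # Scores below zero fall outside the learned envelope of normal behaviour
  for event, score in zip(suspicious, model.decision_function(suspicious)):
      flag = "ANOMALY" if score < 0 else "ok"
      print(f"event={event} score={score:+.3f} -> {flag}")

Run it and the contrived events score as clear outliers. No neural network, no language model, no magic.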

Cyber defence is intellectually and ethically demanding, encompassing organisational dynamics, human psychology, electronic and software engineering, warfare, espionage, information theory, criminology and political science. Very few people are capable of teaching cybersecurity and those who learn it are thrown into a new world. Some people handle that knowledge rather badly and when faced with the reality of non-existent job markets, with deluded bosses hell-bent on replacing cybersecurity people with "AI", they turn their knowledge to less savoury or outright criminal pursuits.

"And who can blame them?!", I am forced to concede. When self determination and basic human dignity is criminalised, we are all criminals, or worse. Should I give a second thought to those students I trained who were clear and unapologetic about their plan to return to Nigeria, India, or Eastern Europe and become scam artists? Western society sold its own security for a few wretched pennies.

Technological society is massively over-extended beyond its intellectual capacity to defend itself. That is absolutely our own doing, and it is the result of a conscious collusion between states and corporations to sell off education and weaken the knowledge and power of the technological citizen in order to transfer power to a handful of monopolies. Now we will reap what we've sown.

The cybersecurity education "market" is filled with watered-down apprenticeships and fast-track courses from those who claim they can teach it in a few months. They promise "new careers". This is nonsense and the students they turn out will only make things worse. Such programmes are dangerous and do more harm than good.

The other side of the market is "naughty step" training to discipline and punish employees, as compliance ritual, or as misguided and sometimes sadistic bullying by IT departments.

Neither of these approaches does anything much to improve computer security.

The relegation of human intelligence under the guise of efficiency is really a bid by the unintelligent but rapacious to insert machine "intelligence" as a control layer. Perpetual insecurity, confusion and deferential appeal to authority is a feature, not a bug in this project.

These misguided projects serve a dishonest agenda of trying to patch up and give the appearance of doing something amidst a cybercrime epidemic, the root causes of which are:

  • truly abysmal software quality
  • global technical debt measured in trillions
  • a terrifying shortage of people who know what they're doing
  • a population enchanted by magic and "convenience"
  • oversized predatory tech monopolies
  • government policy that amplifies and colludes with tech monopoly and which marginalises and punishes morality and care
  • over-reaching, ill-conceived social projects that are beyond government capability to safely roll out ("track and trace", NHS apps, "digital ID", etc)
  • too many corrupt relations with US BigTech
  • systematic state attacks - foreign and domestic - on the whole project of cybersecurity
  • obscene global economic inequality, exclusion and domination

Besides, who - other than those of us who are passionate believers in the centrality of education to sustainable civilised society - would work for a fraction of our potential salary?

University professors have become dispossessed gig workers, grubbing along with apps and casual contract portals that feed on desperation and prey on social conscience. We've had our sense of care weaponised against us. We've had our identity as teachers stripped away.

Who but those who see the extraordinary urgency of what we teach would care so much? We are easy victims.

And now you know why there are "AI" teachers at Staffordshire University… because like everywhere else the real ones burned out years ago.

Fall of the faculty

So many accounts of the decline of Western education can be brought forth from all sides of the political spectrum: Ben Ginsberg's "Fall of the Faculty", Wendy Brown's "Undoing the Demos", Allan Bloom's "Closing of the American Mind", and many, many more. Whether you blame bureaucracy, capital, post-modern culture, technology or whatever, all I can say is that I've personally witnessed the decline over about 40 years and would sum it up as a profound change in "atmosphere":

We transformed universities from communities of knowledge filled with enthusiastic learners into zombie shopping malls, nasty, barren, anti-intellectual corporate warehouses filled with burned-out neurotic power junkies and "led" by lost, morally vacant husks. The cultural position of the university has changed so radically from the rigorous, humanist liberation pedagogy I enjoyed at UCL in the 1980s - suited to intrinsically motivated young people stretching their minds to the limits of understanding and creativity - into soulless degree factories pumping automated propaganda into terrified kids to make them docile and unthinking corporate slaves, all while sucking as much money out of their family and future as possible. The modern UK university is an unrecognisable shadow of the values I associate with education. It's an embarrassment and a horror show.

Besides, university pay is disgraceful in the UK, like all of the public sector. Working conditions for teachers are appalling. People with our in-demand skills can't afford to teach and so end up working for companies like Google and Facebook, which only churn out more substandard technological trash, making everyone less safe.

Since 2013 I've been working on cybersecurity syllabus variations aimed at different sorts of cohorts. One syllabus is founded in practical computation, beginning with microprocessors, networks and memory. Another is grounded in logic, discrete maths and game theory which approaches information security in a formal way. Yet another syllabus comes first from psychology, social and political sciences, and so on… there is no one correct way to teach cybersecurity. Most of this work will be lost forever, because "thinking about things too much" (as one impostor deanlet put it) is not what the order needs. I am done. I am still shocked at how anti-intellectual the universities have become.

Most of the cheap courses are thinly veiled sales pitches for commercial products by companies like Microsoft and Cisco. At best they are commonplace configuration guides for proprietary cruft. They are Mr. Fox's Guide to Hen-house Security.

After Snowden's leaks in 2013 it felt clear that the battle for Western liberal democracy would be fought in cyberspace and against elements within our own society. The computer industry had gained a whole new parasitical layer of cyber-arms dealers selling access for commercial espionage, voter influence, obtaining military secrets, and suppression of dissident speech. The whole subject of cybersecurity is really a struggle around who gets to define what computer "security" is, who it is really for, and to what end it operates. At present it is the monopoly industries of Big Tech and the worst elements of authoritarian government who have the upper hand, not the people who are their victims.

Outside a few prestigious courses at Cambridge, Edinburgh and some Russell Group schools in London, few places taught "cybersecurity" in 2013. As a proofreader I was fortunate enough to make friends with Prof. Ross Anderson at Cambridge. He was someone who had a profound positive impact on me and everyone whose lives he touched. Amongst the many things I learned from him was that security engineering is not a purely technical field so much as a struggle for privacy, dignity and autonomy by ordinary people. Psychological and social security are as important as any firewall rule or virus checker. I don't think it's unfair to say that Cambridge found him an irritant. COVID got him before forced retirement did.

I've experienced similar hostility at places I taught. There is a concerted effort to push a dumbed-down, half-baked and deliberately insufficient model of computer security. It's why I quit the university system to focus on real cybersecurity, real self-development, research and full-time writing. It is the same way I obtained my PhD, by working entirely outside the broken university system, with the help of faith and the generosity of others, and paying close attention to what is happening in reality.

Everything you need to know about the truth behind cybersecurity education is that giving people empowering knowledge to defend themselves does not sit well with dominant groups. The "insecurity industry" is huge and fast growing. It operates beyond international laws. Clearly, if people knew real computer security they could no longer be sold snake-oil and empty promises, and their awareness of back-doors, surveillance, sneaky data exfiltration and global industrial espionage networks is not "good for business as usual". A natural and justified loss of confidence in the "smartphone convenience" lifestyle would hurt too many established interests. In fact all that is occurring is wealth transfer from the vulnerable to those with digital privilege and power.

Right now regular people are becoming aware of what the impact looks like. Co-op, Marks and Spencer, Heathrow, Land Rover… and what the costs look like. The Land Rover attack cost more than a few fully-equipped major city hospitals. It would be jolly naive to solely blame "cybercriminals" for the state of Land Rover when we've absolutely engineered the situation leading to it, not just by neglect and economic ruin, but by systematically devaluing essential but inconvenient knowledge.

We can do it!

Effective cyber defence is absolutely feasible. It requires a lot of people, good people with long-term commitment to difficult learning and immense emotional fortitude. It is also a long-term social programme that begins with educating children about the technological world from the age of 5.

But such savvy, well informed and naturally sceptical citizens would not suit the status quo.

Our quick-fix culture of convenience doesn't cut it. Kentucky Fried education from watch-with-mother videos and "AI" rubbish absolutely will not deliver it. It needs to be home grown too, because you really cannot outsource trust. If I were in government right now I'd be a little bit worried. How much does the British government really care about the security of the British people?

That's why what's happening at universities like Staffs is a scandal. It's squandering precious opportunity in a time of need. Bush-league universities are scavengers in a no-margin market; they have no vision, courage, creativity or patriotic sense of identity or duty.

For those in employment, insufferable "cyber essentials" lunch-hour training comes in the form of chirpy "AI" generated slop. Training videos are tepid, uninspired, cargo-cult ritual and tickbox bingo. They are also ineffective.

Mainstream educators follow like a little dog behind the brass band of BigTech, licking up crumbs. Meanwhile in academia there is too much scattered specialism orthogonal to the task space, too much bean counting, and too much dick waving. And there's absolutely no leadership.

Accordingly, BigTech has aggressively slipped its tentacles further into the academy, not just in the guise of "research partnerships" and "technology transfer" (publicly subsidised research that benefits foreign corporations) but fully setting, and limiting, the anointed agendas. This stops anyone from getting a truly big-picture view of cyber. It is an enclosure of the narrative and an act of hostility by consumer communists who demand ignorance to pave their road to power.

To champion real cybersecurity one must not only care deeply about cybercrime and its effects on people and the economy, but one must also be ready to fight against fear, ignorance and greed, along a front that runs right through the centre of our society; against a predatory tech industry, against surveillance capitalism, against an education system that has abandoned hope of educating and against incompetent political figures who are at best opportunists out to make a quick buck or worse, fully in bed with our enemies. This is why I wrote cybersecurity is a resistance movement.

How did universities go so wrong?

Noam Chomsky correctly called out "academics and intellectuals" as cowards who are always the first to cuddle up to fascism. Make no mistake; "AI" is a powerful manifestation of fascism, as are many laws created in the name of security, which make us all less secure.

So long as academics can be beaten into the belief that any and all technology is "progress", they will willingly lay their values and life-work at its altar. Cleverer, tacit supporters of "AI" are not so gauche as to openly applaud or denounce it, but are firm in their silence and air of nonchalant pseudo-ignorance about any issues outside their own microcosm. For them it's "just a useful tool", as if such "useful tools" ever existed in a vacuum.

UK academics lost their nerve to defend educational principles and resist socially egregious research. Remember "Cambridge Analytica" is considered a scandal regarding the US company Facebook, but it is an absolutely British affair. All this happens within universities that spend millions on so-called "ethics" departments.

I've fought in this mud-filled trench too long, spending decades as the pariah, the lonely ridiculed conscience of the faculty, alienating myself, disadvantaging my "career" through hundreds of meetings of tedious corporate masturbation and apologetics about "saving money" or "building partnerships". I'm exhausted and shell-shocked. Love of money ruined British universities and everything they once stood for. I hope the vandals are happy now.

Growing a pair

Perhaps this question still hangs in the air: why do academics never stand up to their "bosses"? They feel permanently insecure - that they are impostors somehow getting away with it and cannot believe anyone would keep paying them for what they do. This carefully cultivated fear keeps academics docile and compliant. One's legitimacy, indeed (if one is not careful) one's whole identity, comes to rest on the blessing of the institution. Institutionalisation is a capture of attachment. Escape means breaking that attachment and seeing beyond the institution.

Writing in the 1980s, technological ecologist Ursula Franklin identified this capture undermining humanity and integrity as deliberately preparing the ground for technofascism. It is a logic that displaces moral human governance.

The fact is that what most academics do is immensely valuable. Actual human thinking. That's why it's attacked. We should all defend those who are most attacked as "useless" in areas like social science, law, history and politics. They are mocked as "woke" and "irrelevant", to be twisted and hammered into the Procrustean shape of "STEM". Yet Gould, critiquing C.P. Snow's fence-building, highlighted this petty squabble as corrosive to scientific progress in general. Today, such blindness has become an existential threat.

Marginal academics may well be the last hope for our crumbling humanity. Yet I have rarely met an academic with the courage to defend their principles against technology, even as technology experts. I feel a huge sense of disappointment at these colleagues. University leadership - I choose my words carefully - offends me to the core of my being, because it is so very fake, duplicitous, and lost. Effete, self-interested, directionless power without responsibility masquerades as "public" interest.

In academia, intelligent people who know better choose to be morally ambiguous and treacherous, shrugging off the sacred principles and weighty responsibilities they've been entrusted with. Self-advancement and survival politics is the only game.

One of their key errors of moral reasoning is "If I don't do this awful thing, someone else will".

If that is the case then your standing, your reputation, and all those impressive publications amount to nothing.

Knowledge is indeed a cornerstone of society (another is mutual care). But knowledge alone is insufficient because it has many corrosive negative forms. It must be tempered, backed up by moral values which select knowledge. Good, nurturing knowledge must be robustly defended against its weaker, destructive forms which congeal around power. Defence of true knowledge by enlightened disputation is what we have lost in universities, through a deliberate and systematic devaluation of humanism.

The appearance of "AI", which is an attempt to reframe all digital technology for control and automation, is merely a symptom that highlights a tragic loss of confidence we all have in the human project. It is also a symptom of the loss of direction we have as technologists. Its essence is political control via "fakeness" and individual aspirational thinking. We made science into reality TV, a "Technology Love Island" for celebrity billionaires.

Ironically, the people who claim themselves hard-boiled realists from the iron-clad camp of STEM, become dreamers, fantasists and "visionaries" infected by wishful thinking about the future of humanity and the role of technology in it. Such people should never be let within a clear country mile of serious core social structures like education, health and government which require the genuine hard realism that comes with broader philosophy.

Burnout

People in cybersecurity and in academia are burning out at a frightening rate. I quit my post at a provincial department because my daily experience of unconscionable abuse of the students got too much. I did not recognise at the time that I was suffering moral burnout. I had become "part of the problem", someone complicit in ruining the lives of young people by colluding in a ruse. Selling them lies. It is the worst thing any real teacher (concerned with the growth and well-being of others) can experience. A deep moral injury. I left at the height of my intellectual powers, at the time I had the most to give back to society. I did so because I could find no leadership at all amongst the ruins of academia. Any peers or bosses I could respect had already quietly quit and joined the Great Resignation.

Now I've had time to reflect, rehabilitate and talk to other refugees it's so much clearer how the whole structure and management of higher education has been manipulated to drive-out people like us. Everyone with a shred of morality or pride in their work became "unmanageable". I am very proud indeed to be unmanageable to such derelict values and never again will set foot in toxic institutions.

I just wish there were a way to get the message out to young people that they will not find devoted tutors, mentors, coaches, guides, or anything much of value in the UK university system any more. You'll be better off with some old books, self-discipline, a basic search engine like DuckDuckGo, and a slop blocking filter. That's my professional opinion as someone who loves the university system and spent 30 years in it. Ignore at your peril.

Can we regain control?

A problem with the one-dimensional, hopeless philosophy of technological determinism is that it presents a paradox. A goal of science is to obtain predictive power, and we do so to gain control over our environment. The determinists then claim we have no power over science and technology itself.

I think it's because those who gain control do not know what to do with it. Control demands discernment. And since they have none, they wish to give control to a machine as a way to 'bank' it.

This is the road to learned helplessness and obsequious deference. I think that universities like Staffordshire feel they "have no choice" but to "embrace AI" because their leadership has bought into this weak credo.

So what should universities like Staffordshire do? Doubling down and defending trashy cheapness is face-palm cringe! I've seen those sorts of videos, and it's something your soul can never unsee. No one is going to gracefully smug-out the falling edge of the "AI" bubble. Useful or not, tools have their place.

I suspect Staffs will miss an opportunity, since it lacks the courage and foresight, to publicly renounce "AI", remove it from every classroom, office and laboratory, and jump up 20 places in the university league tables. They'd have first-mover advantage in a domino cascade refreshing the legitimacy of some universities. What better way to send a clear message to Keir Starmer, Rishi Sunak, Nick Clegg and the other tech-impostors? Conspirators of the Bletchley Park sham "safety summit" (a stain on the reputation of Bletchley Park) have erred and led us all down a cul-de-sac to backwater mediocrity.

Barring some paradigm-shaking mathematical upset from Oxford or Cambridge, we have utterly missed the boat as AI leaders. No, Britain's future lies not in its use of "AI" but in its response to it as a mature nation. Only by concerted public will might we see Britain lead a graceful, principled reversal on risking our entire economy on the misadventure of shoehorning "AI" into critical systems.

Let those obscene university course fees pay for some actual lectures. For goodness sake just buy some credible textbooks and have the students actually read them. No - there is not an environmental problem with paper, a renewable resource vastly less destructive than energy-guzzling data-centres. And if that worries you, earlier editions of Anderson's "Security Engineering" are FREE to download here. That would be a start!

Then, dump Microsoft, Google and all of those tired old intrusive, insecure, untrustworthy and quite frankly rubbish US tech products whose real purpose is US control and industrial espionage on the UK. Remember, we invented the computer (Babbage) and the entire theory of computation (Turing), so I'm sure we still know how to make and program some new ones to serve the people instead of terrorising and depressing them.

Britain could lead the world in computing again, if and only if it can find the moral courage to follow a different path to Silicon Valley and the mistakes of the past two decades.

If I may, I'd presume to offer genuine sympathies and apologies on behalf of my profession to the Guardian interviewee "James" who "wasted two years of life" on a programme delivered "in the cheapest way possible". You deserved more. You are like so many students who came through my classes, and the guilt still haunts me. There was once a time when I felt proud of what I did as a university professor. We are all victims of this war on humanity.

If I had the resources I'd personally love to offer those 41 students from Staffordshire University's failed course an opportunity to study with me for two years. We'd build systems from scratch, hack and harden them, right down to the bits and bytes of network packets and memory. I'd teach them proper cybersecurity, stuff that would turn hairs white and change their outlook on business, people, society and governance. But the spreading of empowering knowledge now meets much resistance. The obstacles to setting up a school are formidable. I'm not sure I have the energy for it any more. But best of luck to those chaps and ladies.
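
To give a flavour of what "right down to the bits and bytes" means, here is a tiny, self-contained example of the kind of exercise I have in mind: decoding an IPv4 header by hand from raw bytes. The packet is invented purely for illustration and comes from no particular course or product.

  # Decoding a (made-up) 20-byte IPv4 header by hand - the sort of first
  # exercise that turns network packets from an abstraction into something real.
  import socket
  import struct

  raw_header = bytes.fromhex(
      "45000034"   # version=4, header length=5 words, DSCP=0, total length=52
      "abcd4000"   # identification, flags=DF, fragment offset=0
      "4006"       # TTL=64, protocol=6 (TCP)
      "0000"       # checksum (left as zero in this invented example)
      "c0a80001"   # source address      192.168.0.1
      "c0a80064"   # destination address 192.168.0.100
  )

  (ver_ihl, dscp, total_len, ident, flags_frag,
   ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw_header)

  print("version:", ver_ihl >> 4)
  print("header length:", (ver_ihl & 0x0F) * 4, "bytes")
  print("total length:", total_len)
  print("ttl:", ttl, " protocol:", proto)
  print("src:", socket.inet_ntoa(src), " dst:", socket.inet_ntoa(dst))

Once a student has unpacked a header like this by hand, firewall rules, fragmentation tricks and spoofed addresses stop being abstractions.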

However, at The Cybershow we know there are still many people out there who feel passionately about building a humane, sustainable technological society. People who understand, support and agree with us. If any Staffordshire students or lecturers, or students from other universities, want to join us on the Cybershow to talk about experiences of poor quality education, or the inappropriate imposition of "AI", come and air your voice; we'd love to hear from you.



Date: 2025

Author: Dr. Andy Farnell

Created: 2025-12-03 Wed 12:42
