Security Scarecrows: Why cybersecurity without pentesting is like piloting without a plane.

pilot.jpg

Figure 1: "My other plane is at my mom's house"

How would you like a job as:

  • A data administrator with no data.
  • A mechanic without a spanner.
  • A builder without a hammer.

I could go on… but you get the idea. Cybersecurity without the tools for the job is an impossible and stressful task. But this is what's happening across the business sector in the UK. Why this madness?

Documented skills-gap

I've some personal experience of this, and it's something I hear ever more often on the security grapevine as the problem becomes commonplace.

The job market keeps growing as more companies advertise for cybersecurity skills. As more cybersecurity positions are created, employers expect to get computer scientists who've spent years at university or in senior IT positions. In reality, those kinds of people are already busy, in work, on much higher salaries.

This gap has led to renewed demand for education. We saw an increase in qualified cybersecurity personnel during the dark days of the pandemic, thanks to funding for open and free education. Even with that boost, the skills gap is still growing. Post-pandemic, as we head into recession, the take-up of paid-for education is slowing. This will widen the cybersecurity skills gap further as the jobs market expands.

graph.jpg

Figure 2: "Cyber security skills in the UK labour market 2023", Department for Science, Innovation and Technology, 2023.

A global quality issue

It's not only in the UK that we have this skills gap. Cybersecurity staffing is part of a worldwide skills shortage.

Despite the economic wish to replace humans with AI, cybersecurity requires a mindset that constantly escapes the existing parameters of technology and policy in order to keep pace with attackers. The skills gap will not be solved through new technology alone. Indeed, AI likely makes the situation much worse, but that's another story.

Not only do businesses struggle to find qualified staff, they also struggle with the technical competency of those who are qualified. A recent labour-market report found:

"49% of cyber firms have faced problems with technical cyber security skills gaps, either among existing staff or among job applicants" – UK department of science innovation, and technology, 2023.

The report also gave a breakdown of which skills were lacking, with security testing or pentesting in the highest demand. This shows that not only is there a dearth of graduates, but that those who do graduate often lack the real-world practical skills for the job.

Mismatch of desires

This would be bad enough, but HR departments are also famously awful at specifying job descriptions in technology. They ask for experience in software that doesn't exist yet, or that went out of date in the 1980s. They focus on named products and brands rather than skills and principles. For example, they fail to understand that someone with years of experience configuring pfSense packet filters, Linux iptables and Cisco routers will manage just fine with "Fortinet", despite having zero official "experience" of it.
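
To illustrate, here is the same policy, "allow SSH only from the internal network", in three of those dialects (the 10.0.0.0/8 internal network is a hypothetical example):

    # Linux iptables (first match wins)
    iptables -A INPUT -p tcp --dport 22 -s 10.0.0.0/8 -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j DROP

    # pf, the packet filter underneath pfSense (last match wins)
    block in proto tcp to any port 22
    pass in proto tcp from 10.0.0.0/8 to any port 22

    # Cisco IOS extended access list
    access-list 101 permit tcp 10.0.0.0 0.255.255.255 any eq 22
    access-list 101 deny tcp any any eq 22

The syntax differs; the principle, default-deny with an explicit allowance, is identical. Anyone fluent in one dialect can pick up the next in an afternoon.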

HR departments also ask for bizarre combinations of skills that one in a million people might possess. And they often just fail to understand what some skills are and how they relate to organisational tasks. Misunderstanding of penetration testing is almost guaranteed.

Consequently, companies often advertise for cybersecurity engineers under pressure, whether from looming regulation or because they've already experienced attacks. Only then do they realise the actual positions they have available to fill are IT help-desk and system administration roles - with an added contractual clause to "keep the company safe from cyber-attacks"!

The result is skilled staff who are hired but cannot exercise their talents. They must ask permission for everything, and generally face overwhelming obstacles to using their skills to fulfil their contractual obligations and meet their responsibilities.

Traditions of pentesting

In this essay I'd like to raise some questions such as:

  • What is it about pentesting that leads to such distorted trust relations?
  • Why is there seemingly such scant understanding of how it works?
  • Do the skills we teach on Ethical Hacking courses actually lead to real jobs?

Traditionally, pentesting has been done by outside contractors. This mirrors the working model of "perimeter security". If we are on the "Inside" and attackers are on the "Outside", then it makes sense within that paradigm to hire trustworthy outsiders to pretend to attack us.

This arrangement also offers a "neutral" actor, one whose supposedly disinterested position helps with some Principal-Agent and Insider Threat problems.

In martial arts we call it "sparring". We occasionally need benevolent opponents to test and strengthen us. But kung-fu is not fighting all day long. Much of martial arts is internal work and solo body-work. An organism needs exercise.

As security engineers in an organisation we don't sit around all day waiting to be attacked. Instead we try to fill our days with "what-ifs" and "let's try", exploring the edges of our security, testing and hardening it. This touches on the classic problem of the standing army. One cannot just "magic up" a defence when under attack and have it quietly disappear during "peace time". Defence must be maintained and remain prepared.

Sadly, as practitioners we frequently find ourselves sitting around doing jobs unrelated to security, heavily hamstrung by policy designed to keep a restive security team from "creating trouble". We need to ask permission and attend endless meetings despite having solid competence and surety that our routine work will not affect operations. This leaves no room for proactive security - perhaps the only kind that really has value.

It is frustrating for anyone contracted to do work "pro-active in name only", work where the responsibilities they’re required to competently discharge are blocked at every juncture. Resistance comes from many parts of the organisation, not just regular office workers for whom security means friction, but also from IT managers or executive officers who have "banned pentesting" as a unilateral policy.

A structural problem?

There is often a de facto lack of clarity around ranking and communication within infosec hierarchies. Cybersecurity sometimes sits in a quantum state somewhere between a quality role at the highest level of importance to the organisation and a janitorial job for geeks in hoodies, overalls and hi-vis.

For fresh cybersecurity recruits it's a jaw-dropping moment to hear, "This is your job, but minus the tools, support and remit to do it". As a new cybersecurity worker it may seem confusing and worrying, trying to find your place in a disorganised hierarchy. It is then shocking to watch those same decision-makers pay an external pentester eye-watering sums to do your job.

Sometimes all they get for their money is an auto-generated report from automated pentesting software - something an undergraduate can achieve in their first term. Sometimes you may be asked to "look over" and "add input" to the report, but again, without the means of verification that would count as reproducible evidence in a scientific study.
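
To give a sense of how low that bar is, a report of this kind can be generated with a single command (the target host here is hypothetical):

    nmap -sV --script vuln -oX report.xml target.example.com

One line of nmap: detect service versions, run the stock vulnerability scripts, write out an XML report.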

So what is going on with organisational trust relations here? If our own highly competent, qualified personnel sit on the sidelines, wasting their years of experience of keeping systems safe, there must be a reason. An analogy I often use is that of a security guard who is bound to their office chair with no possible way of checking that the doors are locked, the windows are shut and the alarm is on. Essentially, a security scarecrow.

guard.jpg

Figure 3: "I just wish I had something to do!"

With this security guard analogy, pentesting is the ability to check the locks on the doors, shut the windows and test the alarms, in a digital sense. When management takes away our flashlights, radios and master keys, they injure our eyes and ears, taking away our ability to keep their systems safe. There is only so much that reading security reports, categorising risks and writing new reports (that nobody will read) can do against a determined attacker.
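
What does "checking the locks" look like digitally? Often something as modest as the sketch below: a few lines of Python that ask whether ports our policy says should be closed are in fact accepting connections (the hostname and port list are hypothetical):

    # Rattle the door handles: is anything listening on ports that
    # policy says should be closed?
    import socket

    SHOULD_BE_CLOSED = {23: "telnet", 3389: "rdp", 5900: "vnc"}

    def door_is_open(host, port, timeout=2.0):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port, name in SHOULD_BE_CLOSED.items():
        if door_is_open("host.internal.example", port):
            print(f"unlocked door: {name} ({port}) is accepting connections")

This is the digital equivalent of walking the corridor and trying each handle - exactly the kind of routine check a security scarecrow is forbidden from running.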

I’ve been doing some thinking and digging in response to my own experiences of being made a "security scarecrow".

It's a reality that some organisations are hiding corruption or plainly unethical behaviours. They're worried about an internal security audit uncovering something. And if guilty, they are right to worry, because we will find things that raise serious questions, and we will progress them, as I have myself. I imagine every other vigilant security engineer has done, or will do, during their career.

Using an outside team is a way of sanitising and containing security routines that are now required by law in many businesses. After experiencing this first hand I have spent the past year discussing the issue with directors, technical managers, IT managers, security professionals, academics and cybersecurity engineers who have experienced or pondered this.

Egoless security?

My conclusion is that there are causes rooted in combinations of incompetence and ego, in organisational misalignment and misunderstandings around the role of cybersecurity teams.

Decades ago, with the publication of "The Psychology of Computer Programming", Jerry Weinberg gave us "egoless programming", which had a massive impact on software engineering. It's a credo that encourages developers to embrace a humble and collaborative approach. Sadly, at the organisational level, no such philosophy has yet touched security engineering.

There is often a mismatch of views between the new(ish) technical field of cybersecurity and traditional IT management. Though some IT managers are indeed computer demi-gods, and some IT administrators could write an operating system, there is a tension of values between IT and cybersecurity. This is where we need to do the most work to get constructive communication, if not alignment.

Despite the creation of many different offices, tiers and status hierarchies, the question of where, overall, to put cybersecurity within organisations remains muddy. Cybersecurity is actually a wide quality issue that touches much more than just "information". It is a proactive and interpersonal role. It spans all levels of an organisation, from the psychological safety of other employees to the physical security of hardware in a rack.

So much for misalignment, which is a symptom. Returning to the question of trust, which gets closer to the cause, let's visit the matter of ethics and "optics".

Certainly it's possible that unethical activity is afoot, as the ongoing British Post Office scandal reveals. Within systems of business there's the ever-troublesome PR motive for burying negative exposure, and the desire of legal teams to minimise liability risks.

These weigh heavily on cybersecurity decisions. Cybersecurity is a high-risk, long-term affair that can be kicked down the road by C-suite executives who expect to be long gone by the time it bounces onto someone else's watch.

Avoiding visible short-term embarrassment gets put above long-term strategy. Sometimes the source of embarrassment may be merely a lack of investment and regulatory compliance. Companies don't want it to be obvious that they're not really spending on digital security.

Cybersecurity professionals pick computers and networks apart at the forensic level. We need to understand their full complexity, what is evident and what is hidden, and it often takes us decades of experimentation to get a handle on the complex tools and theories needed to do that. Ours is a truth-seeking mindset, the opposite of "bury the bodies and move on".

Cybersecurity is really a quality issue

This raises the question of whether cybersecurity even belongs adjacent to IT at all. Should it not be a company-wide quality-management role?

IT intersects cybersecurity, but they are not the same. A good cybersecurity engineer should be excellent at IT, but there are many highly skilled IT professionals, at the top of their game in development, administration or operations, who lack up-to-date and effective cybersecurity understanding.

Like any other role that combines investigatory vigilance and analysis, such as detective work, forensic accounting or auditing, it requires authority, access and buy-in (trust).

This muddies whether cybersecurity is seen as a specialisation to be added to extant roles, or as an entirely fresh layer within the organisation. Consequently, plenty of toes get trodden on.

That was indeed highlighted as an issue across the entire workforce in the Skills Gap Statistics UK report, where we see that:

"92% of businesses say that having a basic level of digital skills is important for their employees".

Better quality through training

Of those same employees, almost three in five workers (58%) say their employer "has never provided them with training to improve their digital skills." This includes digitally skilled employees such as DevOps and IT staff. Very rarely are cybersecurity skills paid for as personnel development.

Basic training might seem better than no training, but it may create a false sense of security if teams are given only lightweight cybersecurity training, with no practical or technical component that makes sense within their own daily practices.

Not much sticky knowledge comes of the mandatory one-hour seminar that boils down to the rhyme:

"This is a virus, this is worm, Don’t get phished, or the company burns."

No pentesting. No fuzzing. No reverse engineering. No sniffing and protocol analysis. No spoofing. No exploitation. Not even the most tepid demonstrations of exploits. The list of what we should do but don’t do on even the shallowest cybersecurity awareness programme is long and dispiriting.
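
Even the gentlest of these could be demonstrated in minutes. A toy fuzzer, for instance, is a dozen lines of Python (the disposable lab service on localhost:9999 is hypothetical - something safe to crash):

    # A toy fuzzer: throw random bytes at a disposable lab service
    # and note the moment it stops answering.
    import random
    import socket

    HOST, PORT = "127.0.0.1", 9999  # a throwaway service in a lab

    def survives(payload):
        """Send one payload; return True if the service still responds."""
        try:
            with socket.create_connection((HOST, PORT), timeout=2) as s:
                s.sendall(payload)
                s.recv(1024)
            return True
        except OSError:
            return False

    for i in range(1000):
        junk = bytes(random.randrange(256) for _ in range(random.randrange(1, 256)))
        if not survives(junk):
            print(f"service stopped responding after {i} cases")
            break

Ten minutes with something like this teaches more about why input validation matters than an hour of slides.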

Such seminars add little to the set of skills really required to keep systems safe. It feels akin to the difference between an archaeologist and a metal detectorist.

This is concomitant with an institutional fear of all hacking tools, operating systems and practical tasks, and of the employees who have those skills. It is a manifestation of the old philosophical problems, "The Critic's Dilemma" and "The Sorcerer's Apprentice" (and similar tropes of Goethe's), that in systems-speak we call "systemic mistrust".

screenie.jpg

Figure 4: "How is airgapping working out?"

Common goals in security

Although everyone is aiming for the same broad goals – the security of important systems – there seems to be a dearth of joined-up thinking at the leadership level as to how cybersecurity fits into structures. Without that, there is little hope for buy-in.

Systemic mistrust works in all directions. The fear that an ethical hacker will steal data or corporate secrets is perhaps understandable, but only in a workplace that is already institutionally insecure and has low systemic trust.

That happens when organisations have high churn, poor wages and low commitment to employees - when they are, basically, poor-quality working environments all round. We should not suppose only small or badly run companies suffer from this. Many large organisations simply have a culture of being "mean and nasty", and this inability to cultivate or even value relational loyalty is their great weakness.

In reality, cybersecurity professionals mostly adhere to a strict ethical code guided by professional charters in engineering or science, such as the BCS, and often have financial and criminal-record checks carried out regularly as part of their employment. So why the mistrust?

For all of the reasons raised, a deeper cause of these issues lies in the epidemic mismatch of power and competency. In only a decade the entire world has realised it is perilously dependent on digital technology which is unsafe. Where power relies on technology but lies behind the knowledge curve, it feels insecure. Perhaps that is because the tops of large organisations are sluggish and staffed by people who are loath to delegate or devolve power. IT systems therefore become a proxy battleground for old rivalries and power-play.

As I said earlier, no competent and secure manager would, in their right mind, baulk at regular internal pentesting! It is the main tool for the cybersecurity task. From a purely economic standpoint, having technically skilled staff in-house cuts the security budget compared with hiring in endless consultants. So what politics are in the background?

Cosmetic security

Another angle on this is that cybersecurity staff are being hired as paid scarecrows for cyber-insurance policies and regulation. How many companies are working not towards safer systems but merely the appearance of safer systems?

Again, ego plays a pivotal part in the perception of pentesting. Pentesting is simply a method of testing a system's ability to withstand a cyber-attack and of highlighting vulnerabilities. These are found mostly in externally provided software, and where that software comes from big-tech vendors, nobody wants to ruffle the feathers of those contracts.

When pentesters find weaknesses, it is not an assault on a system design, the system designer or the system manager. We are not judging anyone in the process of pentesting. We are simply finding and closing, in an objective way, those open doors that make everyone's workplace insecure. If they happen to be the fault of a billion-dollar service provider, please don't shoot the messenger or ask us to "forget what we found".

Finally, there is the very troubling prospect that cybersecurity itself is being usurped by a "cyber-regulatory industry". Such a direction seeks to factor actual security out of organisational life entirely, replacing it with a matrix of mandatory oversights, audits, processes and "approved" software.

Such a state of affairs forgoes any actual security of the kind actively and autonomously chosen by organisations. It is superficially appealing to organisations wishing to wash their hands of cybersecurity costs.

Conclusions

Even in 2024 the skilled cybersecurity practitioner is still trying to find her position within organisational structures. If you find yourself in a similar frustrating impasse then I hope these reflections might help you to communicate your concerns and needs with others.

Ultimately, cybersecurity professionals are in high demand and the best advice might be that if you are wasting your skills then move to a company that better values pentesting, cybersecurity, openness, ethics and you as a quality-oriented person.

Document References

Cyber security skills in the UK labour market 2023, Department for Science, Innovation and Technology, 2023. https://assets.publishing.service.gov.uk/media/64be95f0d4051a00145a91ec/Cyber_security_skills_in_the_UK_labour_market_2023.pdf

Skills Gap Statistics UK 2023, Oxford Learning College, 2023. https://www.oxfordcollege.ac/news/skills-gap-statistics-uk/

Article by H. Plews MSc, MBCS, Company Director at Boudica Cybersecurity, Jan 2024.


Copyright © Cyber Show (C|S), 2024. All Rights Reserved.


Want to be featured? Have an idea or general query? Drop us an email: enquire@cybershow.uk

Date: 20 February 2024

Author: H. Plews
