Why Cybersecurity Requires Process Engineering
Surviving Armageddon
For those of us who follow cybersecurity trends, this is a period of struggle and angst. I don’t remember it being quite so grim before. Wherever you get your information, even if you don’t work in cybersecurity, the feed is full of stuff going wrong. Do you see what I’m seeing?
- Increased volumes of attacks, not just ransomware but general data theft and other mischief, impacting multiple organizations simultaneously
- Attacks that have severe consequences — loss of life, business-killers, critical infrastructure impact
- More government regulations — mostly quite sensible, but landing on organizations without the means to comply
- Cybersecurity layoffs — at a time when companies need security more than ever, heads are rolling in the name of efficiency, not risk management — a bill that will come due whether the security teams are still there to support the business or not
- New company liabilities that land directly on the Chief Information Security Officer — who often lacks the liability protections of other c-suite members
This isn’t just a warning shot across the bow; it feels like we are in a full-on war: attackers versus defenders, regulators versus companies, security teams versus business leadership.
“The more technological a society is, the greater the security gap is.” — Bruce Schneier (2012)
In cybersecurity, our focus has historically been on defending against “the attacker”. We’ve put all our energy into defending against the TTPs (tactics, techniques, and procedures) external attackers use to infiltrate our systems and cause harm. Our boards and c-suites ask what we’re doing to protect against these external threats. Our security leadership puts resources toward identifying IOCs (indicators of compromise) to spot more quickly when something is going wrong, the security vendor ecosystem spends billions on protection, detection, and response capabilities, and we celebrate security researchers who ethically identify and notify companies about zero-day vulnerabilities. Great.
Except it’s not great. Companies, even super-rich-have-all-the-resources companies, are falling victim to external threats (cough, Microsoft, cough). No matter how much we detect, something has slipped through our defenses and lies dormant, waiting to strike (or is already striking). Security leaders strategize about how to tell their leadership that they need more resources AND that no amount of resources will guarantee against a breach. We’ve been doing this for DECADES, and improvement has been measured in inches while our foes run marathons. Something isn’t adding up.
“We have met the enemy and he is us.” — Walt Kelly (1970)
Recently, the industry has thrown in the towel and recognized that all our detection/defense tech isn’t doing the job we want it to. Instead, we now exhort our companies to be “resilient” (where I started my career 20+ years ago). Bad Stuff will happen, but we need enough defenses in place that when things go badly, they don’t go SO badly that the damage is catastrophic. We’re trying to normalize security incidents so they don’t impact our share prices or our critical business functions (too much). It’s a pragmatic approach to the current landscape, for sure.
The problem occurs when a single cyber incident can cause devastation, not just at a single company but across its ecosystem of suppliers, employees, customers, and government entities. As the sophistication of attackers increases, so does the likelihood of a devastating singular event. We’ve seen glimpses of it in previous events like WannaCry. We haven’t learned.
To quote Bruce Schneier (again), cybersecurity is a process. Not a tool, not a policy, not a right… a PROCESS. Does it use technology? Absolutely. Can we “do” cybersecurity without understanding technology? Absolutely not. But to think that the solution to our cybersecurity risk is more technology is missing the mark.
What do we know about processes? We know that every process, somewhere along the way, interacts with HUMANS. Even generative AI. Most of the attacks we are seeing aren’t “sophisticated”. Instead, they trick a human or two (a help desk operator, a systems administrator, a financial analyst, a CEO, etc.), and all those defenses are for naught. Security pros have been onto this for a while (and BTW, I’m not blaming the HUMAN — I’m blaming the process). We’ve been trying to plug this hole with security awareness training (and we all love security awareness training) for years. Has it had impact? Yes. Has it been successful? No.
Instead of solely focusing on external attackers, it’s time to look at our internal processes (dare I suggest process engineering) as individuals, companies, and societies, and see where they are leaving us vulnerable to a doomsday attack (or a small compromise). Consider:
- The process of incident reporting — how many regulatory agencies and state/local stakeholders impose differing reporting requirements that pull resources away from defenders?
- The process of onboarding/offboarding employees/contractors/partners — who owns the end-to-end process, and why isn’t it security?
- The process of board oversight — is it really feasible to have the audit committee, or the technology committee, understand systemic cyber risk and actually govern it?
- The process of cybersecurity funding — is it OPEX, CAPEX, COGS, or something else? If it’s “everyone’s business” to care about security, why isn’t it funded that way?
- The process of education — how are we training our students to be digital citizens who understand what cyber-hygiene looks like? How do we train developers to write secure code, and ensure that the code they write is secure?
- The process of enabling security in commercial apps — why is it easier to do insecure things than secure things, and why do our product managers put more effort into making apps “usable” in all areas except security?
I could go on; I’m sure you have something to add to the list, too.
When we talk about cyber careers, we talk about technology roles (a variation of engineering) and “non-technology roles” (a bullshit term, but it’s code for “Governance, Risk and Compliance”-type roles). Perhaps we should talk about a third kind of role: Process. Not only would these roles look at all of a company’s processes to find ways to make them more secure, they would also look at macro-policies to ensure the entire societal ecosystem is moving in the same direction.
Because whatever we’ve been doing up until now hasn’t worked — isn’t it time to try something new?