Daily business in many companies often starts with a fresh compliance audit. Everything seems to be measured, documented, regulated. It’s secure. But is this a sufficient security strategy? Based on my experience in regulated sectors like banking and insurance, I would suggest this feeling of being secure is a Trojan horse. This will be a rant – discussion appreciated.
But the feeling!
As said before, most compliance strategies consist of regulating and documenting “things”. Strict processes, gateways, the Law of Demeter (LoD) and the need-to-know principle are fundamental pillars of every well-intentioned compliance story. Things are written down, mostly on printed paper, and forwarded to the commander(s) in chief. Your own responsibility for implemented security, and any grievance about “everybody is doing it wrong, but compliant”, is conveniently moved away from your desk. It’s now in the SEP field, as Douglas Adams postulated – you passed the buck.
Daily business and the compliance officer’s bucket
But wait! We all do DevSecOps, don’t we? We feel responsible for the things we do! We care! So, given the DevSecOps cultural change in most businesses, why don’t we care about reintegrating compliance into this culture of feedback, discussion and caring? It seems odd to have a compliance officer running around, not involved in the actual teams, not discussing her findings and requirements with them. This officer could even ask the team to automate her key measures and melt down the pile of paperwork. But… from the compliance officer’s viewpoint, this does not feel safe. It could be dynamic, it could be fast, it could be paperless. But in view of the long-lasting paperwork for compliance re-audits – which will happen a year later – this optimization of feedback and workflow does not feel secure. Why? The buck could not be passed! The feedback cycles, provided by and to the team in such a fast, feedback-based process, would require dedicated decision-making based on measurements. Decision-making does not feel secure; decision-making only feels secure when it’s done by others. Those who apparently hold the buck.
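To make “automate her key measures” concrete: here is a minimal sketch of a compliance control expressed as a pipeline check instead of a form. The control names, thresholds and config format are invented for illustration; a real ruleset would come from the compliance officer herself and live in version control, right next to the code it judges.

```python
# Hypothetical sketch of a compliance "key measure" as an automated check.
# Control names, thresholds and the config format are made up for illustration.
import json
import sys

REQUIRED_CONTROLS = {
    "tls_min_version": "1.2",    # e.g. "encryption in transit"
    "audit_logging": True,       # e.g. "traceability of admin actions"
    "password_min_length": 12,   # e.g. "authentication policy"
}

def check(config):
    """Return human-readable findings; an empty list means 'compliant today'."""
    findings = []
    if config.get("tls_min_version", "1.0") < REQUIRED_CONTROLS["tls_min_version"]:
        findings.append("TLS minimum version below 1.2")
    if not config.get("audit_logging", False):
        findings.append("audit logging is disabled")
    if config.get("password_min_length", 0) < REQUIRED_CONTROLS["password_min_length"]:
        findings.append("password minimum length below 12")
    return findings

if __name__ == "__main__":
    # The team keeps its effective configuration in version control;
    # the pipeline fails fast instead of waiting for next year's audit.
    with open(sys.argv[1]) as config_file:
        findings = check(json.load(config_file))
    for finding in findings:
        print(f"NON-COMPLIANT: {finding}")
    sys.exit(1 if findings else 0)
```

Run something like this on every build and the “paperwork” becomes a red or green pipeline that is ready for the re-audit at any time.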
Compliance failure and hiding things
In the DevSecOps cultural change at companies that successfully applied its principles, one of the key points was “embracing failure”. This seems odd to a compliance officer, as a failure in compliance is not accepted in most regulated companies. Be compliant or die! And as mentioned above, compliance decisions are always right – compliance never fails. When compliance fails, the project failed, not the compliance regulations. There is no feedback. It’s a one-way show. So what happens if a compliance rule is not met? It’s mostly hidden “in the noise”, in the hope that an auditor will get distracted. It’s not addressed openly and transparently. This is contrary to the culture of DevSecOps, and even more against the culture of InfoSec and security in general. Every issue that’s hidden is a possible real risk – not the abstract kind of risk named in compliance documents, but a “real” real risk. It’s the kind of risk that takes you to an InfoSec breach. But we’ll pass the buck… for the feeling of security.
Blame compliance and do real security?
In real security, every measure and every action is based on transparency and the enforcement of visibility. If I can’t see a security incident happening, I’m asking for more of them. But compliance took care of that: they measured the risk, the probability of occurrence, the amount of damage and the systems that are affected, right?
What risk, anyway? How much of a risk is it if your front-zone DMZ webserver gets breached? Not much, one might think. We have multiple DMZs, firewalls and hideous network subnetting to distract any given burglar. Compliance documented every single system well and mangled it through the whole process. But what about lateral movement? Or the breach of the connected database, sitting right behind the first DMZ? Or the other adjacent systems? Is the risk added or multiplied by those systems? What’s the probability of a breach occurring? How do you base a decision about probability on an unknown quantity? Do you have any baseline data with statistical significance about breaches caused by a given CVE or entity? If so, please send me that data! It’s worth more than next week’s lottery jackpot numbers…
What about the “six-month-old software component” containing several CVEs? They are not seen because compliance doesn’t care. At the time the software went into production, it _was_ compliant! We already passed the buck! We will come back before the next “release”, in six months.
This is neither DevSecOps nor security; it’s the documentation’s standpoint, based on the sole date of a release. Nothing more, nothing less. But DevSecOps won’t stand still. In well-played DevSecOps environments, the next release will happen in about 20 minutes. Will you be compliant, by any chance? No, because you assessed the wrong assets. You judged the compliance of your system while completely ignoring time. But there is no compliance TARDIS. Time moves on, and so will your attack surface and the CVEs targeting your software.
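This is exactly the gap a continuously running dependency check closes. Below is a minimal sketch, assuming the public OSV.dev query API and a hand-written package list that stands in for a real SBOM or lock file; it re-evaluates the “compliant at release time” system on every run instead of once per release.

```python
# Minimal sketch: re-check shipped dependencies against a public vulnerability
# feed on every run, not once per release. Uses the OSV.dev query API; the
# package list is a stand-in for whatever your build actually ships
# (e.g. parsed from a lock file or an SBOM).
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

def known_vulnerabilities(name, version, ecosystem="PyPI"):
    """Ask OSV whether this exact package version has published advisories."""
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    request = urllib.request.Request(
        OSV_QUERY_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    return [vuln["id"] for vuln in result.get("vulns", [])]

if __name__ == "__main__":
    # Stand-in dependency list; in a pipeline this would come from the SBOM.
    shipped = [("requests", "2.19.1"), ("django", "2.0.0")]
    dirty = False
    for name, version in shipped:
        advisories = known_vulnerabilities(name, version)
        if advisories:
            dirty = True
            print(f"{name} {version}: {', '.join(advisories)}")
    raise SystemExit(1 if dirty else 0)
```

Put this into a scheduled pipeline and “it was compliant six months ago” turns into a statement that is re-proven every single day.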
Static compliance and pelican crossings
The perception that compliance implements security is biased. It’s delusive security, the “feeling” of security. Like the “feeling of control” when you hit the button at a pelican crossing. Those buttons really make a difference, right? So why blame compliance? The buck is passed; they’re only documenting.
Most of the InfoSec breaches of the last decade happened because of ignorance or the compliance “no risk” or “we take that risk” policy. Why do people decide that way? Most of the time, when people deal with unknown unknowns like probabilities, damage sizes, or the impact on the business and adjacent systems, they tend to underrate the risk, because they can’t comprehend, and therefore underestimate, the full complexity of the system(s). Or the measures to mitigate the discovered (and mostly scary) risk are so costly that they would impact the business itself, or at least the current project. So people pass the buck with those decisions. What I can’t see (and immediately grasp) doesn’t exist! A “risk in the future” isn’t real to the human brain. It feels safe, it’s far away! But this is even worse than doing no compliance at all: it’s an act of negligence. Like Equifax.
Ostrich mode enabled!
“We are not a target” is an even worse kind of thought in the compliance sphere. So why is this sentence heard at all? It’s about the complexity of reality. Hardly anyone can imagine all the breach scenarios. Just take a few threat actors, such as secret services, organised criminals, hacktivists, sc1pt k1dd1s, insiders, automated malware, the famous security researchers or even a random discovery, and their motivations, like boredom, political agenda, personal profit, collateral damage, financial profit or bad reputation. Just play the combination game. How could anyone with a decent level of sanity “take that risk” or declare “we are not a target”? Most of the time, when we do InfoSec in such areas, we hear the panacea of sanity: “But we do compliance!”
To the rescue – klaatu barada nikto
In such environments it’s hard, if not impossible, to change the way of thinking. But how could we move on? It’s as simple as that:
- Hope is not a strategy, concentrate your security on real risks, time-based, fast
- Remove risks instead of mitigating them, don’t wait for compliance to happen
- Implement “evil stories” in your daily work (see the sketch after this list)
- Stay in control – implement visibility, traceability, and capability
- Never “take risks” – NEVER! EVER!
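An “evil story” is a user story told from the attacker’s perspective, and it only earns its place in daily work when it runs as an automated test. Here is a hedged sketch in pytest style using the requests library; the base URL, endpoints and credentials are placeholders, not a real API.

```python
# Hypothetical "evil stories" as pytest-style tests. BASE_URL, endpoints and
# credentials are placeholders; the point is that the attacker's perspective
# is exercised on every commit, not once per audit.
import os
import requests

BASE_URL = os.environ.get("EVIL_STORY_TARGET", "https://staging.example.internal")

def test_admin_api_rejects_unauthenticated_requests():
    # Evil story: "As an anonymous attacker, I want to list all users."
    response = requests.get(f"{BASE_URL}/api/admin/users", timeout=5)
    assert response.status_code in (401, 403), "admin API reachable without credentials"

def test_cross_tenant_access_is_blocked():
    # Evil story: "As tenant A, I want to read tenant B's orders."
    session = requests.Session()
    session.post(f"{BASE_URL}/login", timeout=5,
                 json={"user": "tenant-a",
                       "password": os.environ.get("TENANT_A_PASSWORD", "placeholder")})
    response = session.get(f"{BASE_URL}/api/orders/42?tenant=b", timeout=5)
    assert response.status_code in (403, 404), "cross-tenant access was not blocked"
```

The moment an attack path opens up, an assertion fails and the pipeline goes red – exactly the fast, time-based feedback the list above demands.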
Base your daily work on the assumption that a breach will happen, regardless of the risk the compliance process assigned. Assume you can only influence the containment and the possibility of lateral movement. Focus on your input and output interfaces; 80 % of breaches are based on the misuse of those. Build your “risk awareness” around your real assets: your people. Enable them to do security-related work, enable them to stand up to compliance officers. Enable them to care.
And last but not least, enable your compliance strategy to keep up with the fast-moving DevSecOps feedback cycles. Do DAST, do SAST, build your Red Team!
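As a starting point, here is a minimal sketch of wiring SAST and DAST into the same fast feedback loop as everything else. The tool choices (Bandit for SAST, the OWASP ZAP baseline scan for DAST), the Docker image and the staging URL are assumptions for illustration, not a product recommendation.

```python
# Minimal sketch: SAST and DAST as ordinary pipeline steps. Tool choices
# (Bandit, OWASP ZAP baseline scan), the Docker image and the target URL
# are assumptions; swap in whatever fits your stack.
import subprocess
import sys

def run(command):
    """Run a command, echo it, and return its exit code."""
    print("+", " ".join(command))
    return subprocess.call(command)

def main():
    results = []
    # SAST: scan the source tree on every commit (Bandit is a Python SAST tool).
    results.append(run(["bandit", "-r", "src/", "-ll"]))
    # DAST: baseline scan of the freshly deployed staging system with OWASP ZAP.
    results.append(run([
        "docker", "run", "--rm", "-t", "owasp/zap2docker-stable",
        "zap-baseline.py", "-t", "https://staging.example.internal",
    ]))
    return 1 if any(code != 0 for code in results) else 0

if __name__ == "__main__":
    sys.exit(main())
```

Both tools return a non-zero exit code on findings, so the build breaks with the same mechanism as a failing unit test – no paper form has to wait for the re-audit.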
You will be a rebel! Embrace it!
Blog author
Kevin Wennemuth
Head of IT Security