Chief Information Security Officer
In this CISO Conversation, the NTSC chats with Eric Seagren, Chief Information Security Officer of Oceaneering, about the shift in the CISO's role from a technical to a board-level skillset, the need to discuss cybersecurity risk with the rest of the organization, and ways to make public-private cyber threat intelligence sharing easier.
A significant change in 2018 was the new NIST requirements for government contractors. Since taking effect in the New Year, those requirements have made security compliance and best practices a bigger factor in bidding on and performing government work. In reality, I'm sure they helped, but I'm not sure they had the desired impact. That's because the management of those requirements is still immature. I've heard within the industry that while a company's security posture may factor into the bid process, it's hard to understand how. Government agencies appear to largely lack ways of incorporating these new requirements into their existing management processes, and there is currently no standard for factoring a government contractor's NIST compliance posture into the bidding process.
In the private sector, I think overall cybersecurity awareness has improved. At the executive level, people understand the importance of cybersecurity across different business functions and we've taken steps forward so that awareness at least exists, which has not always been the case. At this point, it seems most everyone understands the importance of cybersecurity to an organization. Worst case, even if no one actually became more secure in 2018, awareness has definitely improved.
I think we're going to see a continued shift away from the technical aspects of the CISO role. To a large extent, we're already there. Many CISOs have people under them serving as subject matter experts in various areas of security. I think we will see a continuing trend where the CISO role becomes more high-level, with a focus on cross-discipline and business knowledge and an emphasis on soft skills. In other words, a role that looks far less technical than it does even today. A CISO doesn't necessarily have to understand all the moving parts of information security and how they work under the covers. They just need to understand how to conceptually frame those moving parts for the other executives, how the parts work together, and the ROI they deliver to the business.
Think of it like talking to a doctor. If a doctor starts talking medical jargon, it doesn't make any sense to the patient. Similarly, the CISO is really there to translate security into plain language for non-security executives so that informed decisions can be made. Ultimately, all security is about quantifying risk, right? As CISOs, we're in the business of risk management. Nothing is ever 100%. It's not black and white. All we're doing is trying to make judgment calls about the ROI of risk mitigation investments.
Cybersecurity-related risk falls under my purview, and I just need to translate it into terms others understand. In most cases, I don't have to route that through a separate risk and compliance group that lacks subject matter expertise in cybersecurity. The average auditor generally has an accounting and finance background rather than a technical one. In my experience, most interactions involve cybersecurity providing subject matter expertise to the risk or compliance groups rather than the other way around.
When you start talking about GDPR and data privacy, you're involving compliance groups. Risk and compliance departments are the ones usually signing off on privacy regulations and then gathering input from technical people in IT. In my experience, the people checking the compliance box often don't understand how the systems that process PII work under the covers.
For example, if you have an HR system, HR is going to be very focused on the security roles within that system. That's fine, but their scrutiny is probably going to stop there, because they don't understand all the moving parts. That HR system probably leverages a database, with database administrators who have access to the PII. What about the PII as it lives within your backup system? Who has admin access to your SAN or VMware environment? All of this could expose PII. So, an auditor will say, “Let's do an audit of everyone's roles within the HR system.” Afterward, they say, “Okay, we're done. We're compliant.” But in reality, a long list of systems and processes with access to the same PII might not be in scope.
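The scoping gap described above can be made concrete with a small sketch. Assume a hypothetical inventory that maps each system to the data classes it can reach and whether the audit covered it; every name and field below is illustrative, not drawn from any real tool. The point is that a data-centric query surfaces every system sharing the same exposure, not just the one the auditor looked at.

```python
# Hypothetical inventory: which systems can touch which data classes,
# and which of them the most recent audit actually covered.
inventory = {
    "hr_app":        {"access": ["employee_pii"], "audited": True},
    "hr_database":   {"access": ["employee_pii"], "audited": False},
    "backup_system": {"access": ["employee_pii"], "audited": False},
    "san_admin":     {"access": ["employee_pii"], "audited": False},
    "crm":           {"access": ["customer_pii"], "audited": False},
}

def audit_scope(inventory, data_class):
    """Every system that can reach the given data class."""
    return sorted(
        name for name, meta in inventory.items()
        if data_class in meta["access"]
    )

def audit_gaps(inventory, data_class):
    """Systems with access to the data that the audit never covered."""
    return [
        name for name in audit_scope(inventory, data_class)
        if not inventory[name]["audited"]
    ]

# Auditing roles in the HR app alone leaves the database, backups,
# and SAN admins - all with the same PII exposure - out of scope.
print(audit_scope(inventory, "employee_pii"))
print(audit_gaps(inventory, "employee_pii"))
```

The idea is simply to scope audits by where the data lives rather than by which application owns the front end.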
As with the NIST standards and awareness I mentioned earlier, I think we need the same with privacy compliance. I don't think many people looking at PII and privacy concerns even understand where they need to be looking. A huge gap exists in terms of awareness, and it's not anyone's fault. I'm not throwing them under the bus. It's just that they don't know what they don't know. That needs fixing if we're going to get closer to real compliance.
We need education and awareness for the people performing the assessments. These auditors need a much deeper understanding of the technical landscape in order to effectively assess compliance.
I've seen public-private cyber threat intelligence sharing not work. I have heard many complain that the “information sharing” is entirely unidirectional. And sometimes, when there is actual sharing of data, what the government provides is so heavily sanitized that it has no value.
First, those participating need information back from the government that is rich enough to be actionable and of value. If you can't give me something actionable, it’s all but useless. I've seen cases where someone from the government sector said, "We saw something somewhere that might indicate malicious activity, but we can't tell you what, where, or how." What am I supposed to do with that?
Second, we need to make the process of information sharing as easy as possible. In most cases, the people trying to share IOCs or other threat data have a day job, and their obligations are to their employer, not to providing the government with data. To this end, the process should be as simple as is technically feasible. If it's not an easy system, all these efforts are moot and it's not going to get used. Make it easy for people to contribute, especially because they're going above and beyond to share cyber threat intelligence and they need to feel like they're getting something back.
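As an illustration of what low-friction sharing can look like, here is a sketch of a one-call helper that wraps a raw IOC in a minimal indicator object loosely modeled on the STIX 2.1 format commonly used for threat intelligence exchange. This is a simplified illustration, not a full STIX implementation, and the helper itself is hypothetical.

```python
# Sketch: wrap a single IOC in a minimal STIX 2.1-style indicator.
# Field set is simplified; make_indicator is an illustrative helper.
import json
import uuid
from datetime import datetime, timezone

def make_indicator(ioc_value, ioc_type="ipv4-addr", description=""):
    """Package one IOC (e.g. an IP observed in an attack) for submission."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "valid_from": now,
        "pattern_type": "stix",
        # STIX patterns match an observable property against a value.
        "pattern": f"[{ioc_type}:value = '{ioc_value}']",
        "description": description,
    }

# One call to contribute an IOC - the lower the friction,
# the more likely a busy analyst actually shares it.
print(json.dumps(make_indicator("198.51.100.7", description="C2 beacon"), indent=2))
```

The design point is the interface, not the format: if contributing an indicator takes one call rather than a form and an approval chain, busy practitioners will actually use it.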
If I had an information sharing system that was both easy to use and provided my team with real, actionable threat intelligence in return, I would try very hard to leverage it.
I see a lot of resumes, and only about one candidate in 50 is a woman. Either women's resumes are not making it through the screening process to reach me, or women aren't applying in the first place. Obviously, this is part of a long-standing issue with women not entering IT overall, not limited to a shortage in cybersecurity. And it's still happening while we're looking at a big talent shortfall in cybersecurity, which is a really hot field and is going to stay hot for quite some time. I think the problem originates with a need for more STEM education starting in K-12, which is a much larger problem than just talent shortfalls within cybersecurity.