For many, cybersecurity is seen as a cost center that reduces risk to the business. That view can be oversimplified to something akin to how HR reduces people-related risk, but cybersecurity comes with layer upon layer of complexity, ranging from technology to physical buildings and, of course, people. Regardless of organizational size, cybersecurity leadership requires a top-down approach, leaving room for discussion at the board level and alignment with business goals.
This week on AZT, Neal and I chat with Kris Lovejoy, Global Security and Resilience Leader at Kyndryl (the IBM spinoff), former CEO of Virginia-based BluVector, and a former IBM CISO who went on to become GM of its security division. Having danced the line between startups and mega-enterprise organizations, there are few others who could so adequately discuss the role of cybersecurity leadership within modern organizations and why having a competent person at the helm is critical to the business (not just to reduce risk). We also play a bit of RSA buzzword bingo.
Leadership Rolls Uphill
One of the more unique attributes of working in cybersecurity is that there are more unknowns than knowns. Older parents, senior staff in less technical roles, and many consumer groups don't have the level of exposure to technology that younger generations have, and are therefore at a disadvantage. This isn't to say there aren't exceptions, but there are memes about CEOs not being able to turn something into a PDF for a reason. Cybersecurity is also significantly more complex than other technology fields because the threats organizations face are constantly changing.
Philosophically, people know that companies need to secure their people, systems, and data. But as an infosec leader, much of the pressure comes from educating others, from the bottom of the organization to the top.
“The one common thing that I always come back to is this point of empathy. At the end of the day, whether you are in cybersecurity or any other industry, it's all about relating to people, understanding what people care about, helping people understand why they should care about something,” said Lovejoy. “And then, as a leader, having them follow you up the hill, even though it may not be something that they wanna do. So leadership is about that. It's about creating that empathy and getting them to move with you, rowing in the same direction, regardless of whether you're in a big company or in a small company.”
As Lovejoy noted, empathy is at the center of leadership. Understanding that people are at the core of cybersecurity is a critical focal point, and ensuring they know their role will always create a stronger defense. This falls in line with the Zero Trust concept of never trust, always verify, further ensuring that gaps are covered by people, processes, and technology that understand the risks at hand. However, empathy needs to be seen at all levels, and no one should be above the policies in place.
Someone in your C-suite wants admin access? How many developers have access to keys? Do your contractors have access to your shared Google Drive? Remember, it's bottom to top, top to bottom, with no exceptions. Access is always need-to-know. On Lovejoy's note, Neal muses on some inner workings that are all too common, where a CEO may fail a phishing simulation but not face the same kinds of consequences as an individual contributor or manager.
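That "no exceptions" principle can be sketched as a simple default-deny policy check that treats every request the same way regardless of rank. This is a minimal illustration, not a real access-control system; the roles, resources, and grants below are hypothetical, and a production setup would back this with an identity provider and an audited policy store.

```python
# Minimal sketch of a need-to-know access check (hypothetical roles/resources).
# Default-deny: access exists only if it has been explicitly granted.
ACCESS_GRANTS = {
    ("release-manager", "signing-keys"): True,   # the one role that needs keys
    ("developer", "shared-drive"): True,
    # Anything not listed here is denied, including the C-suite.
}

def is_allowed(role: str, resource: str) -> bool:
    """Return True only for explicit grants; everyone else is denied."""
    return ACCESS_GRANTS.get((role, resource), False)

# The CEO gets no special path -- same check, same default deny.
print(is_allowed("ceo", "signing-keys"))              # False
print(is_allowed("developer", "signing-keys"))        # False
print(is_allowed("release-manager", "signing-keys"))  # True
```

The point of the sketch is the default in the last line of `is_allowed`: absence of a grant means denial, so adding an exception requires an explicit, reviewable entry rather than an implicit privilege.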
“No one can hire enough humans to manage the complexity that most of our organizations have. So, by God, simplify the infrastructure so that you don't have as many people that have to manage it so that not as many people are gonna make mistakes. Because guess what? Technology plus people equals you're gonna make mistakes,” said Lovejoy.
Lovejoy acknowledges that organizations want the coolest, newest shiny objects, but this adds significant complexity, and cybersecurity or IT budgets rarely grow in parallel to support that level of infrastructure.
“And so we've gotta think about this in a profoundly different way. And again, going back to kind of that question of how do you navigate small or large organizations, at the end of the day, security is about people,” said Lovejoy. “It's about people that make mistakes, and to fix the security problem, we have to figure out how to address that. And again, tools are not necessarily the answer. Sometimes it's as simple as being empathetic to how users engage with your technology and simplifying that experience, and simplifying the underlying infrastructure so that the people that are managing it can do so in a more simple and effective way. I know that sounds easy. I get it, it's hard, but it's just looking at it from a different perspective.”
At RSA this week? Neal is floating around the Cyber Lounge if you want to say hi.
Leadership rolls uphill, and that requires empathy at all levels
How critical infrastructure funding is established and approved today remains a significant blocker to agility
Zero Trust is a mindset, and successful implementations of it are rare
There is a generational perspective divide when it comes to defining what role privacy plays in our lives
Weekly Cybersecurity Headlines and News
Most of the content about Zero Trust is opinion-based, but here are some impactful news stories from the past couple of weeks.
Could ‘Zero Trust’ Prevent Intelligence Leaks? | You should be side-eyeing this headline
This transcript was automatically created and is undoubtedly filled with typos. As usual, we blame the machines for any errors.
Elliot: Hello everyone, and welcome back to another episode of Adopting Zero Trust, or AZT. Uh, I'm Elliot, your producer, alongside Neal Dennis, your co-host, uh, and the mouth of our episode. And today we are going to be chatting about, uh, cybersecurity from the top down. Um, in a moment, uh, we'll introduce you to our guest, Kris, who works at one of the largest organizations that we will probably ever chat with, and certainly have to date, uh, about cybersecurity in general, which is just, uh, as Kris lovingly called it a minute ago, kind of a mind-boggling experience.
So it is, it's difficult to kind of wrap your head around, uh, how a 90,000-plus employee organization can, uh, properly prevent cyberattacks and build in these programs. Um, but that is what we're gonna be chatting about today. Obviously we have RSA coming up, and Neal will be there.
So we'll jump into that a little bit. But Kris, um, you know, if you don't mind, can you give us a little bit of background on your, uh, quite extensive LinkedIn, uh, resume?
Kris Lovejoy: Sure, happy to. And, um, Neal, Elliot, thank you so much for having me. I'm really, really excited about being here and just having, uh, pragmatic discussions, I guess, if we can say. Um, so, uh, yes, I am with Kyndryl, for those of you who don't know what a Kyndryl is. Um, about a year and a few months ago, uh, IBM split its services component off. And so that's about 90,000 people. And so we were spun off into a completely independent company. Um, our focus is, or historically has been, on IT outsourcing. Um, but we're moving, you know, or pivoting from IT outsourcing to more technology services. Um, and so what we're delivering, um, to the market is really management of mission critical systems.
So we do a lot of work with very large companies, you know, financial services, other critical infrastructure industries, um, in, you know, managing infrastructure, and that includes, you know, pretty much anything from mainframes to cloud infrastructure, refactoring applications, network transformations, um, you know, digital workplace experience.
You know, my role at Kyndryl is to do really two things. One is I am the leader of the security business unit, so that's the group that kind of builds and delivers security and resilience capabilities. And then second, I am, um, responsible for commercial risk. So, you know, all of the customers with whom we work, um, they have a set of contracts that we, you know, sort of adhere to.
Plus they also pose a risk to us. And so my team looks after sort of the implementation of control around those, um, organizations. And so, you know, just by way of background, I've been in this space 30-some-odd years and have been, um, on lots of different sides of this, uh, equation. Um, I've had the blessing of being at three startups, um, with successful exits.
I still don't know how that happened. Um, I've been a CISO, I was at IBM for, um, many years as the, uh, chief security officer, and ran the business unit. Had some other, um, stints along the way, but, uh, now I'm here at this 90,000-person startup, trying to keep my head from bursting into flames every single day.
So that's where I am.
Elliot: Love it. Uh, so as we had mentioned, that is, uh, obviously a very large organization, or startup now, since you've spun out from IBM in recent years. Uh, and having been exposed to IBM in general, I think you had a little break in there where you were kind of working with some other organizations.
But, um, what did that look like moving from those smaller orgs to now nothing more, nothing less than essentially a mothership of just people and infrastructure. Um, you know, how, how do you wrap your head around something so gargantuan and shifting from something smaller to larger?
Kris Lovejoy: You know, it's, it's funny. It is way different being at a smaller startup of, you know, 15, 20, 30 people than being at a 90,000-person startup. Absolutely. I think, interestingly, there is kind of an inverse relationship between company size and drama. So when you're in a smaller company, you tend to focus, you've got like a, you know, a silo of focus.
You focus on one thing, and the people around you, you tend to get really, really close to, like, weirdly close to. And, um, when you're an executive within a smaller company, the people with whom you work, their problems become your problems. And so it's kind of, that family dynamic is almost closer.
And when you're with a larger organization, you only scratch the surface of the problems that you deal with. You get a little bit ADD, you're dealing with so much, but you're only scratching the surface. And you have a larger team and you're working with a lot more people, but the drama isn't quite as exhaustive.
So I would say, you know, weirdly, when you're speaking about small companies versus large companies, I'd say the real difference is the extent to which you'll kind of engage with the humans, in a lot of ways. Um, and when I'm with a startup, I love it, but I also get kind of overwhelmed by the emotion. Those of you who've worked with me, you know I kind of lead from the front.
That's kind of my thing. I really like to get my hands dirty. And, um, it's a pretty emotional experience for me. Um, and then when you work with a bigger company, you kind of miss that. So it's like you're never really in the sweet spot. But, um, I've learned to navigate between the two.
Elliot: So I think a philosophical perspective is probably how we'll take the majority of this conversation. Um, I think, obviously, at the end of the day, you know, preventing attacks, securing organizations, securing the brand, caring for the people and technology, you know, that's the simple answer, the simple thread between, you know, a five-person company and a 90,000-person company.
But, um, I'm wondering if you sort of have like a, a systematic approach that you are able to just navigate between them, regardless of organizational size. Obviously there is that one thing that you might do in a small org, but, um, is it, you know, a focal point of prevention? Is that like the key player that you try to bring in?
Or, you know, is it even just starting with like bringing the value of cybersecurity to the organization? Um, you know, I'm wondering if there's maybe a thread that you kind of start around like cybersecurity at a leadership level.
Kris Lovejoy: Yeah. And so, you know, it's funny, because I kind of abstract myself as a leader from myself as a cyber practitioner, in a way. But the two things come together. So, um, one of the things, you know, as a leader, let me just say, whatever, you know, sort of business that I'm focusing on, on any given day.
The one common thing that I always come back to is this sort of point of empathy. You know, at the end of the day, we work with people. And, you know, whether you are in cybersecurity or any other industry, it's all about relating to people, understanding what people care about, helping people understand why they should care about something.
And then, as a leader, having them follow you up the hill, even though it may not be something that they wanna do. So leadership is about that. It's about sort of creating that empathy and getting them to move with you, rowing in the same direction, regardless of whether you're in a big company or in a small company.
Now, cybersecurity, I think, is one of those areas where the ability to be a true leader, in that kind of dynamic of creating empathy around an issue that people feel a lot of anxiety about but don't necessarily understand, that is something that I've kind of learned to practice, and it has made me, in a way, a much better leader.
Because, you know, cyber is one of those, you know, areas where there's just, you know, it was interesting. I was talking to, um, one of the senators who's on the Homeland Security Committee, and he was talking about the fact that cybersecurity creates anxiety for him, but he just doesn't understand it.
Right. He just doesn't get it. And it just struck me that it is so true that helping people understand cyber, identifying and implementing controls, being part of the army to do that, the leadership of those people, the leadership of those businesses, that's all kind of one and the same.
So hopefully that makes sense.
Neal: I, I think it's a good approach to the construct, and, you know, I think there's a special place in, or there needs to be a special place in, corporations for people who can have that kind of conversation and be a little bit more empathetic towards the plight. Cuz, you know, on the tactical, technical side, we always, you know, in younger years, would make fun that if the CEO opens the spam, you know, that you're sending him for test and trial, right?
If he opens up your little test email, well, you obviously can't kick him out of the company. But if the guy three steps underneath him does, what's the repercussion? What's the actual problem that you're trying to solve, and how are you going to solution it with that person?
Because you can't tell the CEO not to open up his emails. You just have to figure out how to work around it. But the guy three steps below, you're gonna punish, probably unjustly, for the same exact mistake, and that's something you can't follow through with up and down the entire chain. And I think that lack of capability in every direction ends up causing that gap in reality.
Cuz the CEO never sees what the punishment is for doing something bad, unless it's a legit high-impact event that legitimately impacts the dollars of the company, right? So there is a disparity as you grow between the top and the bottom.
Kris Lovejoy: Yeah, and more importantly, I think, is, you know, in this particular field, and I talk about this a lot, the fact that, you know, most organizations today kind of vest in the security person all accountability for security. Right? So there is an assumption that somebody is gonna wear a magic cape, and they're going to magically, with a bunch of fairies, fix the security problem, and it ain't gonna work.
You know, and I was just having this discussion just this morning, um, with a CERT organization who, you know, were sort of saying, okay, well, you know, we're having a problem hiring people, and, you know, it's so complex. And I said, okay, stop. Let's just stop. Let's just talk about the actual problem here.
You know, for me, the problem is most businesses don't recognize that, you know, let's talk about the last few years, they bought a lot of technology for existential reasons during the Covid period. It was oftentimes individual business units that acquired that technology without an overall architecture or strategy.
Meanwhile, they did this on top of a whole bunch of legacy infrastructure that they've amassed over the years. So this is strategy-less, it is architecture-less. You've got all of this stuff that they've brought into the organization. Guess what? That complexity introduces risk. That risk has to be managed by humans.
No one can hire enough humans to manage the complexity that most of our organizations have. So, by God, simplify the infrastructure so that you don't have as many people that have to manage it so that not as many people are gonna make mistakes. Because guess what? Technology plus people equals you're gonna make mistakes.
It's, you know, it's just statistical probability. So, you know, I keep telling people, I get the need to build and want more tools and to invest in more really cool capability and to wanna hire more people. It's not gonna work. It's not gonna work. There isn't enough money out there in the world for us to spend on security and get to Nirvana.
And so we've gotta think about this in a profoundly different way. And again, going back to kind of that question of how do you navigate small or large organizations, at the end of the day, security is about people. It's about people that make mistakes, and to fix the security problem, we have to figure out how to address that. And again, tools are not necessarily the answer. Sometimes it's as simple as being empathetic to how users engage with your technology and simplifying that experience, and simplifying the underlying infrastructure so that the people that are managing it can do so in a more, uh, sort of simple and effective way.
I know that sounds easy. I get it, it's hard, but it's just looking at it from a different perspective.
Neal: I think that's wonderful, actually. Cause, I mean, this will transition us into the zero trust portion of this chat here in a few seconds, but I think the perspective you bring, it's awesome. It's something that, ironically, not poetically rather, um, in a couple webinars I've been doing for my current company, nine-to-five, you know, we see very similar things, where people have, obviously, like you said, bought a lot of things over the Covid years.
That they probably shouldn't have bought, or that they thought they should buy, but they don't know where it actually lies. But then they've got all this legacy, they've got all this new, they don't know how that works. So, to your point, people are now attempting to throw actual people at the problem. And the one thing I remember the most about going through my project management professional certification is you don't just start throwing people at the same problem.
You gotta actually figure out how to solution the problem before getting more people in the room. And the last piece of that that I love, yeah, the last thing I like, you know, there's a quote that I'm not gonna be able to remember from scratch without it in front of me, but there's a gentleman, J.C.R. Licklider, from the sixties, who wrote this wonderful paper about man-computer symbiosis and how, at some point in time, we're gonna have to find a way to work together.
And only through truly understanding how automation, and in his case orchestration as well, can come together with the human can we actually realize the closest thing to, to your statement, Nirvana. In anything that we do, we're never gonna get there, but we have ways to make it better. And to kind of transition us into the fun part here, I think kind of the impetus behind this, with the zero trust mentality, is, you know, we should be able to allow the CEO, and three stages down, to have the same, you know, flows that they need day to day.
And the downside is, as security people, we have to figure out how to set that up. But I think with zero trust as a mindset, you know, if we set up the CEO the right way, when he clicks on that email, nothing should happen at the end of the day. Right? And same thing with the guy who's down three layers deep that clicks on a very similar email, and it actually is malicious.
Nothing should happen if we set up the right trust environments around a zero trust mentality. So anyway, I love your approach to the facts here and the way it goes at it. Um, so in that vein, kind of thinking about that from a hierarchical perspective and the zero trust mentality, you know, being able to build into legacy and current infrastructure, and maybe consolidate, where now so many things talk to so many things, right? And I think that's kind of the impetus with zero trust, one of the tenets is things should only talk to the things they're supposed to. And if they're talking to something they're not supposed to, why? So, I mean, from your perspective at this scale, you know, and you mentioned critical infrastructure, maybe we'll go down that one too, cuz I love OT/IT relationships.
But from your perspective, how, at this size of a company, um, how do you think that breaks down, to kind of start applying that human interface loop along with the actual security arm piece, and get people to understand we gotta fix people, but we also need to make sure the right tech stack is actually starting to be built around all that?
Kris Lovejoy: Yeah. And I think there's a couple of things here. You're picking on a, um, a big topic, and what I would say is, you know, please take everything I'm saying through the filter of different patterns applied to corporations of different sizes, with different cultures, different risk tolerances, et cetera.
So there is no perfect answer for any given organization. Um, so I would say that flat out. The second thing I would say is, and you didn't ask me this question, but I'm gonna point out something that is fundamentally impacting our ability to protect critical infrastructure. And not all critical infrastructure, but a lot of infrastructure like water utilities, other, um, quasi-public or even private organizations that are highly regulated.
What people don't understand is the economics of how organizations, let's take like a water facility, how they build their plans, their capital plans, in two- or three-year, um, cycles, bring those plans to the government, get them approved, and then can't really change within that window. That is a real problem.
And so what I would say is, for a lot of those institutions that use OT and IT, um, and are seeing this convergence, we got a problem. And the problem isn't the security; the problem is how we fund the security. They are just simply not agile enough. And until the government, um, takes a look at sort of the funding process for capital associated with cybersecurity within these rate cases, we're not gonna fix this.
But, uh, let's park that for another day, because I can get on my high horse on that subject. Let's just talk about, um, zero trust. I love, you know, what you just said, Neal, about the CEO double-clicks on something and nothing happens. That's how I describe zero trust.
Quite honestly, it kind of encapsulates it, right? Because most of the C-suite, they don't understand what this means. Right? They don't get zero trust. And I hear, either the people that get it really glom onto it and then they don't understand how hard it is and what it means from a technology perspective.
Or they say, hmm, zero trust, that sounds like a really bad thing where I'm not gonna trust my employees, and I trust my employees. Right? So you get that sort of dynamic. Anyway, I liked your description. Um, what I'd say with zero trust implementation is it is still a mindset. I could probably count on one hand successful zero trust implementations. Um, and I would say that I have not seen any large enterprise introducing zero trust successfully across the entire organization. Where I am seeing it successful is where the customers recognize one fundamental truth: that moving a legacy infrastructure to cloud does not equate to a modernization or transformation program. Now, I say that because a lot of successful zero trust implementations have been implemented within a cloud infrastructure. And the cloud infrastructure is largely a fairly uniform infrastructure. Hybrid cloud obviously makes things harder, as I mentioned before, where you don't refactor an application to be cloud native.
It makes it impossible. So, again, where I find, um, organizations successful is where they have implemented zero trust within the context of a modernization, a digital modernization program, where they understand they've got the bill of materials for that service, they've implemented the controls within that context, and they've created an environment in which, you know, zero trust can be recognized.
Hybrid legacy, lots of on-prem legacy. That makes it really hard. And the organizations that I see tackling that even somewhat effectively are the ones that are starting with kind of that minimum viable business service mindset where they're saying, I, I care about the service, I'm gonna deconstruct that, and I'm going to implement zero trust in and around the business service as opposed to trying to tackle my entire infrastructure.
Neal: Yeah, I think that's, I mean, well said. It is a weird thing. So I'm on the vendor side of the house now. I used to be a more full-fledged practitioner across a lot of different things, and we get similar questions around merging legacy on-prem with new cloud mentality stuff.
And I definitely agree that if you go through the cloud providers, the nicety of it is, pick your flavor of cloud, but once you're there, it's the same structural base. So you know what should be talking to what, and you should have the same ability to, not necessarily copy-paste, but have a good idea of how to copy-paste your security practices and principles across that environment.
And to your point, the moment you throw in something from your enterprise server farm, it might as well kind of get thrown out the door, uh, to some lesser extent. Uh, from y'all's perspective, you know, addressing on-prem pieces and that growth phase, is the mentality there, hey, it really is kind of a good time and place to have a legit modernization towards cloud infrastructure, for the sake of maybe cost, but also maybe for the sake of security? Or, you know, are you able to kind of coach both sides of it and say, okay, on-prem, but we can make that more modern, as well as approach it with that hybrid?
Kris Lovejoy: You know, it's interesting. A big trend I'm seeing is a lot of organizations are actually undergoing modernization programs not simply because they're trying to improve whatever, get to that business outcome, reduce cost, that's always a, you know, major driver, but it's really the security and resiliency sort of positive outcomes that they can get through that modernization program, which is actually the thing that gets the program over the precipice, if you will. Um, so, simply said, people are modernizing because there's no other way for them to achieve better overall security and resiliency. They cannot do it with the complexity that they have in place. They do want to achieve, um, kind of a zero trust mindset.
Um, and they also recognize that ain't gonna be possible with what they've got today. And so I do think, you know, for me, when I go talk to clients, you know, most of the time they're expecting that I'm gonna tell them about some new whiz-bang tool. You know, hey, are you using AI to do better threat detection? It's like, no. Are you able to patch your systems? If the answer is no, go simplify. Like, start with the basics.
But, um, yeah, that's the conversation I'm having with, you know, organizations: zero trust is not possible unless you have a good identity management system. Do you know your identities? Do you know where your assets are? Do you understand your networks? You know, it's like, come on, let's focus on the basics.
Neal: Yeah, I think that's the nail on the head right there. I think that's been a wonderfully recurrent theme for us with most people: basics, and most people missing out on that. And I think this is a Disney movie in the making, you know, like Miracle and some of the other ones where, you know, the sports team shows up, pick a flavor, they all stink, but what do they do?
The theme of every one of those successful sports team movies is they went right back to the basics, and then they were successful, right? At least for the first season. Then you find out two years later they all died in a plane crash or something. But they had a good year, uh, back to the basics.
Kris Lovejoy: side, right?
Neal: But I know, uh, they did great when they went back to the basics. Now they're a little carnivorous. Uh, but that being said, you know, I think that's kind of the wonderful echo of all this stuff. And for my, what is this, 2023? For my 22 years of doing something remotely related to security and all this other stuff, uh, I've seen the same exact questions every year, and they very rarely get answered the right way or in a timely fashion, like you mentioned.
You know, can you control your identity accesses? Can you control the people in the environment from a digital fingerprint perspective? Do you know what your assets are? We've been asking those same questions since we've plugged something in. And the solutions that try to support that are basically the same things that were there 20 years ago.
They haven't really, in my opinion, gotten a lot better at doing that job. I think the humans in the loop have gotten a little better, but the technology for doing that hasn't, per se. So yeah, I mean, I think that's wonderful. I think it's right on par with everything else. You know, step one is back to the basics.
Get to the line and do your calisthenics, and then you'll shoot a goal, hoop, score, whatever it is for your sports analogy, people out there. Uh, I'm not a sporty person.
Kris Lovejoy: Yeah, no, I tend to say I'm much more, yeah, I use the Lord of the Rings analogies most of the time, and most people are like, what? Anyway.
Neal: Trivial pursuit, Lord of the Rings champion right here.
Kris Lovejoy: Oh, there you go.
Neal: With a kindred spirit. Uh, that being said, on that vein though, you know, back to the basics. I think, you know, kind of keeping in that trend there, uh, you know, we talk about identity and access management, we talk about, uh, asset control.
So, I mean, you know, moving things to a cloud environment systemically, I think from what I've seen, kind of means you're inherently just by proxy, more aware of what it is you're, you're doing business with tooling wise because you're paying a third party to host your product. So at least you know roughly what it should be versus some random engineer in a closet setting it up.
Right. Uh, I had a question to that, but the Lord of the Rings reference totally threw me off. So we're good to go. Uh, uh, moving down the road with it, uh, thinking about, you know, some of the additional parameters there. So let's say we do get the basics out of the way. We, we figure out identity access, management control.
How important to you from, from the growth phase of this, if, if we can solve that one problem, how much do you think that impacts the larger picture as you start to build and progress? Because I see people that go way out here and forget about the, the, the digital access control pieces and they do something else like trying to just secure a server to a server and they forget about the people in the loop and what all that
means. So, I mean, kind of fixated on the,
Kris Lovejoy: Yeah, and I would say it's fundamental. In my head, at least, everything on the internet, whether it's a person, an IT asset, an OT device, any device, particularly in a 5G world, has an identity.
And so we've gotta abstract that. Yes, different assets that interact on networks have a physicality to them, and within the context of that physicality there's a certain type of behavior one would expect, a certain way in which that thing is going to interact with all other things. But these things on the internet, whether a carbon-based life form or a silicon-based life form, are becoming much more alike over time than unalike. So this concept of thinking about technology, carbon-based and silicon-based life forms, as having identities, and, I hate to say it, as assets that have to be deployed and maintained within a repository, and then thinking about the seams between those things and how they interact: I think we've gotta change our mindset.
So my prediction is that over the next few years we're gonna see that concept of identity management and asset management really come together in a significant way. And the one good thing I can say about AI is that it is forcing us to rethink silicon versus carbon-based,
and the interactions between the two. And for those of you up to speed on some of the medical device things, the wearables, et cetera: we're entering a new space, and I do think, in some ways, because the boundaries between these things are disintegrating, it's gonna make it a little bit easier for us to manage things. But we'll see.
Neal: No, that's neat. I think it's an appropriate aspect. We just had a wonderful interview about cell phone technology and some echelons of doing away with the phone number per se, and having that digital ID be you. So when I go to dial a number, I'm not dialing a number; I'm literally dialing you, from a persona perspective.
And then I think with blockchain and the growth of everything there, that digital fingerprint of who you are, and being able to call that digital fingerprint, whatever it is, an asset control mechanism or management perspective: I do think we're not too far off from that reality, loosely based.
And then we've got these behavioral analytic type security appliances out there, and, Elliot, I've probably brought this up in at least the last five interviews, but we've got these things that do the biometric fingerprints of a user, right? Trying to do away with passwords and things like that for security purposes.
At the end of the day, it's still ones and zeros; it's still a fingerprint of some sort, whether it's a password or how you type on a keyboard. The security guy who likes privacy doesn't love it, but the guy who likes the idea of security, of being more secure and controlling that fingerprint, likes being able to combine voice, fingerprints, keyboard typing patterns, and even weird things like the reflection of the light or the power my wifi is registering.
Things like that all become an enablement for who I am as a persona, to be used in my security profiling. And, echelons of complication later, someone will eventually figure out how to duplicate all that stuff. But I think that's an interesting way of doing it: my phone plus me is an identity.
Me without my phone, I can't do anything, in a roundabout world.
Kris Lovejoy: Yeah, the question you're bringing up is an important point: that contention between privacy and security. We just have not resolved it. And it's really interesting; oftentimes it's not just a cultural thing, but a generational thing.
I'm seeing that privacy, for younger Americans, is equivalent to control, right? They've been schooled, particularly by social media, that privacy is the ability to control the data I share with others. Whereas if you look at Europe, it's much more about anonymity.
It's gonna be interesting when we get into the debate about a national privacy law here, when it comes up for public debate, to see which way the chips fall: whether we become more like Europe, or stay true to that post-9/11 "security outweighs privacy" mentality. It's gonna be interesting to see what happens. That's another one of the subjects I love to prognosticate on. Who knows?
Neal: Well, if you're at RSA, I would love to pick your brain on the OT side of the house as well as this one. The bill that's pending right now for discussion in Congress obviously has a lot of ramifications for our privacy. But like you mentioned, different topic.
Kris Lovejoy: Yeah.
Neal: So, no, but it's still in that vein. It's still one of those deals where I'm a firm believer that whatever you're doing in a corporate office is not private, and it shouldn't be private, especially if it potentially impacts the security of that environment. And this could just be the fact that I worked at a couple, three three-letter agencies where nothing is private when you're there, no matter what.
They even track when you leave your desk, and other fun things. But then I think in your home life, back to that "own your data, own your access to that data, what makes you you." That, to me, is a very good tenet of where some of these zero trust implementation type companies, startups, whatever they are, are trying to look at things: multiple pieces, multiple layers to what makes my fingerprint me.
And if I don't want to give you my actual fingerprint for my digital ID, okay, well then maybe it's a voice print, maybe it's none of those, maybe it's nothing physical, maybe it's a couple other different echelons. Right? And I think for me, that's the right approach for now: you've got a factor of, like, 10 things.
If you only fill out three, the trust is down here. If you fill out four, you steadily go up the trust echelons of what can and can't be done. But yeah, it's gonna be a unique problem set for sure.
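The tiered-trust idea Neal describes, where verifying more identity signals unlocks higher trust levels, can be made concrete with a short sketch. Everything here is hypothetical for illustration: the signal names, the tier labels, and the thresholds are not from any real product.

```python
# Minimal sketch of tiered trust: the more identity signals a user has
# verified, the higher the access tier granted. All names and thresholds
# below are invented for illustration.

VERIFIABLE_SIGNALS = {
    "password", "voice_print", "typing_cadence", "device_id",
    "fingerprint", "location", "wifi_signature", "face",
    "behavioral_history", "hardware_token",
}

# Hypothetical mapping from number of verified signals to a trust tier,
# checked highest threshold first.
TIERS = [(8, "full"), (6, "elevated"), (4, "standard"), (3, "limited")]

def trust_tier(verified: set[str]) -> str:
    """Return an access tier based on how many known signals verified."""
    count = len(verified & VERIFIABLE_SIGNALS)
    for threshold, tier in TIERS:
        if count >= threshold:
            return tier
    return "denied"

print(trust_tier({"password", "voice_print", "device_id"}))          # limited
print(trust_tier({"password", "voice_print", "device_id", "face"}))  # standard
```

The point of the sketch is the shape of the policy, not the specific signals: trust is a function of how much of the user's "fingerprint" has been verified, and access expands gradually rather than being all-or-nothing.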
Kris, on the rest of this, thinking more about the implementation side before we go down the RSA rabbit hole: Elliot brought up some fun points, startup versus large-scale corporation and things like that. I think fundamentally, like you mentioned, it's a people problem, and if you've got a little more to say or wanna share about that, I love that mentality.
I find that's a hard hurdle to overcome: the trust within the people. And the last aspect of that, you mentioned loosely what happens when that person gets overwhelmed in the security stack, right? The old-school thing is, if you're the CISO, congratulations, go get a new job.
We've seen a little less of that, thankfully. But I think part of the people problem is making sure people understand that the security mechanism, as a human, is fallible, and that there's room to grow.
Kris Lovejoy: Yeah. What I'm seeing is interesting. We get regulated by pretty much every regulator that's out there, and there's been an increasing trend to focus on governance and management of cybersecurity within three lines of defense. But it's changing. It used to be that security was the first line of defense; think about security as implementing and operating the tools. The guidance now is that security needs to be in the identification of risk, the identification of controls to manage that risk, and the monitoring of those controls.
The first-line function is not actually security, or can be a little bit of security, but the first-line function is really the employees, the IT organization, et cetera. And I think that mindset makes a lot of sense: give everybody the responsibility of managing security, as opposed to just investing in that one organization. But I did wanna touch on tools, because I do think technology has a really important role to play here, and there is an amazing amount of innovation occurring within the security sphere. I see it every day. Though I do think it's a real tough job to be in a security startup right now, to be honest.
If you think about how the market has been evolving: a lot of these security startups, why do they pop up? They pop up because they wanna solve a very specific, very narrow problem. And the reason they do it that way is because that narrow problem can be paid for by an established business case.
Either a company had a crisis, they're bleeding, and it's the crisis du jour, like ransomware, and they're gonna fix that problem better than anybody else. Or alternatively, there's a new regulation out there that says you have to do something. I've seen it over the years, these pop-ups; it's like security intelligence this month.
And now, talking about RSA, I bet at RSA it's gonna be resilience. Everybody's gonna be talking about resilience. And I feel for these companies, because they solve a narrow problem, but they have to get market, and so they say they do everything. It gets to a point where I can't even tell you how many vendors are reaching out to me on any given day.
And I swear, I try to be a really nice person, but I can't talk to all of them. I just can't. I don't have enough time. And I know a lot of them have really good technology, but by God, if you don't actually have somebody who's paying you money to use that technology, and that person isn't willing to talk to me, I can't talk to you.
So I think we've got a situation in the security market. I don't know what the answer is, but there are way too many tools out there, too many good ones, and I feel terrible, but a lot of them are gonna continue to struggle because there's just not enough market to go around.
Neal: Yeah, I definitely agree with that. I think the one unfortunate thing out of COVID was that we were really close to a course correction from that perspective, toward more consolidation of concepts. And courtesy of COVID, there was a lag, 'cause I'm a victim of part of that temporary lag from a startup I was at, where we couldn't get our funding.
That being said, after about six months of COVID woes, everything kind of dipped. And then everyone went, well, is it really gonna go down? And then it just went up for almost two years straight again. And I think all these technology companies that should have been consolidated, or thrown to the curb because they didn't have a client base or couldn't build their brand,
got a bit of a buffer courtesy of the COVID growth that some of 'em probably shouldn't have gotten. But I agree. I think there's a reckoning, not just from financial closure issues, but in general. I think the reckoning is upon us a little bit here, and I think it's gonna be a good deal.
I think it needs to happen. So hopefully, yeah, we'll see.
Kris Lovejoy: It's pretty heartbreaking, though, because there are a lot of good companies with a lot of good people, with a lot of good technology, that actually solve really important problems. But unfortunately, there are only so many companies that can solve...
Neal: Yeah, that's true. So on that note: Elliot, I dunno if you had anything specific from an RSA perspective, but I feel like this is a wonderful transition. Resilience. I'm very curious to see what the token word is at RSA.
Elliot: ...the head. Obviously in years past it was zero trust plastered on everything, and I'm sure you guys unfortunately got to see that. But yeah, Kris, we'd love your opinion: do you feel like resilience is gonna be plastered on every brand? Do you feel like it's gonna be zero trust all over again?
Where do you feel like the hype cycle is really gonna ramp up this year?
Kris Lovejoy: Yeah, I still think we're gonna see a lot of zero trust, particularly with CISA coming out with their new guidance just today. You're gonna still see a lot of it. But you know what, I think it's actually good, because I think it's becoming more tangible.
People really understand it more, so that's a good thing. That's one. I think resilience is gonna be a big one, for a number of different reasons, and I'm not unhappy that we're talking about cyber resilience. I just wanna make sure we're talking about it the right way,
and not having everybody just slap the word on their banner again. But DORA, some of the governance requirements, even the stuff coming out of the SEC for boards of directors, worrying about cyber risk, not just security, cyber risk broadly, anything that can happen to impact your digitally enabled services: that's a good thing.
So I think resilience is gonna be a big code word. I also think AI; you're gonna begin to see a lot more of that. And I also think we're gonna begin to see, and I wish I could put a stake in the heart of this one, quantum resistance. I don't think anybody knows what it means, but I think we're gonna begin to see it pop up on people's shingles.
Neal: I saw two quantum booths at Gartner SRM last year. And it's always intriguing when Gartner brings something into their conference, whatever weird nomenclature it is, 'cause then you know they've obviously got something in their quadrants built out for it, whether they published it or not.
So yeah, that'll be a fun one for sure. But do you think resilience... I think we've been around a decent enough time to see that things we used to do 20 years ago are now what we're doing again, and we're just rebranding some of it, labels-wise, nomenclature changes.
Do you think resilience, and the framework people are building, is just a new way of getting back to a compliance-driven mentality at any level? Like that old early-2000s, nineties compliance mentality?
Kris Lovejoy: No, I'm actually a little bit more hopeful. One of the problems with security, and it's kind of our own fault in a lot of ways, is that we made our field so specialized and so narrow that it became really hard for business leaders to understand and get behind, right?
When I talk to business leaders, they don't know what they're spending money on, and they don't know what they're getting out of it. It's just a problem. And at the end of the day, the management of security is about managing risk. It's being able to identify that risk, to understand how it can impact you, to implement the controls to manage that risk, to monitor, measure, and report, and then to recover when things go wrong.
It's a risk management process. Resilience is the first time I'm seeing coherent business-line discussions about bad stuff that can happen to IT: cybersecurity attacks can be realized, and something bad like ransomware can impact my organization. Business leaders are not thinking about resilience within the context of security or privacy; they're thinking about it as "bad stuff is gonna happen, and this is the process of managing it." Now, Neal, the danger, to your point on compliance, is that if you look at the national security strategy, some of the technical guidance we believe is gonna be coming out of DORA and NIS2 out of Europe, and what we're seeing with India's requirements, they're very prescriptive. They're prescriptive because the regulators feel nobody's paying attention to the risk management process, right? So I'm not sure I would say resilience is driving the compliance orientation; I think that would've happened anyway. I would position it as: resilience may add a layer of pragmatism around compliance, because compliance was coming regardless. Resilience just becomes a way we can increase the aperture for business people to think more broadly. That's number one. And two, I'm hoping the first technical guidance that comes out of DORA is really gonna be around a cyber risk management process, as opposed to specific technical requirements.
If that's the case, it won't be out until June. But if that is the case, then I think it will actually be a really good thing.
Neal: Nice. No, Kris, that's good to hear, 'cause like I said, I see all these things, and there's a group out there for business resilience, BRC-something; I'm not gonna say their name the right way 'cause I haven't looked at them in a while. But no, that's good to hear.
I'm glad for that take, and I appreciate the insights, 'cause it's one I haven't really driven down personally, hardcore. I am obviously concerned that it loops all the way back around to where we just get back to that compliance mentality. But that's smart, though, 'cause you're talking about actually having a legit conversation at a leadership level about what these technical aspects can be and what's impacting you more holistically.
And as an intel analyst personally, that's the kind of conversation I've always been trying to have when I've been in an enterprise. I've always been trying to sit down with the C-suite to figure out what they think their actual financial risks are, the risks in general, map those out to technical use cases, and have a bigger discussion with the teams as a whole.
So the CFO, pick a C-person: they all have their own things they think are risks, right? And I'm trying to figure out, not just from a SOC perspective but across the entire infrastructure, how digital and physical, real-world things can impact them from an intel persona's perspective, and map all that out with a resiliency-type mentality.
Kris Lovejoy: Yep. And what I worry about, though, is nationalism. Talking about compliance, I think one of the biggest issues we've got right now is this incredible balkanization of privacy requirements, data localization, digital sovereignty, AI ethics, you name it. It is a mess. And each one of them is becoming more and more prescriptive in nature. So for most larger corporations trying to deal with this, it's a nightmare. A total nightmare.
Neal: I knew a guy who's... no, I'm just kidding. I know. And to your point, that's gonna be a wonderful world of new startups trying to solve that same problem once again, or transitioning into it, right? It'll be weird to see what comes of it, but it's nice to know the conversations are finally going in what you think to be the right direction.
So that's good to know. It is a legit worry of mine. On that note, RSA: once again throwing this back out there, trends and analysis, resilience. I'm gonna go hunt down all the quantum booths, because I'm intrinsically curious what they claim they're doing or what they're trying to mitigate.
Especially now that China claims they can break, what was it, RSA at 2048-bit? Supposedly, though it's still financially not feasible.
Yeah, it'll be fun to see all that stuff.
Elliot, what else you got for us, sir?
Elliot: So obviously we've got RSA, we've got top-down at the organizational level, so we're definitely covered on the philosophical angle. You did touch a little bit on AI; I was wondering if we could pivot back in that direction, only 'cause I know that's a topic for Neal.
I'd say in probably the last month, we've seen a couple of headlines come out, and we do typically avoid headlines, but basically organizations are accidentally dropping proprietary code into things like ChatGPT. So since we're on that compliance-and-regulations conversation: do you feel like AI is a little too far ahead of where it should be today? And as an organization of your size, are you paying attention to that and putting in any kind of guardrails to prevent users from accidentally dropping in stuff when they don't really know where that data goes?
Kris Lovejoy: Yeah, we're explicitly prohibiting the use of ChatGPT, et cetera, within Kyndryl for commercial purposes. What we've done is establish a research team that's looking at the various use cases in which we would want to adapt and apply the technology, and this is specific to generative AI,
not all forms of AI, but generative AI. So we've restricted that use, we're beginning to look at the use cases, and we're exploring how our customers are gonna begin to use it. My last company, BluVector, was a kind of AI-based threat detection tool that we had built.
And just a quick vignette on the subject. The way it worked: we had access to a corpus of malware from the threat intel community, and we took it through supervised machine learning with reinforcement. So theoretically generative, if you took the guardrails off.
We trained the model on what malware looks like, then we got access to the golden images from most of the major vendors and put those into the system. So you had a good probability model where you could predict good or bad software based on the model we trained. Now, the problem was that oftentimes legitimate, name-brand vendors would offer us golden images of code that we thought were written really badly. Is it good software or is it bad software? Right? So what I would say about AI is: AI instantiates the values of the creator. And we as humans are not necessarily good at understanding and evaluating the veracity of the data, particularly today; we're seeing it again and again with the veracity of the corpus of data streaming at us. AI, I think, is incredibly valuable for the potential it offers.
But I think generative AI with no rules, no guardrails, is problematic in today's world. Going back to it representing the values of the creator: how do you teach people what those values are? How do you teach them to interpret the outcomes based on the values of the individuals?
Because again, it was individuals who selected the corpus of data on which to train that model. I just don't understand it. So I do think some level of scrutiny has to be applied before these kinds of technologies are available for mass-scale use, particularly any technologies that can potentially drive disinformation and misinformation.
It scares me to death.
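As an aside, the supervised "malware corpus versus golden images" approach Kris describes from her BluVector days can be sketched in miniature. This is a toy nearest-centroid scorer over made-up feature vectors, not BluVector's actual pipeline; note that the output is a probability-like score, so a badly written but legitimate binary can still score high, which is exactly the ambiguity she points out.

```python
# Toy sketch of the supervised approach: learn from known-bad samples and
# known-good "golden images", then score unknown files. Features, centers,
# and the nearest-centroid model are invented stand-ins for illustration.

import math
import random

random.seed(0)

def sample(center, n=200, dim=4):
    """Stand-in feature vectors (think byte entropy, import counts, etc.)."""
    return [[random.gauss(center, 1.0) for _ in range(dim)] for _ in range(n)]

malware = sample(2.0)   # corpus of known-bad samples from threat intel
golden  = sample(-2.0)  # vendor "golden image" (known-good) binaries

def centroid(rows):
    dim = len(rows[0])
    return [sum(r[i] for r in rows) / len(rows) for i in range(dim)]

bad_center, good_center = centroid(malware), centroid(golden)

def p_malicious(features):
    """Probability-like score: closer to the malware centroid scores higher."""
    d_bad = math.dist(features, bad_center)
    d_good = math.dist(features, good_center)
    return math.exp(-d_bad) / (math.exp(-d_bad) + math.exp(-d_good))

# An unknown file whose features resemble the malware corpus scores near 1.
print(round(p_malicious([2.1, 1.9, 2.0, 2.2]), 2))
```

The model only says "this looks more like the bad pile than the good pile"; it encodes whatever the curators decided was bad or good, which is the "AI instantiates the values of the creator" point in code form.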
Elliot: I absolutely love your take, and fortunately you answered it the way I was hoping: separating the AI that consumers know, like ChatGPT, DALL-E, and those other things they're all getting to play with as a new toy. Realistically, AI has had proper models for years; financial institutions use AI-powered models that project a couple of months out. I was just chatting with another organization, a very large auditing group, about how AI's been around forever, but you wouldn't use it to model out years in advance. And that's the difference at play here. You nailed it: organizations need to be really smart about what they're using and how they're employing it, but there are also the consumer toys people are getting to play with. And it's that lovely divide, where you're building a research org to be proactive and figure out how to best use it before you open the floodgates. I think that'll help reduce risk,
preventing any kind of issues that could stem from it.
Kris Lovejoy: Yeah, but I think it's gonna be hard, because I do think the technology is out ahead of the humans right now. I'm not worried about, you know, AI taking over the world, but I do worry about disinformation and misinformation.
That scares me.
Neal: I will say, I've gotten ChatGPT to make me ransomware. So, out of...
Kris Lovejoy: There you go.
Neal: ...curiosity. You know, it's got protected words, right? You can't literally go in there and tell it, hey, make me ransomware; that's a block in its protected algorithm. But what you can do is tell it to make you a wonderful piece of software that's able to encrypt things with SHA-pick-a-number, as well as build a server for you to host this particular file on and access remotely, and some other things. I'm giving people ideas, but I think it's important for people to know.
I agree, it's way out ahead of where we need to be from a security perspective. And it's fun, but we've got a little ways to go before we can keep up with it security-wise. So
now everybody's gonna go out there and make their own ransomware
package, but you're...
Kris Lovejoy: yeah,
Elliot: Thanks, Neal.
Neal: I'm not liable for...
Elliot: There you go.
Neal: ...what you do with ChatGPT. No. So on that note, I know we're kind of up on time, but...
Oh, go ahead.
Kris Lovejoy: No, no, I was just gonna say: going back to where we started, about empathy and this being a people thing. I've been saying for a while that security has gotten to a point where the technology is important, but the psychology and the ethics are almost equally important.
And so we, as cyber leaders, have to learn to navigate that world of humans and feelings and intent and purpose and mission and values and all that stuff, just as we're learning to navigate the technology, because the two go hand in hand.
Neal: That is true. I think of the Certified Ethical Hacker, when it was the first one to the mark. I'm not gonna call it not worthwhile; it's still worthwhile, sir. That was the whole impetus behind half the class: to teach you, or at least allow you to understand, the moral implications behind your actions, both legally and generally morally speaking.
And I think there's a generational gap: people who grew up with a computer versus people who found a computer in a dumpster and decided to tinker with it. Eighties and nineties computer people grew up breaking things on purpose and hoping to tell people how to fix them.
The late-2000s-and-on kids grew up with it just implicitly there, and if they broke it, they found a way to make money illicitly. There's a moral compass that's a little different in the intent. And there are still some classes out there, like at UTSA, to name-drop a little, specifically on the moral and legal implications of your actions in cyber.
And I think that's massively important, so you don't do what I did and then use it. But, you know, I didn't send it to anybody. It was a test; I wanted to see if I could do it. I wanna be abundantly clear: nobody's getting...
Kris Lovejoy: Oh dear. No, you know what? It's important. It's...
Elliot: For listeners, that's not true, 'cause Neal's...
Kris Lovejoy: Well, you know what? I do applaud testing the boundaries and understanding. Again, going back to it, this is risk management 101, right? You have to understand what can possibly happen and who can perform those actions.
Neal, if you come work for Kyndryl, we know we're gonna be locking down your laptop.
Elliot: Kris, thank you so much for joining us and really digging into this from a philosophical angle. Neal, as always, I will set up a focal point and we never lean too far into it. But we were still able to grab a lot of your experience and expertise, and I think that's gonna be really valuable for our listeners, who are primarily cybersecurity practitioners and people generally interested in the space.
So being able to understand and hear from a leader like yourself, who understands the importance of empathy and the people perspective, is so critical. I'm just really glad we're able to share that with the audience we have.
Kris Lovejoy: Thank you, thank you, guys. I really...
Neal: Thank you again, Kris, for jumping in with us today.
Kris Lovejoy: All righty. And I'll see you at RSA.
Neal: Thank you.