Martin Hinton (00:02) All right then, welcome to the Cyber Insurance News and Information Podcast. I'm your host, Martin Hinton, and the executive editor of Cyber Insurance News. Today we're going to be talking about incident response. Most companies have a plan, or they should. Binders, workflows, call trees, a cyber insurance policy, outside counsel, a CISO who knows how to buy Bitcoin. Then the breach happens, and they realize nobody printed the cyber breach response plan. That's the world our guest today, Matt Mosley, lives in. He's an incident response manager at Sygnia. Matt, thanks so much for joining us. How's your day been so far? Matt Mosley (00:40) Doing great. Thanks for having me. I'm really excited to do this. Martin Hinton (00:44) So Matt, let's start with something simple. When you sit down with a company and they tell you they're prepared, and they talk through what their preparations are, what are you listening for to gauge whether they're right about that or whether they're not? Matt Mosley (00:59) Yeah, I think the biggest one is obviously an IR plan. I mean, I think we're always looking for that ahead of time. The truth is, though, when a crisis does happen, nobody's going to that file and pulling it out and reading it, right? So it's really about talking to them and understanding whether they have the muscle memory that goes along with an incident, especially in a time that's stressful for them: knowing who to bring into the room, who to talk to, and who to involve during an incident. Those are the key things I want to hear when I'm talking to them. I just want to hear what their communication plan is internally and who they're going to involve. Martin Hinton (01:32) When you speak to them, is false confidence something you've learned to hear? What are the telltales of overconfidence, or hubris, as I like to call it? Matt Mosley (01:43) Definitely.
So I would say in past years, the overconfidence, the "we're prepared, we're ready to go," was definitely the norm for most security leaders I was talking to. But now the conversation has definitely shifted. A lot of them worry. From our survey, it was something like 73% who said they weren't ready, even though they had the tools and the experts. And that really just comes down to their lack of confidence about who they need to call when an incident happens. Martin Hinton (02:14) Is that often just down to not having experienced one and gone through it? Matt Mosley (02:19) Absolutely, right? And so that's one thing we always recommend to these leaders, especially on an annual basis: go through and do simulations, do tabletop exercises, and practice going through these different crisis events, because a ransomware event is completely different from a business email compromise event, right? Different people need to be involved. And I think going through and practicing that definitely alleviates some of the stress, and also some of the worry and that lack of confidence they have going into an actual incident. Martin Hinton (02:53) Interesting, interesting. So it may seem obvious to the audience at this point, but tell me a little bit about Sygnia and what the company does. Matt Mosley (03:01) Yeah, so Sygnia is worldwide. We have about 200 different consultants across four different services. We have our enterprise security team, which can go through and do posture assessments and proactive work. We have our incident response team, which is kind of our bread and butter, and we're spread throughout the world. "Chase the sun" is the model we follow, so if an incident's going, we're able to respond 24/7. We do retainers, et cetera.
We have an advanced monitoring team, which can come in during an incident and help monitor the situation, make sure everything's good to go, or detect things early on. And then lastly, we have our managed detection and response service, where we go through and manage, detect, and respond to an organization's security needs. Martin Hinton (03:46) So when you arrive at an incident, is it a physical response? Do you go to these companies to be there in person with them, or do you do it more remotely? And when you get there, what's already happened? One of the things we know about, or hear about, is the speed with which things can go sideways, and also the delay before a breach is even realized. So what's happened inside a company? Is there a typical reality? Matt Mosley (04:13) Yeah, a lot of our response, I would say, is remote. We don't need to be boots on the ground. But when we are boots on the ground, it's usually for a crisis-level event, something like ransomware, where we need resources to go through and pull hard drives and do imaging and all that good stuff, things you can't do remotely because the drive is encrypted or whatever the case may be. But I can say that at that point, stress levels are high, right? We're usually called in after the incident's been detected, after the team has tried to eradicate and contain the situation, and now they're kind of waving the white flag and saying, hey, we need help. And it's usually on a Friday, right? Eight o'clock, the end of the day. And then the next thing I know, 24 hours later, I'm on site in Nigeria or somewhere, handling the incident from the ground. Martin Hinton (05:02) It may be something we touch on later, but that point, "it's usually on a Friday," that's not an accident. We get targeted for these sorts of attacks at times of diminished attention, when you're desperate to get away for the weekend or a holiday or something like that.
Is that something that's part of the incident response planning and practice you just touched on, or does it not really factor in that much? Matt Mosley (05:27) It's definitely part of our world, because we live and breathe incident response. So we are prepared for that here. You know, we do on-call schedules and all that. I don't think the average enterprise is ready for that type of situation. And I do think a lot of the reason we get those calls on Fridays (it's a question I've been asking for years that I don't really have the solid research or answer for) is that Friday is the end of the week. People are ready for their weekends, or they've got plans, and they've wrestled with it all through the week, and they realize they don't have the capabilities or the resources to handle it themselves. So they kind of wave the white flag and have us come in. An interesting tidbit, too, is that for our counterparts in Israel, the weekend is Friday and Saturday, so all of their calls come in on Thursdays. Martin Hinton (06:17) Really? You know what? It's funny, because we've done a bit of reporting. The one I remember specifically was out of the Asia-Pacific region, and it was about how there's a spike in attacks during the Lunar New Year celebration, right? Because you've got people away and traveling, the CFO is on a plane, and that instability of leadership, if you will, created by all those sorts of things is a vulnerability gap, to use the vernacular of the game. When you mention pulling hard drives and that sort of thing: when you arrive at an incident, whether virtual or in person, what's the blend of technical problems, decision-making problems, people problems? How much of each represents the need for division of labor on your part? Matt Mosley (07:02) Yeah, I probably don't have a definitive number.
But I can definitely say it's all of the above. Sometimes it just depends on the incident, whether it's more technical or not. But overall, I think it's always a business issue. It's never just a security or technical issue. Especially with ransomware, you're hitting the top four: you're hitting shutdowns, you're hitting reputational damage to the brand, you're hitting data loss, and then you have your potential extortion, you know, the ransom negotiation type of thing going on. That is a ton of people resources, right? That's communicating with multiple stakeholders, with multiple experts and resources across the enterprise: executive leadership, legal, communications, and your technical teams. So it's definitely a little bit of all of it, but it's never just a cyber or technical issue. It's always a business issue, for sure. Martin Hinton (07:56) Yeah. I mean, one of the podcasts I did a while ago now was with a consultant, and it was about the adaptive change that corporations need to embrace with regard to the role of a CISO and the concerns they bring to the table, and the reframing of everything they want as a cost center as opposed to protection for business continuity, that sort of thing. So the human element of it: everyone assumes this is all technical, but when it hits the fan, if you will, it's very much about your capacity to deal with uncertainty and stress and that sort of thing. And I wonder whether there's any more you might say about that, from the point of view of what companies misunderstand most about the role of an outside incident response team. Do they think you're going to be able to throw a few switches and remap a few hard drives, or do they understand, as you just touched on, the sort of holistic, comprehensive, 360-degree nature of it? That there's a need to give someone a hug, and then also maybe figure out how to get Bitcoin to the right person.
Matt Mosley (08:56) Yeah, before I answer that, I'll talk a little bit about the tabletop exercises that we do. One of my counterparts will actually have someone assigned as the chief care officer. That is a role that takes care of people's health during these types of events, right? Making sure everyone's fed, making sure everyone's getting a break and getting water. So that's definitely something that's starting to emerge in our incident response. But from the perspective of the CISO, I think their role is a little unfair, and I think that's changing. I think the enterprises that do it right are the ones that give the CISO a little bit more decision-making ability, rather than relying on them to be the leader of the incident while having no executive decision-making authority, right? All the decisions happen around them, and then their job is to really try and coordinate, or corral the cats, during a crisis-level event. Martin Hinton (09:50) Yeah. So I'm a journalist, which makes me an expert in nothing, unless you've had a drink or two at a cocktail party and I sound like I know what I'm talking about. I say that because I've been at this work a couple of years now, and the thing I've done to help myself understand it is use analogies. The CISO, I've always joked, is like the new kid at a new school, right? They're kind of cool, but you're not sure, so you keep them at arm's length. But then if they're really good at basketball, you really want them on the team. And there's this weird sort of human dynamic, again, back to the human element, about integrating someone new into that C-suite environment, which sounds easy. It should be something people do with some regularity, but human beings being what they are, as you know, hiccups occur. One of the things we had shared in advance of this was some reporting we'd done on the survey you had recently.
And I'm just going to rattle off some of the findings from that. 73% of senior security decision makers say their organization would be fully ready if a significant cyber attack happened tomorrow. 99% have formal IR plans, incident response plans. 90% expect difficulty coordinating key stakeholders. 89% cite limited executive or board involvement. And 75% say legal and communication issues slow decision making. So why, and I know it sounds silly, but why do you think there's that disconnect between the perception of readiness and some of these numbers, which are almost absolute, right? Everyone has got a problem. I mean, on the idea that legal and communication issues slow decision-making: I was doing a podcast a couple of months ago, and one of the incident response stories shared was that the incident happened, they couldn't access the IR plan, and it took them two weeks to gather all the phone numbers of the people they needed to connect with in order for things to happen. It sounds crazy, right? How can anything take two weeks in this day and age? But you tell me, what's going on with these numbers? Matt Mosley (11:54) Yeah, exactly. And I can tell you that I've been on incidents in every single one of those types of situations. I mean, firsthand, the legal piece, right? Legal and comms. We were called into an incident, and this was at a previous organization, but it took over a week to get the contract aligned. Meanwhile, the threat actor was living in the environment, doing all kinds of damage, and we're just sitting there waiting so that we can respond. So just that delay, and not being prepared, did significant damage to that organization. There was a recent incident where we had communications involved. I had gone through and set it up: this is what we need to say, this is how we need to approach it. And I gave them a draft. And this is something they could have prepared ahead of time.
A lot of this stuff can be preset, and you can go through and do Mad Libs: you can insert whatever the case may be, and go through and edit it and massage it. But in this case, I gave it to them, and it was three or four days before they released a statement. So those are things that happen that can affect the reputation of the brand. They can affect the operation and the revenue of the company. Because right now the default, instead of working to respond, is really to shut down and contain, and then respond. And when you're doing that shut-down-and-respond, if it takes you two weeks, like in your example, you've shut down for two weeks, waiting for someone to come in and fix the problem. How much revenue has gone by the wayside while you're trying to figure out who to call and who to involve? So a lot of that is preparation. I will say that an IR plan is nice to have, and we definitely recommend it; I'm not discounting it at all. But in a time of crisis, nobody's going to that document and saying, okay, step one, what are we doing? Right? It takes someone who can jump in and, from muscle memory, say: okay, I need someone from IT, I need someone from network ops, I need PR, I need legal, and I need someone from executive leadership here, so that we can all get on the same page and start aligning on the plan to get things done. Martin Hinton (14:01) So you sort of touched on it, because my next question was what a bad IR plan leaves out. And it sounds like practice is the big thing. You used the phrase muscle memory earlier. And I think we know from so many other environments we exist in, whether it's team sports as a child or, later on, the military: you practice how you plan to play. And while that is the mindset, you also know that no plan survives first contact with an adversary, right?
You can have all the planning in the world; the future has its own plans, and so does the person or people you're going up against. Are those some of the things that need to be more ingrained in the mindset of, broadly, corporate structure and the approach to these kinds of threats? Matt Mosley (14:33) Absolutely. Yeah, definitely practicing the plan and going through those simulations. You know, I don't think it's a silver bullet, and obviously there are going to be mistakes made; nobody's perfect. But I think going through and practicing that, seeing what actually happens, seeing what resources you actually need and who you need to talk to, plays a huge part in how successful an incident response will be. I think, you know, whenever I did tabletops in the past, I would take people out of the exercise. If I saw someone kind of controlling the conversation a lot, right, that one person in the room who was managing everything, I would say, okay, they're not available now. Who's the backup? Right? And that really puts a wrench in their gears, because then people are like, wait, that person knew everything, right? That's the tribal knowledge. And now who do we rely on if that person's stuck in traffic when an incident happens? Or they have an emergency at school and have to go get their kid? Who are you going to rely on? And so there's always one Bob in the group that you wish you had four Bobs of, right? He's the one taking all the tasks. He's the one fixing things and doing configurations. And he's got the switching costs of all those tasks. So he's the one person who's kind of delaying things, right? Because everything's being funneled through him. So yeah, definitely go through those simulations, figure out what the gaps are, and work to fix them before an actual incident happens. Martin Hinton (16:11) The gaps you refer to, are they technical, organizational? Do they fall into any particular bucket more frequently than another?
Matt Mosley (16:22) Yeah, I'd say in recent years, or months, it's more on the people side versus the technical side. I think we've got all the tools, we've got all the logs, usually in a pretty good space. I won't say we always have the logs, because I've shown up to incidents where there are no logs. But what it really is, is the people; it's the coordination and the communication. And I think once you've got that locked in, or at least in a pretty good state, then an enterprise will be a lot more confident in its ability to respond to an incident, and also more successful at the end of the incident. Martin Hinton (16:57) You've mentioned some of what I suspect would be the answer to this question, but what's the cost of explaining the plan to the leadership at a company during an incident, as opposed to having had the chance to do it beforehand? I don't mean a specific number, but what kinds of things? The obvious reality is you're playing catch-up in that situation, where you're trying to explain to them why the Titanic hit the iceberg as opposed to where the nearest lifeboat is. And I wonder whether you could frame that based on your experience a little bit. Matt Mosley (17:31) Yeah, a couple of examples of that. You know, we had one where they didn't have any kind of legal representation. So they spent a couple of days reaching out to outside counsel, reaching out to their cyber insurance, waiting for a response. And that's just delaying the negotiation time. It's delaying our ability to remediate and close out the incident. And so every bit of time that's wasted is just more time and more revenue being lost, especially if the company is at a standstill. Martin Hinton (18:04) I mean, we've touched on this a little bit. You know, I use the word hubris; there's this denial, and it all helps create, or cloak a company in, a sort of myth of control.
You know, I have significant exposure to C-suite executives in my professional career and my personal life. There's a value to having them be a little bit arrogant, right? You want them to have this sort of master-of-the-situation mentality about themselves, that confidence. But I wonder, where do you see that hubris show up before an incident? If you have the opportunity to work with a company prior to something happening, do you look for ways to sort of curry favor, or learn how to deal with people? What about that? Matt Mosley (18:48) Yeah, I think, especially in my role as an incident response manager, one of my primary functions is to really connect with the leadership on the other side and try to guide them through an incident. Even against maybe some of the internal decisions they've made, saying, this is not the way we're going to do it, we're going to do it my way. That's where it takes a person to come in, like myself and the other IR managers who work at Sygnia, to guide them and tell them: hey, this is what happened in this case in the past. This is what's happened in all of our experience over these 20 years, or whatever the case may be. This is how it works, this is how it operates, and this is why you should listen to us, or this is why you should do that. It doesn't happen very often, but I will say that we do get some of that pushback from the ones who think they know better. Martin Hinton (19:42) I mean, I'll ask a basic question. Diplomatically or not, how do you deal with that? I mean, these are just the facts. This is not an opinion. This is just information. Don't take it personally. But based on our experience, if something happens to you, you're going to have a problem you're not prepared for. Is it just that simple? You let them react to that and act on it?
Or if they don't, then you've done your part? Or do you keep chipping away? Matt Mosley (20:09) I mean, there are certain restrictions (well, I wouldn't say restrictions, but guidelines) that we adhere to. We will provide them all the information and the facts, especially during an investigation. We will lay it all out on the line for them and let them make the ultimate decision. It may be against our better judgment, but it's their business. They know it better than us. And ultimately, it will come down to their decision, whether it's through the board or them as an individual. Martin Hinton (20:36) One of the things, when we think about companies believing they're ready or prepared: we have this approach as human beings where we add more, we don't take away. We do additions, not subtractions, when we work on a home. And I wonder whether you ever get into a situation where you see an incident response plan that has too much complexity, and that's mistaken for maturity or preparedness? I guess what I'm trying to get at is that the complexity of a cyber attack can be significant: the long tail, the long-term costs, the litigation we're starting to see a lot more of in the form of class action lawsuits. Can you be over-prepared, or is it something you can't be too prepared for? Matt Mosley (21:24) I don't think you can be over-prepared, but you can be over-reliant on your tools, right? I think you can believe that you have the best tools and the best experts. But when an incident happens, you find out real quickly that it's the people behind them, the ones who run the tools, who can actually utilize them and take care of an incident. I have seen over-engineered IR plans, where they've got step by step: this is exactly what we're doing at this minute and this minute and this minute. And I would say that is definitely over-engineered for an incident response plan.
You kind of want it a little bit broad; you want it to be more of a guideline than step-by-step procedures. Step-by-step is more like a runbook, but some people get those confused. Martin Hinton (22:11) You touched on the Mad Libs, and when you said it, I thought of the classic nine-line (I think it's just the US Army, but it may be more branches), which is this sort of form. The times I've seen it used were in medevac, where it's blanks, and you fill in all sorts of details about the specifics of the incident, but the heavy lifting of the existing words is not there for you to have to think of or communicate in the moment. I suspect there's no more stressful situation than medevacking someone. And again, I don't like to compare this sort of thing to warfare, because it isn't the same. But the complexity of the threat is one that is as changing and malleable and dynamic, given the geographic sprawl. This is borderless crime, it's highly organized, and that sort of thing. I wonder whether we could drill down into the first 24 hours of an incident response, the first few hours of an incident. Let me ask a very practical question. What's the first notification you get about an incident that you're going to have to go manage? How does that information arrive to you? Matt Mosley (23:29) Yeah, so usually it'll come through what we call a SWAT channel. That's where our representatives around the world, who get contacted by the client, notify us. And then we jump on the first available call to begin scoping. We try to understand what's happening in the incident, what's been compromised or what is being affected. And then we try to understand what's involved, which resources we need to get on our end, and then determine whether we need to be remote or on the ground, or both.
So we can do remote support and also send someone on the ground. And then, once we're on site within the first 24 hours or so, we start coordinating with the stakeholders and figuring out who we need in the room, which is something we hope the client knows. We want to know that we have the right resources. And then we start putting together a game plan based on what was compromised or what's been affected, and then what we need to get up and running to get the business back operational. Basically, what can we do to stop the bleeding, to go back to your analogy of working in a war zone. Martin Hinton (24:45) I mean, you talk about stakeholders, and one of the ideas I think is really important to take away from this is that you have the guidelines of a plan, but the specifics of the situation could change fairly significant things, like who needs to be in the room and that sort of thing. Is there ever a person or stakeholder, to use your word, who gets drawn in too late with regularity? We touched on people not having a legal team, or whatever it might be. Is there any particular entity that people listening or watching this might think, oh, you know what, I don't have to check on that one? Matt Mosley (25:24) Legal and comms are usually the last ones to be brought in. And that's just because I think everyone's trying to stay away from involving the attorneys and potential litigation. I don't know; that's just my own opinion. And then with communications, I think they're really trying to keep a lid on it. And I understand. But eventually, at some point (well, not in every case, but sometimes) you have to just bite the bullet and let everyone know that there is something happening. And you want to be factual and honest and as transparent as possible. Martin Hinton (25:58) You touch on something. I mentioned doing some podcasts on incident response.
And one of them was with a communications consulting firm that does the comms part of incident response. One of the really interesting things, not surprising, but I hadn't thought about it at this level, is the type of message depending on the audience and the medium, and the understanding that all those messages could wind up being seen by unintended audiences. And timing matters, right? Information is perishable in this situation in a very, very rapid way. So if you create a press release and it's got to go through internal counsel and external counsel and that sort of thing, the urgency applied to that is often hard to force on an outside entity. I mean, it's just interesting to me that the way you described all of that is so similar to a breaking news situation: what resources do we need? What's our ultimate goal? Where do we need to be physically? Pardon me. It is, I mean, classic crisis management. And is that something that people who maybe don't know if their IR plan is that good should think about? This isn't any different from any other crisis you might encounter, whether it be a warehouse fire that destroys half of your inventory or something like that. There is a way to prepare for this. So don't let the ones and zeros and the vapory idea of digital cyber-this and crypto-that distract you and confuse you or intimidate you. It's just a problem, and there are tools. There are people to call; figure out who you're going to need to call before the fire. We all know, and we're taught as kids, 911. We know that's what you call if you need the police or the fire department or an ambulance. We don't have to think about it when a crisis occurs. It's a very simple analogy. But I wonder whether that basic idea might get people away from this sort of paralysis (at least I've experienced it) when it comes to the technical side, right?
People feel like they need to understand the complexity of encryption. And maybe if you're the CEO, you just need to know that you have to have someone around who understands that. Matt Mosley (28:00) Yeah, absolutely. I use this thing with my kids where I say everything is figure-out-able. You may not like the outcome, but there's always a solution for everything. And I think people need to understand that. Based on the survey that we did, 74% of the enterprises, from the 600 different security leaders we interviewed, had an incident within the last 12 months. So it's not a matter of if, it's just when. And you definitely can prepare and be ready for these different types of incidents. Prior to COVID, even though I do cyber incident response, one of the tabletop exercises I did was for a bank in LA, and it was a pandemic exercise. This was 2017, 2018. And the whole room was like, oh, the city would never shut down. Schools would never shut down. But we brought up topics like: what would you do if your staff has to be home with their kids? What would you do if freeways shut down? And even though I was getting pushback, because we went through that scenario, they were ready when COVID happened. One of the gaps we identified was that they didn't have enough VPN licenses for everybody to be on remote work. Can you imagine that happening now? It's not a thing now, right? But back then it was. Martin Hinton (29:18) Well, I mean, you would hope it wouldn't be a thing now, but you make a great point about what I've heard described as the extreme fragility of this digital existence. The thought experiment I use when I have dinner with friends who ask what I do for a living: I say, well, I'll put it this way. Take your phone, put it in your bedside table or a dresser drawer, and you can't use it for 24 hours, and you go to work. How's that going to work for you?
Just that one piece of technology; not the internet, not all your apps. You just can't use your phone. Can't use your smartphone. How's that day going to be for you? And I think that the fact that Find My iPhone exists, and the panic we feel when we don't know where we put our phones, is probably an indication that we all know what we're talking about. That's the mindset you need to have. I really love the way you just described the pandemic thing, because, again, back to that hubris idea. Have you ever read the book World War Z? Which was then made into a movie with Brad Pitt. There's part of that book, and I love this element, where there's a group of people in a government and they're meeting, and they all agree, all ten people in the room. But there's a rule in the room, the eleventh man rule, I think, if my memory is correct. And that is, if everyone agrees and you're the last person whose opinion comes up, it's your responsibility to disagree and to argue against the consensus, for the sake of having people explore the unimaginable, right? And that's what we're talking about. People can't imagine a pandemic, or a cyber attack that shuts down, I don't know, MGM or Jaguar Land Rover or Marks and Spencer. Who could ever imagine that happening? The recent striker incident, right? These things are not unimaginable. It's just a teenager mentality that creates this myth that it won't happen to us. I wonder whether you have any thoughts about that. Matt Mosley (31:09) Yeah, I mean, it really just goes back to that: it's no longer a matter of if, but when. I think as more things happen and we see more news, you know, a lot of people ask me about my job, and then they start bringing up some incident they heard about in the news.
It's becoming more common and it's becoming more widespread in, you know, the major media, and no longer just on the small websites that, you know, people in my industry read all the time. It's becoming more mainstream; the Wall Street Journal and all those are definitely picking it up. We had a story in the Wall Street Journal about North Koreans. Those are the type of things that people are starting to see. I think people are starting to realize, which is a good thing, that it is a matter of being prepared for the unthinkable when it comes to cyber incident response, or cybersecurity, I should say. Martin Hinton (32:00) Yeah, there was a piece in the New York Times today about it. It was a technical story written in a journalistic style that, in fact, I pinned because I was going to add it to our newsletter this week. So look out for that. I wonder whether we might pivot now and sort of frame some of this stuff in the cyber insurance context. When you arrive for an incident, are there questions that underwriters and other people involved in the cyber insurance side should have been asking in advance about things like log retention and cloud visibility and identity telemetry, and all these things that sound mystical and they're not? They're just cool words, right? So what about that from the cyber insurance point of view? Matt Mosley (32:43) Yeah, I'll answer that on two different fronts. One, as the incident response team coming into cyber insurance, where cyber insurance is kind of scrutinizing our abilities and our skills, wanting to know what our capabilities are and delaying our ability to go in and actually start responding, because they want to make sure that we're good enough. So that comes back to that preparation, and one of those unknown variables that you don't expect and that delays the response. The other part is definitely what insurance should be going through.
And I do think, based on experience working with and talking to underwriters, that they do go in and do a survey of the environment. They want to know, you know, do you have logs? Do you have multifactor authentication? Do you have log retention? So that they understand the level of risk that they're walking into by insuring this company. And for a lot of companies, and this is within the last couple of years, if you don't have multifactor authentication, they won't even insure you, right? Because it's such a basic step in securing the environment that if you're not willing to do that, then there are going to be huge gaps everywhere else in the environment. But I think cyber insurance as a whole should be doing their due diligence to understand the risks that they're taking on with these different clients, going through and doing interviews with the teams and understanding what kind of technologies they're using and how it's implemented, and all that good stuff. Martin Hinton (34:05) That's interesting, because one of the things in the last, I want to say six to 12 months, is that you're moving away from the conventional method by which you might do insurance, through a sort of checklist or a questionnaire and no verification. To take the homeowners analogy: do you have a home alarm? Yes? You get a 2% discount, right? Now they want to come to your home, see the alarm, double-check that you've paid the bill, make sure it works. So you're telling me you're experiencing that sort of trust-but-verify reality from the cyber insurance side with more regularity. Is that new, or is that something that your experience sort of bears out as a little more consistent over time? Matt Mosley (34:46) It's consistently gotten more difficult, if that makes sense. A couple of years ago, it was, we took your word for it. Then it moved into, here's a survey, can you fill it out? Let us know what you've got.
To now, I think, more of them going through and interviewing. And some of these insurance companies even have internal IR teams that, I think, go through and audit the systems themselves. Martin Hinton (35:10) With regard to this, is there a visibility gap that directly makes things worse? Is there anything particularly more severe than another thing? Matt Mosley (35:22) I think the visibility gap that I see most often is the hybrid environments. So it's like the on-prem infrastructure with the cloud, right? And sometimes it's multiple cloud environments, right? You've got Azure, AWS, Google. And so some companies will have pieces in every single one of those, and then they'll not have logging enabled, or whatever monitoring they need, in each one of those, or they don't feed it back to a single source. And so even though it's being collected, no one's looking at it, right? So that's a huge visibility gap that we see for sure. Martin Hinton (35:56) I mean, you're touching on something, right? The ability to collect data has become really easy. Analyzing it, pardon me, AI, I know you're here, there's a catch-up there, right? I forget where, but I was reading something the other day that there's a US intelligence agency that collects so much data every 12 months that they would need something like six million full-time employees, an impossible number of people, to even review it all. Never mind comprehend it, understand it, and connect pieces to other pieces and that sort of thing. So back to basics. I mean, one of the things I like to point out, Matt, is this is not new, right? Siloing of information and power and structure within companies is not uncommon. Sometimes it's useful, sometimes it's not. And what I'm hearing you say is that even when places are recording and collecting the right data to see where their vulnerabilities might lie, it's not being viewed in the right context to be useful. Matt Mosley (36:47) Yeah, absolutely.
I mean, that's definitely one of the things that I work on probably four or five times a year. We call them visibility enhancement engagements. And basically we go through, look at all their different log sources, make sure that they're capturing the right information, make sure that they don't have any gaps, because if you can't see it, you can't detect it, right? That's like the biggest foundational thing in incident response: if we don't know it's happening and it's happening in the shadows, then we're not gonna be able to fix it when it comes time to get in there and get going. But I would definitely say one of the things that we see happening now is AI being implemented into these spots, where AI is doing the threat detection, not response, but the threat detection, on those logs. And that's pretty interesting. And it's good to see that happening, especially with all the pushback on AI and the worry about our jobs being replaced. I think it's definitely a huge benefit to see AI coming in and jumping into those workflows. Martin Hinton (37:46) Yeah, the most recent podcast we just did touched on, I think it was Mythos and Claude, and the way it was finding bugs and issues and that sort of thing that would have taken a human being, or a team of human beings, far longer to find, let's say, without getting too specific. We've touched on it a little bit, but one of the things that I want to spend a little more time on is the human element, and specifically the CISO, right? There's a lingering belief that cybersecurity is, back to my phrase, siloed, strictly an IT issue. And in fact, you know, the CISO is not just a technical leader. There's, I mean, in some respects, a true crisis executive role there, given the reliance many companies have on their digital structure and their networks and the way they do business. And that's large and small businesses.
And I wonder, pardon me, when this happens and you meet maybe a CISO for the first time, are there any things that they do that are indications that they're, quote unquote, good CISOs? Matt Mosley (38:52) Yeah, I mean, a couple of the ones that stand out to me were the ones that had a natural charisma and ability to talk to people. Some are very technical, and you kind of wonder, you know, how did they get so far up in the chain without being a people person? And other ones are more of a people person than they are technical. And to be honest with you, even though it's a technical role, I think the ones that are more social and have the ability to communicate are the ones that are gonna be the successful leaders during a crisis event. Even though I myself, as I come into an incident, am essentially a crisis manager, the CISO is the ultimate crisis manager, because they're the ones dealing with it before I get there. And they're the ones having to relay and communicate to the executive leadership and make sure that they have all the facts so that they can make the right decisions. And so there's a lot of collaboration between myself and the CISO whenever there is an incident. Again, it's the communication piece. The ones that know how to communicate and talk to the right people and know how to get things done are the ones that are gonna be successful. Martin Hinton (40:03) Yeah, it's interesting what you said just now. We did a piece, I think it was this week, I can't remember. But it was on a Deloitte and National Association of State Chief Information Officers study, NASCIO, I think it's pronounced. At any rate, what they found was that every single CISO that represents a state or government body that they survey delivers strategy, governance and risk management services to state agencies. And in 2022, that number only hit 81%. So they've maxed out the potential there.
And again, the idea that there's a greater awareness about this problem is one that that survey would indicate; these people have way more to offer than helping you update your software, if you will. Staying on the CISO topic, one of the things that we've done some reporting on is this disconnect between the CISO and the rest of the decision-making structure of a company, the rest of the C-suite, and specifically the board, at larger corporations, well, not just larger corporations. You read things, and I can't remember the specifics, but it's, the CISO gets on average 30 minutes a quarter, or, you know, 30 minutes every half year, with the board. And that's not enough for them to, A, illustrate the problems, and then do the hardest thing at the corporate level, which is ask for money for something that isn't gonna obviously make money. It's a cost-center mindset that surrounds a lot of their asks. And I wonder whether or not there's some advice you might have, both for boards on opening their ears, and for CISOs on sort of the way to frame things, that you've experienced or have seen work when the incident finally arrives. Matt Mosley (41:45) Yeah, I think it's definitely shifting. The mindset is shifting to more of, it's a business issue and not a tech issue. That's probably the biggest thing, right? And understanding that the ROI may not be there on paper. You know, hey, we didn't have an incident for the last two years. Well, yeah, because the security team is doing their job. It's kind of hard to justify giving more money when nothing happens, but it's a good sign when nothing happens. It means that the tools and the people are working. And so it's a tough spot to be in as a CISO. I can definitely empathize with them. But I think definitely giving the statistics and delivering the numbers to the board in a way that makes sense to them is going to be beneficial.
And then also pointing to any of the major breaches in whatever industry they're in, and showing what were the downfalls of those different companies, or what was the breakdown that caused that incident to happen, and what they're doing on their behalf to prevent it within their organization, I think is a strong argument. Martin Hinton (42:51) Yeah, I mean, so I work in the media and I've been doing it for over three decades. And what we know is that things that are hard to show, especially in television, but even now, because everything's visual, struggle if there isn't a visual element. And what I've been saying lately is, if the Jaguar Land Rover hack had involved a massive explosion at a factory, or a fire that burned a factory down, that resulted in the same level of shutdown, it would have been on the cover of the paper for weeks and it would have been the lead on the news, because you would have had dramatic footage and smoldering wreckage and all that sort of thing. And there's an abstract reality to this; even for the sort of smartest amongst us, if you will, it's hard to grasp that idea. I mean, is there a remedy for that? What do you think about that idea, and maybe it's changing, as we've touched on, but this idea that getting people to believe something that is hard to see, even when it's happening, is really, really hard to do? Matt Mosley (43:52) I'll be honest, I don't have a solution for that one. In my own personal life, I've tried to convince people otherwise of things that they thought they saw that they didn't see. So I wouldn't be able to answer that. I'm sorry. Martin Hinton (44:03) No, no, you've nothing to be sorry about. I mean, it's a question. I used to use Marks and Spencer's, and before that I think it was MGM, whatever the more recent example was. Because with the Jaguar Land Rover hack, I think the government had to step in with a backstop to keep companies that were supplying them from going out of business.
I think that the GDP of the United Kingdom took a hit for a short period of time, because it's such a large part of the economy. And I think in the end, the whole thing cost like $1.8 billion. If you had, you know, a physical event that cost that much, you'd have no problem convincing people that they needed to put a better sprinkler system in the new one, or build it out of something that doesn't burn, or put better security around a physical location to avoid a bad actor getting inside in a very conventional, classic sort of violent-attack kind of situation. So, you know, I agree with you. I don't know the solution either. I mean, I've been doing this a few years, and it's my profession to help people understand things that are a bit complex and a bit abstract. And, well, I'm not there either. So don't feel bad. Matt Mosley (45:01) Just from my personal experience, I can tell you that I've seen the bills during incident response, and things that cost maybe 10 grand to implement would have prevented a $500,000 incident, stuff like that. Data that should have been locked down with some sort of encryption tool or something like that, that's now been exfiltrated, and you're now paying $3 million to some ransomware group. Those would be perfect examples of, kind of, here's your ROI, but it's not a return on investment, because it didn't happen to you yet, right? But it's all preventative. Martin Hinton (45:36) Yeah, no, that's exactly it. I mean, again, one of the things that I tell people, and try to do on the podcast and in our reporting, is that these sorts of issues and what we're talking about, it's not new. What's the Ben Franklin line? An ounce of prevention is worth a pound of cure, right? This idea that the $5 batteries for your smoke detector will protect your $500,000 house, right? Like, that is the mindset you need to adopt, and it is not fun to spend money on something that isn't going to directly result in revenue.
But if it stops you from making any revenue because it happens, you've got real problems. So, you touched a little bit on artificial intelligence, or AI. Help, hype, new attack surface? I think I've got these numbers right, and let me just look at my notes here. You guys reported that a third of organizations in the most recent survey already had extensive AI use across most or all of their threat detection and IR activities. And that was up from 25% last year. Matt Mosley (46:11) Yeah. Martin Hinton (46:34) And I think you projected it could go to something like 60-plus percent by 2027. But you also warned in that report that AI adoption is outpacing controls on what gets called shadow AI, and deepfakes and related abuses and that sort of thing. I wonder whether you've got some more to say about that. There's a lot of hype around AI, and depending on how you feel about it, I'm a net positive, right? And I'm just curious, when you see it in this space, in the cybersecurity realm, in the incident response realm, do you have a temperature for it yet? Hot, cold, maybe? Matt Mosley (47:12) Yeah, I'm with you. I think it's net positive. I'm one of those types that's an early adopter of new technologies. But with every new technology, there's always this fear of what it could possibly do, no thanks to Hollywood and Skynet and all that good stuff. But I will say that it's great for a number of things, especially in incident response, right? Being able to go through the data, parse it faster than a human can, and have potential outcomes of that data, how do we use it, how can we respond, et cetera, is great. Integrating it into workflows is great, but it needs guardrails, and it needs the human element to at least have oversight on it from an IR perspective, just so that you keep those incident response fundamentals in place and that visibility in place. Now, the con of AI is the fact that the attack surface is now expanded.
So now not only are attackers using AI to pull things off, you mentioned Mythos and stuff like that, but now think about AI being implemented into log analysis, and someone injecting commands into those logs. And now the AI is going to read those logs and then potentially say, you want me to delete the database? Okay. And go through and process that instruction. So it creates a bit of a new problem. But I think it's solvable. I think definitely not replacing, but implementing and working around AI within the workflows, is the best route right now, until we see more of it implemented elsewhere and see how it's used in some of the best cases. Martin Hinton (48:59) I mean, you touched on sort of solutions. Is there anything specific that you could share with us? Matt Mosley (49:07) As far as, like, protection? Martin Hinton (49:09) Logs and governance, you know. I mean, guardrails is another way to say checks and balances, which is just another way to say limits on the range of mobility that something can have. It keeps it on the track it's supposed to be on; you know, that's the guardrails thing. And we're using that physical-world analogy to describe limiting something that is, for me and for, I think, a lot of people, kind of hard to comprehend: what this is capable of, and the speed with which it can do things, and the ability it has to sort of comprehend information, as you've touched on, faster than people. Is it, you know, that it can't do something, like it couldn't delete something, without, you know, two people? Imagine the nuclear silo with two human beings with a key kind of thing, where there's a need for redundancy, or a stop gap, if you will, that prevents it from doing something that is irreversible. Matt Mosley (49:59) Yeah, so I'll give you a perfect example. This was actually in the news yesterday.
A fairly large startup implemented an AI agent, and it went in, and within, I think it was nine minutes, it lost access to a certain database. And so it took it upon itself to go through their infrastructure, find an API key, use that to gain access, and then remove the database and remove the backups to solve the problem. But that wasn't the problem, right? It was just the connection. So I think there definitely needs to be a little bit of restriction around AI and its access, for one, how far it can go. There are other stories of an open AI platform that everyone jumped on the bandwagon real quick and installed on their computers. And it started going out on the internet and doing things on its own. And people found out real quick that, oh, hey, I need to make firewall rules, and I need to make sure that this thing is isolated and segmented so that other people can't abuse it. And so it's just like every other platform out there, right? When you implement it, you need to go through a little bit of a checklist and make sure that it's not calling out to websites that you don't know about, or reaching out to places you don't want it to reach out to. Martin Hinton (51:20) I don't remember the details, but I'll put a link in the show notes to a story about it. But there was another one I read about that had been turned on and had gone off to the side, if you will, in the shadows, to use the shadow AI analogy. And it had figured out how to start buying Bitcoin. And all I thought about was Robert Moses, the person who's responsible for a lot of the layout of roads and stuff in the New York City area. He was famous because he had the toll money collected in the bottom of one of the toll booths that he had control over. And it's classic, right? Control the purse, you control the revenue, you control where money can get spent.
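The two-key, nuclear-silo idea Martin raises a moment earlier, that an agent shouldn't be able to take an irreversible action without more than one human signing off, can be sketched in a few lines. The action names and the approval model below are illustrative assumptions for discussion, not any real product's API:

```python
# Sketch of a "two-person rule" gate in front of an AI agent's risky actions.
# Which actions count as risky, and how approvals arrive, are assumptions here.

RISKY = {"delete_database", "drop_backups", "transfer_funds"}

class TwoKeyGate:
    def __init__(self):
        self.approvals = {}  # action name -> set of distinct approver names

    def approve(self, action, approver):
        self.approvals.setdefault(action, set()).add(approver)

    def allowed(self, action):
        # Non-risky actions pass through; risky ones need two distinct humans.
        if action not in RISKY:
            return True
        return len(self.approvals.get(action, set())) >= 2

gate = TwoKeyGate()
print(gate.allowed("read_logs"))        # True: not a risky action
print(gate.allowed("delete_database"))  # False: no approvals yet
gate.approve("delete_database", "alice")
gate.approve("delete_database", "bob")
print(gate.allowed("delete_database"))  # True: two distinct approvers
```

Because approvers are stored in a set, the same person approving twice still counts as one key, which is the point of the redundancy Martin describes.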
And, you know, the AIs have all learned from our behavior, right? So anything we've done, nefarious or good, is something that we should expect them to do. And we have guardrails for ourselves. We have, you know, education that teaches us to not steal and to treat people well. And then we have laws that punish us if we don't heed that kindergarten-level advice. So maybe AI needs to go to kindergarten, or something like that. Matt Mosley (52:21) I definitely think we're going to find, you know, as things get adopted more and people start using it more, we're going to find new ways to abuse it, but also new ways to curb it. And so it's just like every new technology. Martin Hinton (52:34) As we said, we're closing in on an hour, so we'll move to wrapping it up. Before we go, I just want to turn back around to the cyber insurance element of it all. From an incident response person's point of view, what do you wish underwriters were asking more? And I know you've touched on this a little bit, but tell me a little more, you could repeat yourself, don't feel bad, but tell me a little bit more about the IR sort of cyber insurance wish list, if you will. Matt Mosley (53:02) I'd say the biggest one, any time anybody ever asks me something like this, is backups. Please make sure you've got backups, that they are immutable, that they can't be touched, and that they work. I can't tell you how many incidents we've come in on where, especially with threat actors like ransomware groups, they go in and they delete the backups. And when we're coming in to investigate, it just makes it even harder. From an insurance side, it definitely lengthens the time to resolve the incident. And so I think cyber insurance 100% needs to start looking at backups, if they're not already doing that. Martin Hinton (53:43) Great. So I'm going to shoot a few quick questions at you.
The idea is you have a quick-fire round of questions. And as I think I said to you before we started recording, you don't have to be that quick or say something fast and pithy. You can consider your answer, by all means. In the next 30 days, I'm a CISO. What's one thing I should test? Matt Mosley (54:04) Your backups, but also your communication plan, right? Who are you going to bring into the room when an incident happens, and making sure that the contact information is up to date. Martin Hinton (54:14) I'm a CEO or a member of a board watching this. What's one question I should ask the CISO next time I see them, or maybe just email them right now? Matt Mosley (54:23) When's the last time you did a tabletop or threat simulation with the IR plan? And if it wasn't within the last year, you need to schedule one now. Martin Hinton (54:31) What's one thing a company should do before they call you? Matt Mosley (54:35) I guess the biggest thing, when we're doing scoping, is having someone who can answer our questions. And that means having someone from the technical side and the business side, right? What is affected, on the technical side, and how long, or what's important, from the business side, to get you back up and operational, so that we can make those our objectives when we arrive on site and work to get that stuff taken care of as soon as possible. Martin Hinton (54:58) Yeah, you need people who can communicate the reality of their company's situation from this point of view. I mean, yeah, that's a good one. Is there a phrase you hear during incident response or during a crisis that makes the hair on the back of your neck stand up, or you roll your eyes, or you think, here we go again, I'm getting too old for this? Matt Mosley (55:21) So this one might not be good, but usually from executive leadership it's, we don't care about other people's data. There's one. Probably shouldn't put that out there, but that one, I'm always rolling my eyes at.
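Matt's quick-fire answers, test your backups and keep the call-tree contacts current, lend themselves to a simple automated check. A minimal sketch in Python, where the one-day backup threshold, the one-year contact rule, and the data shapes are illustrative assumptions rather than anything Sygnia prescribes:

```python
# Hypothetical readiness check: flag stale backups and unverified IR contacts.
from datetime import datetime, timedelta, timezone

STALE_BACKUP = timedelta(days=1)      # assumed RPO: backups older than this fail
STALE_CONTACT = timedelta(days=365)   # Matt's rule of thumb: re-verify yearly

def readiness_report(backups, contacts, now=None):
    """backups:  {job name: last successful run (datetime)}
    contacts: {role: last time contact info was verified (datetime)}
    Returns a list of human-readable findings."""
    now = now or datetime.now(timezone.utc)
    findings = []
    for name, last_run in backups.items():
        if now - last_run > STALE_BACKUP:
            findings.append(f"backup '{name}' last succeeded {(now - last_run).days}d ago")
    for role, verified in contacts.items():
        if now - verified > STALE_CONTACT:
            findings.append(f"contact '{role}' not verified in over a year")
    return findings

if __name__ == "__main__":
    now = datetime(2025, 1, 1, tzinfo=timezone.utc)
    backups = {"db-primary": now - timedelta(hours=6),
               "file-share": now - timedelta(days=9)}
    contacts = {"outside counsel": now - timedelta(days=400),
                "cyber insurer hotline": now - timedelta(days=30)}
    for finding in readiness_report(backups, contacts, now=now):
        print(finding)
```

Run against the sample data, it flags the nine-day-old file-share backup and the outside-counsel contact that hasn't been verified in over a year; the point is only that "is the contact information up to date" is a testable question, not just a tabletop one.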
Martin Hinton (55:34) Yeah. Matt Mosley (55:39) The biggest one in my career is probably, yeah, we have network segmentation. And then you find out that they don't have network segmentation. Martin Hinton (55:46) Yeah, exactly. Back to your point about understanding their environment. All right, last question, and we'll end on a positive note: what's the best thing a CEO, or the person in charge of a situation that you encounter, can say? Matt Mosley (55:51) Yeah. We will get through this. It's gonna take a little bit of time, but we will get through this, and we'll be better for it. Martin Hinton (56:08) So it's, again, classic crisis reality. I'm not sure how we're gonna fix this, but we're gonna fix this, and we're gonna do it together. Like, very, very basic leadership qualities. Great. Matt Mosley (56:16) Yeah, and they will be better for it, because now they've identified gaps, and they're gonna remediate those, and they're gonna be even more secure for the next attacker that comes knocking on the door. Martin Hinton (56:28) So Matt, we've been talking for about an hour, and as I promised before we began, is there anything we discussed that you'd like to say any more about, or anything else you'd like to add some additional context to? Matt Mosley (56:38) No, I think just that the CISO survey itself had some pretty good insights, and some things that will wake some people up. Some surprising numbers, right? I mean, the fact that, I think it was something like 90-something percent were prepared for an incident, but then they don't feel confident in their tools or their organization to handle an incident, I think was pretty eye-opening. Martin Hinton (57:03) You know, I just want to say I would second that. There are some things, like the disconnect between those two perspectives, that are really interesting.
It is, back to how you get people to pay attention, the kind of thing you might read and think, wait a second, am I doing that? Right? Seeing ourselves can be very difficult at times. Now, finally, is there anything we didn't get to, or anything you'd like to discuss or mention or tell us about? Matt Mosley (57:28) No, I don't think so. Sorry. Martin Hinton (57:29) Fair enough, fair enough. I mean, I think we could go on and on and on. And I think that that's the truth. The reality of this is that complex things can endure significant analysis and discussion. And maybe that's another thing: if it feels overwhelming to you as a CEO or a board member or an executive at a company, that's an indication that maybe you're not doing enough and you're not prepared, because you're scared. Matt Mosley (57:54) Yeah, but I think, as the survey points out, you're in good company. From the 600 leaders that we surveyed, everyone's feeling the same. Martin Hinton (57:59) Yeah. Everyone's feeling the same. Yeah. And just so you all know, a few of the things we've referenced will have links in the show notes, so you'll be able to find them there. Matt Mosley, incident response manager with Sygnia. Matt, thank you so much for the time today. I really hope you enjoy the rest of your day. I really appreciate your insights and your expertise, and I hope the audience does as well. And I'm pretty sure they will. So thank you very much. Matt Mosley (58:26) Thank you very much. That was great. I had an amazing time. Thank you. Martin Hinton (58:29) Everyone else, thanks for watching. And as mentioned, the show notes will have links to some of the stuff we discussed. If you've got a question or a comment, you can leave it on whatever platform you're watching this on. And if I can't answer it, or we can't answer it here at Cyber Insurance News, we'll ping it back to Matt and see what he has to say. Thank you so much for watching.
Again, I'm Martin Hinton, the executive editor of Cyber Insurance News. I really appreciate you tuning in. Enjoy the rest of your day.