In this episode, we talk with Ira Winkler. He is considered one of the world's most influential security professionals and was named "The Awareness Crusader" by CSO magazine when he received their CSO COMPASS Award. Most recently, he was named a 2021 Top Cybersecurity Leader by Security Magazine. He has designed, implemented, and supported security awareness programs at organizations of all sizes, in all industries, around the world.
From stopping stupid to how even people at his level fail, good conversations were had. Give it a listen.
Paul Love [00:00:11] Welcome to our show. Today we will be talking about what every CEO should know about security awareness.
Jason Loomis [00:00:17] By the way, that was Paul.
Paul Love [00:00:19] And that was Jason.
Jason Loomis [00:00:21] And this is offside.
Paul Love [00:00:23] Eventually, we will get it in sync. Jason. I know we will.
Jason Loomis [00:00:28] I like that we're not programmed and scripted, obviously. Yep. So we're talking about how to move the elephant in cybersecurity.
Paul Love [00:00:35] Yeah. We've had some great conversations about the power of stories, presentations, and other things that get people excited about security. And it's about that time of year that security awareness training starts.
Jason Loomis [00:00:47] Actually, good security awareness training is year-round, just like Vegas shows, except what goes on in security awareness training doesn't have to stay there. So it's not exactly like Vegas. But security awareness training, as a CSO, is the best return on investment, I feel, that a security team at any organization can get. Yet, you know what, Paul? It's often the hardest thing for me to get funding or budget for. Why do you think that is?
Paul Love [00:01:13] Well, I mean, over the years I've found the same thing, right? You don't want to invest a ton of money into security awareness, and it's often because it's not seen as super tangible. Like, if you buy a firewall, it's very clear what you're getting and, if it's configured properly, what it does for the organization. Whereas with security awareness, I've seen historically that people think, well, everybody knows how to do this stuff, what do we need to make them aware of? Or people have been through such poor security awareness training, the stuff that makes your eyes roll to the back of your head, just boring rote: hey, do this, do that, X, Y, Z. It's just not inspiring to most people if it's not done properly.
Jason Loomis [00:01:58] Yeah, you know, you just made me think of soft science versus hard science, where people tend to think that psychology or sociology or things that deal with the human mind are soft sciences, that they're just not as reliable or as hard as, you know, biology, chemistry, or physics. I think that's another good analogy. But funny you mention the boring things. I wonder how many of our 38 listeners have had to sit through an insurance seminar, because that's what some security awareness training feels like to me.
Paul Love [00:02:22] Actually, I think we're at 40 now. The other day, when I was running, we hit 40 listeners. Yes. Yes, you...
Jason Loomis [00:02:28] Are. And did you just say you were running?
Paul Love [00:02:31] Yes, I was running, I think, because I had the, you know, the speakers out loud. Because remember, when I drive in the car, I always have the podcast playing out loud on the speakers, so people hear it. So we have passive listeners. That's what I like to refer to them.
Jason Loomis [00:02:42] As passive listeners. Great. Well, since you were running, I've got a great dad joke. Where did the CSO go? Where did the CSO go the last few days?
Paul Love [00:02:50] I don't know. Where?
Jason Loomis [00:02:51] They ran somewhere.
Paul Love [00:02:54] Oh. So, right, back to our conversation. Yeah, as I was talking about running somewhere the other day: I ran into — yes, that's another unfortunate pun — my dog sitter's best friend's roommate. I forget the name, but they said they're going to check it out. The podcast, that is. So once we get that name down... but I think we're at 40 now. So that is good.
Jason Loomis [00:03:25] I could feel it. We're going to hit 50 by the end of season one, I'm positive.
Paul Love [00:03:29] Yeah, I would agree. So, you know, as I was saying, security training can be boring, and I've seen every bad cliche thrown at it. Sometimes the teams think that the more cliched it is, the more they're going to catch people's attention. And the bigger the company or organization, it feels like, the more boring it can become.
Jason Loomis [00:03:54] Yeah, I definitely have strong opinions on that.
Paul Love [00:03:57] But please share.
Jason Loomis [00:03:58] Well, I think one of the biggest negative drivers, one of the biggest reasons for that, is compliance, where you get these compliance requirements that state very specific training requirements, like you must say this and do this. Now, I don't want to mention any compliance regimes by name, but trust me when I say the bar is not set very high when it comes to overreach and being too prescriptive in security awareness training programs.
Paul Love [00:04:26] Yeah. So, any of our 40 — potentially 50, coming up soon — listeners out there who have had to sit through security training that's about as exciting as a timeshare sales pitch: sometimes it feels just as captive, right? You don't have any options, and you can't leave early or you don't get the prize, which is compliance. So please don't blame your security team.
Jason Loomis [00:04:44] Which is so odd — why are timeshare pitches so gouge-your-eyes-out boring, when you'd think sales teams and salespeople would not be boring? By the way, I think I might have just foreshadowed a future guest and topic. Anyways.
Paul Love [00:04:57] Well, you know, this week's guest is a person I've been connected to on LinkedIn for a while. I've seen a lot of his posts and been very interested in the direction and the things that he talks about. He has very unique and good insights, I think, into security awareness and making it interesting and personal.
Jason Loomis [00:05:19] That's great, I'm looking forward to this. You know what I've found in my experience — and I'd love to hear what our guest is going to say about this, to teach me what I've been doing wrong, to be honest — is that I pull from multiple training platforms. I use ones that are very interactive. I have to do the compliance ones; we have to check that box for certain compliance requirements. But I think interspersing those with more engaging and fun platforms works for me as well. And then you've always got to do things like newsletters, team videos, interactive hack sessions. I think variety adds to it, so I try to have a variety, a suite of training options and training stuff to do — with the caveat that not a lot of us are funded or have the budget to be able to do that.
Paul Love [00:06:02] Yeah, I've seen some of the stuff you've done, and it's usually very interesting and creative, and you manage to make a lot happen with very little funding. So the approach I like to take is to teach people how to protect themselves and their families, and then I find they bring those same behaviors into work. I also like to keep it simple and tell stories that make the information relatable. There's a big difference between saying "don't reuse passwords across multiple sites" and saying, hey, here's why you don't do that: if one website gets hacked, they're going to try those credentials on other sites. When you explain it to people, it really starts to resonate with them.
Jason Loomis [00:06:44] Yep, I agree. I also know you're really big on measurement, Paul, as every good CSO should be. But one challenge I've always had is: how do you measure a successful awareness training program? Is it Rotten Tomatoes? Is it 525,600 — in daylights, in sunsets, in cups of coffee? Now I'm digressing into Broadway mode. Sorry, stop me. How do you even measure?
Paul Love [00:07:08] Well, you know, measurement here is almost like measurement in marketing, right? It's really tough to measure the results. But the way I've always looked at it is: measure the outcomes. How many incidents are you having? We do periodic retesting. We'll start with one set of questions at the beginning of the year and see how many people answer correctly — like, do you know what to do during an incident? Then we'll ask those same questions every quarter and see how things are progressing. So you can set baselines, you can measure against them, and you can show that value to your management. And by the way, to get people to do these voluntary surveys, you typically have to give them some type of reward, because with anything security-related, if it's not mandatory, people either don't want to participate or they've got a lot going on. So we often offer rewards like, okay, if you answer this, five people will get $200. We do things like that to really entice people to participate in our surveys.
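Paul's baseline-and-retest approach boils down to a simple calculation: record the correct-answer rate each quarter and report the change against the baseline. Here's a minimal sketch of that — all the survey numbers below are hypothetical, not from the episode:

```python
def survey_trend(surveys):
    """Map each quarter to (correct-answer rate, change vs. baseline in points)."""
    trend, baseline = {}, None
    for quarter, (correct, total) in surveys.items():
        rate = correct / total
        if baseline is None:  # first entry is the baseline
            baseline = rate
        trend[quarter] = (rate, (rate - baseline) * 100)
    return trend

# Hypothetical numbers: employees answering the incident-response
# questions correctly, out of those who took each quarterly survey.
surveys = {
    "Q1 (baseline)": (412, 1000),
    "Q2": (531, 980),
    "Q3": (608, 1010),
    "Q4": (655, 995),
}

for quarter, (rate, delta) in survey_trend(surveys).items():
    print(f"{quarter}: {rate:.1%} correct ({delta:+.1f} pts vs. baseline)")
```

The point is the trend line, not any single number: a rising correct-answer rate across quarters is the kind of evidence Paul describes showing to management.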
Jason Loomis [00:08:13] So now this is getting me more excited to hear what our guest has to say because I'm not completely agreeing with you.
Paul Love [00:08:18] Well, that's one of the good things about this podcast: we often don't agree. So with that — I have a great guest. Again, I'm very excited to have this person on, who's done a lot of security awareness work, written books about it, and really dedicated a lot of writing and thought to this. So, without further ado, our guest this week is the chief security architect of Walmart.
Jason Loomis [00:08:43] Walmart, Walmart, Walmart. That sounds familiar. Yeah, I know that name. How do I know that name? No, no, no, no. Don't. Do not tell me. Wait. Nope. Walmart. Yes, of course.
Paul Love [00:08:53] I'm pretty sure you're familiar with one of the largest organizations in the world. He's also the author of Security Awareness for Dummies and You Can Stop Stupid.
Jason Loomis [00:09:04] Best title. Best title ever for a book, by the way.
Paul Love [00:09:07] Which — Jason, actually, why don't you tell everyone what you thought the title was at first? We were just talking about that.
Jason Loomis [00:09:12] Oh yeah, we talked about it off camera. I thought it was You Can't Stop Stupid — very negative, you know, my typical negative aspect and very pragmatic approach. But his takes it the other way. It says You Can Stop Stupid. Because, you know, what's the saying? You can't fix stupid. But "you can stop stupid" is a much more positive outlook. I love it.
Paul Love [00:09:33] Yeah, and you know, he's a former NSA analyst, someone who's been in the field and seen a lot in information security. So I'd like to introduce Ira Winkler.
Ira Winkler [00:09:43] Hi there. So thanks for having me. I appreciate it.
Jason Loomis [00:09:47] We are happy and blessed to have you.
Ira Winkler [00:09:50] Yeah.
Paul Love [00:09:51] We really appreciate the opportunity to talk to you, because, again, someone who's really done this and spent a lot of time in this focus area is really amazing. So I wanted to start out and say: you've spent a lot of time on security awareness. I've been following your writings for years, seeing you on LinkedIn and other forums. What prompted your passion for this area?
Ira Winkler [00:10:15] That's really hard to say. I mean, my background — my degree is in psychology. But in all honesty, the only reason I have a psychology degree is because it was the easiest major to get. I'm not going to lie: six classes, multiple-choice tests in all of them, where the subject matter was pretty much repetitive, made my life a lot easier when I finally had to pick a major going into my senior year. But it always kind of fascinated me. I mean, frankly, what got me doing this in cybersecurity — when I started my career at NSA, I was an intelligence analyst, but I hated that job and found out computer people got paid more. So I just kind of said, hey, that computer internship looks good, and I was retrained as a computer person. Then there was a human factors group inside NSA, and I joined that just because it sounded interesting. But, I mean, why is anybody interested in anything? I was toying with getting a cognitive science degree, things like that. But one day, when I was doing cybersecurity and working for a government contractor, they said, instead of going into the Pentagon, can you make a few phone calls? And three days later, I had control over one of the world's largest investment banks. And, you know, stealing a billion dollars is kind of a rush. So I started doing that and tried focusing on that — the holes, I guess. Then I wrote a paper that was called the seminal work in social engineering, and I had to look up what "seminal" meant — and what "social engineering" meant — at the time. And then it just kind of was fun to do.
And that's just where I tried to focus everything in, on the human aspects of cybersecurity, because, frankly, I don't like sitting behind a desk and working on a computer — even though I'm pretty good at programming, or at least I was. I still wanted to be out talking to people. I just didn't like being behind the scenes. So that's where I focused in, and why.
Paul Love [00:12:22] Yeah, and that's an interesting point, because one thing I've found is that security awareness is the most human of all the parts of cybersecurity. I mean, they all are, but this is where you have to engage with others and really understand what their needs are in terms of understanding security. So that's a great point.
Ira Winkler [00:12:44] And part of it, I think, going back to why awareness programs fail a lot, is that people get a job in security awareness and, essentially, at the end of the day, it just becomes a boring marketing program. And maybe they'll read a book — I call it bro science. They'll read Influence by Robert Cialdini, which in itself is a great book. But the problem is, people don't understand the finer points. Cialdini's Influence is a book, essentially, on how you engage with an individual — how do you take an individual and convince and influence that individual. That's the definition of the book. However, the reality is, when you are looking at a cybersecurity program and security awareness, your job is not to engage one person at a time. You might have millions of people in an organization. Your job is to engage a whole organization simultaneously, for lack of a better way of looking at it. And a book like Influence kind of sort of helps, but you can't go to a million people — or even a small organization, say one with a few hundred people — and ask each and every individual, okay, what triggers you? How can I get you personally? What will influence you most to put together better behaviors? You have to look at: okay, how am I going to modify behaviors? Because that's another thing people don't acknowledge. A guy who writes awesome stuff that's relevant to cybersecurity is BJ Fogg, and BJ Fogg has this concept of the information-action fallacy: just because you provide people information, it doesn't mean they're going to take action on it. And frankly, they usually won't take action on it unless there's some type of incredible motivation to stop.
I don't know what you'd call it — not inertia, but, you know, to stop the forward movement, unless you get somebody to do it: like, hey, we've got to make you look over here, when we're cruising nicely straight ahead. You've got to go ahead and figure out how to do these things. And there's too much bro science in the field: taking a good concept and thinking it sounds intuitively obvious, when it's actually not, because you need to take a step back and realize: my job isn't to influence a person. My job is to modify behaviors, which may or may not involve influence. For example, I can modify people's behavior. Let's say I want everybody to walk in and show a badge. I could modify that behavior relatively quickly by locking all the side doors of the buildings and forcing people to come in the front door, where I put a guard. That way I can modify people's behavior. Now, is that doable? Maybe, maybe not, for many organizations. But you have to stop and consider that that's going to have infinitely more impact than just putting up an awareness poster saying "Please wear your badge" while everybody walks around the facility without any enforcement whatsoever.
Paul Love [00:16:08] Yeah, and you bring up good points about changing behaviors, because unfortunately, if you go too far on one side — too far on pushing and requiring compliance — then you get malicious compliance. And I'm sure you've seen that, where people follow the rules only as long as they have to, versus engaging and understanding. Because what I've found is a lot of employees want to do the right thing, and we as security people have to help them find easier ways to get that done. Now, sometimes that just can't be done — like passwords; until we do away with passwords, there are some things that require a little bit of extra work. But finding that balance between the hard line — you make X mistake, you're fired — and the too liberal, where it's just security awareness posters that inform but don't change behaviors: what are the things you've seen? How do you get to that middle ground? Have you found it, or is that something you're still studying?
Ira Winkler [00:17:15] Well, the middle ground varies greatly — your individual results may vary, as all those financial disclaimers say. But when you're talking about how you influence behavior across organizations, first off, you have to look at what your job is in the first place. It sounds bad to say, but an awareness professional is not the champion of the users. You don't look, for example, at the firewall administrator as the champion of the firewalls, representing the firewalls' interests or whatever. As a security awareness professional, you are a security professional, and as a security professional, your fundamental job is to reduce risk. To frame security awareness as actual business value, I would contend that your job as an awareness professional is to facilitate the behavioral modifications that reduce overall risk to the organization. And the fact that the word "awareness" is in there is probably where the problems with the profession come in. Because when you say my job is to generate awareness — okay, I have made you more aware. Does that make the organization more secure? Usually not. I mean, I use the example: how many people can you tell, you need to be healthier, and the way to be healthier is to eat less and exercise more? How successful is that in America? When you get on an airplane, you see how ineffective that is. That's all I can say. So when you look at that for security awareness — one of the criticisms I always hear is, no matter what you tell a user, somebody's going to inevitably click on a phishing message. And you know what? Absolutely, 100% correct. And likewise, no matter how many firewalls I have, something will make it through that shouldn't. Because security is not about perfection. Security is about risk reduction.
And if you promise things — my biggest pet peeve is the whole concept of the human firewall and making the users your last line of defense. If your users are the last line of defense, you've failed. I mean, that's a given. The reality is you need to understand that you're not creating a human firewall. Your job is risk reduction. And when you start telling people they should be the human firewall, that doesn't qualify, and you're setting unrealistic expectations. So the first time something happens — which it will, because if you look at the Verizon Data Breach Investigations Report, 3% of people — and I think a new one is out; maybe it's 4% now, but either way it's gone down from 5%, which is a good thing — no matter what happens, if a phishing message gets through, 3% of people will click on it, on average. It's going to happen. Your job is to demonstrate with good metrics that human behavior has improved and risk has been reduced. And yes, that's a combination of information, a combination of working with procedures, and making sure people know how to do things right. Because one of the problems is people think awareness is telling people what could go wrong. Good awareness is telling people how to do things right, regardless of what's going wrong or right. Too many people lose sight of that. So I'll use an example quickly: W-2 fraud season, early in the year in the United States. There's a stereotypical scam, and it happens thousands of times a year. A junior HR person will get an email that says, I'm the CEO, I need you to send our W-2 information to a new accounting firm at this email address, and I need it to happen quickly because we're running late getting our W-2s out to employees.
And as good security awareness people, we think: I have to get the user to look at that message and ask, is that really the CEO, or is that a hacker? Totally the wrong way of looking at it. I want the user to say: this is a request for PII. What do I do with requests for PII? I'm not authorized to send this out, even if it says it's from the CEO. If somebody wants PII sent out, it should go to the head of human resources, who then has to confer with the general counsel to decide whether it goes out. The user shouldn't be asking, is this a hacker or not? They should be asking, what is the right thing to do with a request for PII? And that's a major difference, because in all honesty, if your entire awareness program — or security program as a whole — depends upon a potentially new, low-level employee successfully defending against a trained criminal or sociopath, that is a losing proposition. You need to figure out how to implement a program that accounts for the fact that I am going to have somebody who is grossly outmatched by a criminal. How do I up the odds and reduce the potential loss from that type of situation? That is the actual root of the problem. It's not, how do I empower my user?
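Ira's point that your job is to demonstrate, with good metrics, that behavior has improved can be made concrete with simulated-phishing click rates compared against the roughly 3% industry figure he cites from the Verizon DBIR. A minimal sketch — the campaign numbers are hypothetical, and the 3% benchmark is only his rough, rounded figure:

```python
def click_rate(clicked, delivered):
    """Fraction of delivered simulated phishing emails that were clicked."""
    return clicked / delivered if delivered else 0.0

# Hypothetical monthly phishing-simulation campaigns: (month, clicks, delivered).
campaigns = [
    ("Jan", 87, 1200),
    ("Apr", 54, 1180),
    ("Jul", 41, 1210),
    ("Oct", 38, 1190),
]

BENCHMARK = 0.03  # rough average click rate Ira cites from the Verizon DBIR

for month, clicked, delivered in campaigns:
    rate = click_rate(clicked, delivered)
    status = "above" if rate > BENCHMARK else "at/below"
    print(f"{month}: {rate:.1%} clicked ({status} the ~3% benchmark)")
```

A downward trend toward (or below) the benchmark is the kind of risk-reduction evidence Ira argues an awareness program should be reporting, rather than promising zero clicks.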
Jason Loomis [00:22:57] You know, you just brought up a great example of that. We talk a lot about the Rider, the Elephant, and the Path. What you just described was how much the Path can affect it — by changing the process of how that user thinks about it. Instead of "well, it's my CEO," it's "I follow this process; it all has to go through this; I'm not authorized." You've shaped a change in behavior without even focusing on — we love to talk about the Elephant and the Rider, but really you just changed the Path. And that has such a powerful effect on behavior. I love that analogy you brought up. Yeah, that's the way the person should be framing it.
Ira Winkler [00:23:31] And let me just point out that cybersecurity treats itself like a snowflake. Let me ask you a question.
Jason Loomis [00:23:39] Aren't we, though?
Ira Winkler [00:23:40] Yeah.
Jason Loomis [00:23:42] We're special.
Ira Winkler [00:23:43] Yeah, but look at it this way. What would happen if an employee said, look, I really, really don't like filling out time cards. I just don't want to fill them out. And it'd be like, oh, well, you know, let's just make a fun video to show him why he should fill out his...
Jason Loomis [00:24:01] ...time card. Like, getting paid.
Ira Winkler [00:24:03] Yeah. I mean, does anybody go back and say, well, Joe didn't fill out his time card, Joe was really busy last Friday, but we should still pay him? No. Joe has to fill out the time card. That is a must, not a should. Filling out your time card is, you know, the number one priority for all organizations. When I worked at NSA, just as an example — you would think, with them, in theory, monitoring all the world's nuclear weapons, as an example — what is the number one critical system at NSA?
Paul Love [00:24:41] Yeah. Yeah.
Ira Winkler [00:24:42] Payroll. That is the system. They will shut everything down if payroll doesn't go out, because by law, employees have to be paid or they cannot work.
Jason Loomis [00:24:54] That's every organization — payroll is going to be at the top of the BIA. And people forget about this: of the systems they need to have BC/DR for, payroll always comes in at number one.
Ira Winkler [00:25:02] Exactly. But why is it that nobody would question somebody not being paid if they don't take that one action, yet we say, oh, well, we have to encourage them not to ruin our network. We have to encourage them and provide them with funny videos on why you shouldn't bring in outside USB drives.
Jason Loomis [00:25:23] Because organizations are afraid of having negative views of security and negative views of policy. So they want everybody to hug and feel good, and they want carrots versus sticks. That's my opinion.
Ira Winkler [00:25:37] Well, that's with regard to cybersecurity. With accounting, for example, what happens if you violate accounting rules? I'm not just talking about time cards. I mean, what happens if you are sloppy filling out accounting documentation, your travel reports? You don't get your travel reimbursement. Or safety violations — for example, if somebody repeatedly takes unsafe actions that put themselves and others at harm, they will be fired. It's just a given, because, yes, it's obvious: hey, we'd prefer to fire you rather than have you kill yourself on the job. Those are regular issues that people have to account for. And in cybersecurity — again, in most companies, the IT systems are the lifeblood of the organization, and unsafe behavior with IT systems is as dangerous to the organization as unsafe behavior on, for example, a factory floor.
Paul Love [00:26:46] Well, you know, when I hear you say this — we have to make sure security is done right; it's a requirement, and so forth. Actually, let me rephrase that. I think you're saying you can do that, but you don't have to be so passive in your awareness program that it's all just the emotional stuff, the stories. But I think you would agree that getting people engaged, having something that's interesting, and telling the story behind it helps them retain the information.
Ira Winkler [00:27:22] Right. That goes into how you do a security awareness program properly. And there are — well, actually, three concepts we're addressing here. In the first place, as I talk about in my book You Can Stop Stupid: "stupid" is not something we're stopping from existing. "Stupid" is the fact that we know people will do things that might inevitably cause harm. That's a fact. And maybe they're not stupid — they might be malicious. In cybersecurity, that's one of the biggest reasons I hate "human firewall": what happens if your human firewall is just a malicious person, as an example? So in You Can Stop Stupid, I talk about adopting safety science into cybersecurity, where, yes, somebody looks at a user who does a stupid thing. I'll give an example — you're a podcast, not CNN or something, so I'm going to go ahead and tell it, and it pretty much solidifies my concept. I was at an event, on the buffet line, and there was a table next to the line giving out stickers that say "Don't click on shit." And I had an admin type in front of me, and the guy picks up a bunch of stickers and says, I really need a lot of these — I've got a whole bunch of users that keep clicking on shit. I'm like, wow, you must give your users a lot of shit to click on. And he's like, what do you mean? I'm like, how are they clicking on all that shit? Are they doing it on, like, their company computers? He's like, yeah. And I'm like, well, then you must be giving them all that shit — why would they be clicking on it if you're not giving it to them? And, by the way, if you know they're going to click on it, why aren't you doing anything about it? So when I look at You Can Stop Stupid, the concept is this.
You have to look at the whole process and figure out how not to give people shit to click on. Then, obviously, you do need awareness — don't let me downplay the importance of awareness. I wrote a book on it, not because I wanted to waste my time, but because awareness is a critical part of ensuring that, at the point where users are making decisions, we can lead them to making the right decisions. And again, that's about behaviors, not awareness — but we call it security awareness. That's why it's a critical part. But then, likewise, we also have to expect people to, for lack of a better term, fail, for one reason or another. It doesn't mean they're stupid; it could be a variety of things. I know I've clicked on phishing messages because my cursor was lagging on the screen from where it actually was. I clicked on one thing, which opened up an email message, which started causing adware to pop up. And I was sitting there thinking, okay, was that me? No — I knew what I shouldn't click on, but the computer was lagging. But still, that happens to good people. Good people will make a mistake, for a variety of reasons. So you expect that, and in expecting that, you know how to respond to it. That's really how you address the user "problem," in Dr. Evil air quotes, as a whole. But awareness is still a critical factor in causing people to make the right decisions at the point where they are in proximity to the threat. Because the problem in cybersecurity is we blame the user for clicking on the ransomware message, as opposed to the secure email gateways that failed to detect it, and as opposed to the anti-malware that failed to stop it from loading. That's because the user is in the proximity of where the ransomware happens to be embodied.
And yes, we have to look at proximity as where awareness can have an impact, but we can't ignore the other parts either. We need to improve, from a systems perspective, how to reduce the loss, and that's what Six Sigma is about. We should be looking to those other disciplines: project management, Six Sigma, the whole total quality management movement from the 1990s, and so on.
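One way to see Ira's point that the user is just one layer among several (gateway, user, anti-malware) is a back-of-the-envelope calculation: if the layers are treated as independent (an optimistic assumption in practice), their miss rates multiply. The rates below are invented purely for illustration, not measurements from any study.

```python
# Hypothetical, invented miss rates for each independent control layer.
layers = {
    "secure email gateway": 0.10,  # lets through 10% of phishing mail
    "user (awareness)":     0.30,  # user clicks 30% of what reaches them
    "anti-malware":         0.05,  # fails to stop 5% of payloads that run
}

# Assuming independence, an attack succeeds only if EVERY layer misses,
# so the overall probability is the product of the per-layer miss rates.
p_breach = 1.0
for name, miss_rate in layers.items():
    p_breach *= miss_rate

print(f"chance an attack slips past every layer: {p_breach:.4f}")  # → 0.0015
```

Even with a mediocre "user layer," the stack as a whole performs far better than any single control, which is why fixing only the user, or blaming only the user, misses the systems view.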
Paul Love [00:31:47] That's a great way to think about it, right? It's part of a process. It's not just about the user; understand everything that happened before the user. Make sure your controls work, your tools work. But also, an important thing you brought up that I think is very important: what do you do when the inevitable happens? Do people know how to behave? Do they know how to report it? You don't just ignore it. And one thing I always do as well is explain: listen, I've been in security for 30 years and I've clicked on these accidentally. It happens, right? You're not going to get in trouble for reporting, so make sure to report. I emphasize that it happens, like you just said, because I think some people come from organizations that are zero tolerance, where you make one mistake and you're done. So it's all part of a process, is what I hear you saying, and I think that's a great way to look at it.
Ira Winkler [00:32:46] Yep. Now go ahead, I'll let you ask the next question. Sorry, I could go on a monologue whenever I get going.
Jason Loomis [00:32:55] What do you do for, let's say, SMBs, small to medium-sized businesses? You know, maybe they have five or ten employees, or I've got a two-person team.
Paul Love [00:33:03] Mm hmm.
Jason Loomis [00:33:03] You know, a lot of what you mentioned felt enterprise-level to me; I can see the Six Sigma, the Lean, a lot of the stuff you've been talking about there. But what are some easy wins? Where do you think somebody with a very small org, maybe an org of one, unfortunately, could just start?
Ira Winkler [00:33:19] I mean, good cyber hygiene would just take so much out of it. I think just a week or two ago, the NSA and a bunch of other organizations put out a paper saying, here's the baseline, here's what you should do to protect yourselves. And it comes down to essentially good cyber hygiene. Somebody on Twitter joked, "The 1990s called and they want their countermeasures back." Fundamentally, it's still the same problem. And likewise, I used to give a presentation, and I should give it again, called What The Wizard of Oz Says About Information Security, where I reenact The Wizard of Oz with puppets. And so, you know.
Paul Love [00:33:57] I really want to see this, by the way, I have seen this one. It is fascinating.
Ira Winkler [00:34:01] Yeah. I go through The Wizard of Oz, and everybody thinks the moral of The Wizard of Oz is "there's no place like home." The real moral is: you have what you're looking for, you just don't know it, or don't know how to use it. So, for example, these days it's even better. Most small organizations are not maintaining their own Exchange server; they're using things like Google Workspace, Office 365, and other things. Multi-factor authentication is built into everything. As an example, just turning on multi-factor authentication would take care of a lot of problems. It's not infallible, but it's an exponential risk reduction. And I really hate when people say, "Oh, this is going to fail." Yes, all cybersecurity controls will fail. If any vendor out there is telling you their tool will stop all attacks, well, the only people who promise perfect security are fools or liars, or both. But you have to consider that the basics are going to stop a whole bunch of things. I really wish I could tell you, "Here's my secret thing," but it's just this: pick up the latest document that says here's what good cyber hygiene is, and start implementing it across your network.
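For readers curious what "just turning on multi-factor authentication" looks like under the hood, here is a minimal sketch of how a TOTP code, the six-digit code most authenticator apps generate, is derived per RFC 6238/4226. Both the server and the app hold the same shared secret and independently compute the same short-lived code; the secret below is the RFC test value, used only for illustration.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238, built on RFC 4226 HOTP)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() if at is None else at) // step
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()  # HOTP uses HMAC-SHA1
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 4226/6238 test secret ("12345678901234567890" in base32), for illustration.
rfc_secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(rfc_secret, at=59))  # fixed timestamp → "287082" (RFC test vector)
```

The point of the sketch is the risk reduction Ira describes: a stolen password alone is useless without the second, time-bound factor.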
Jason Loomis [00:35:23] I think you just gave up the secret sauce, because I'm hugely passionate about this. It is about the basics. It is the basic controls. You put these in place; the CIS controls, I think, are a great place to start. It's basic, simple stuff, and I completely agree that it stops most of the stuff.
Ira Winkler [00:35:40] I use this example in another presentation I'd give, which I haven't given in a while, called The Art of Information Security. I have a black belt in karate, and some people joke, "Oh, do you know the secrets of the ninja?" I go, here are the secrets of the ninja: there are no secrets of the ninja. Because if you study martial arts, there are really only so many ways to kick, so many ways to punch, and so many ways to block. So what makes a master? They have mastered the basics. Yes, there are little tricks you might do, especially in Kenpo, the style I took, more than others. Like when you go to block, you might block and hit a nerve; when you hit somebody's arm, you hit it where there's a nerve. Or when you pull back, you kind of take a swipe at their eyes on the way. Little tricks like that, but fundamentally it's still the basics. You can still only punch, kick, and block. Somebody who's a master has just mastered: here's how I block this type of punch from a tall person, a short person, a large person, a small person, whatever it happens to be. And in cybersecurity, yeah, there are a lot of basics to master, and it's the same thing. You know, I'm a master scuba diver trainer. What makes a divemaster? About half of the grading of potential divemasters is just: can they perform, I think it's 18 or 20 fundamental skills, to a point of mastery.
Jason Loomis [00:37:16] Yeah. We talked about this earlier: don't mistake mastery for perfection. Mastery is not perfection; it's a certain level of achievement of a basic skill. In that vocabulary, being a master diver isn't "I'm the perfect diver," it's "I've mastered the basics in these 18 skills," for example.
Ira Winkler [00:37:33] Exactly. And yes, you can do more on top of that, but the fundamental skills are still the basics. And in cybersecurity, once you get into it, there are different ways of putting triggers in, ways of putting in mitigating controls. So you don't have to immediately get rid of every instance of Log4j in your network. If you just shut off some outbound traffic, you can pretty much do an awful lot, and give yourself some breathing room on getting rid of Log4j across your network.
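Ira's "shut off some outbound traffic" point can be sketched conceptually: Log4Shell worked by making the vulnerable server dial out to an attacker-controlled LDAP/HTTP server to fetch code, so a default-deny egress policy blunts the exploit even before every Log4j instance is patched. The snippet below is a hypothetical illustration of such a policy check; the hostnames and the allowlist are invented for the example and are not any real firewall's API.

```python
# Conceptual default-deny egress policy: outbound connections are permitted
# only to explicitly approved (host, port) destinations. All entries below
# are invented for illustration.
EGRESS_ALLOWLIST = {
    ("updates.internal.example.com", 443),
    ("logs.internal.example.com", 443),
}

def egress_allowed(host: str, port: int) -> bool:
    """Return True only if the outbound destination is explicitly approved."""
    return (host, port) in EGRESS_ALLOWLIST

# A Log4Shell-style payload needs the server to reach back out to the
# attacker (commonly over LDAP, port 1389 in many proofs of concept).
# Under default-deny egress, that callback simply never connects.
assert egress_allowed("updates.internal.example.com", 443)       # normal traffic
assert not egress_allowed("attacker.example.net", 1389)          # callback blocked
```

This is a mitigating control in exactly Ira's sense: it doesn't remove the vulnerability, it buys breathing room while you remediate.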
Jason Loomis [00:38:07] Well, you'll be pulling that gum out of your hair for centuries. Yeah, yeah.
Ira Winkler [00:38:12] Yeah. I kind of describe the cybersecurity profession as very similar to the medical profession, unfortunately. Doctors sit there and say, much like I said, eat less and exercise more. But even when people do things right, somebody will always go out and break an arm. Somebody will live near radon gas and get lung cancer even if they quit smoking. So there's still going to be plenty of work for doctors even if people start doing things right. But still, the number one killers, or among the top, are preventable diseases. You've got to look at that as well.
Paul Love [00:38:53] Well, you know, I had like 15 more questions to ask you. Unfortunately, we're running out of time, so I may have to ask you to come back some time.
Ira Winkler [00:39:02] You could just ask; I'll come back. Or we could do a quick round two.
Paul Love [00:39:08] Oh, yeah, yeah. I'd definitely like getting back with you, because a lot of what you've talked about, like keeping things simple, these are some of the few things Jason and I agree on. We typically disagree on a lot of things, not in principle, but on the implementation in some cases. But I think everyone can agree that keeping it simple, telling stories, and having security be part of what people do innately are important things. So I did want to ask you one thing. We like to end our podcast with one question that has three parts; we like to make it a little complicated. If you could ask the cybersecurity people in your life to start doing something, stop doing something, and continue doing something, what would those three things be?
Ira Winkler [00:40:02] They should all start reading my books.
Jason Loomis [00:40:06] Ha! Excellent. Well played. Well played, sir.
Ira Winkler [00:40:10] All right, it's Security Awareness for Dummies and You Can Stop Stupid. But if I want them to start doing something: to say "start" universally is wrong, but let's just say I want people to start considering, at least a little, the human factor. To start thinking, my job in security awareness is not about making people aware. Security awareness is about risk reduction through the process of improving behaviors. Focus on that, as opposed to just saying, I'm building the human firewall, I'm making people aware, and so on.
Paul Love [00:40:56] No, that's great. By the way, I've heard "the human firewall" many times and never quite understood it. It's good to see somebody else who's been in the field for a long time who didn't quite understand that concept either.
Ira Winkler [00:41:06] Regular firewalls suck. Why would you want a human being one? You know.
Jason Loomis [00:41:09] Agreed. Very much agreed.
Paul Love [00:41:12] Yeah. But again, we really appreciate it. Sorry.
Ira Winkler [00:41:15] Did you want the other two, or...?
Jason Loomis [00:41:18] You totally skipped over that, Paul. Did you have your coffee?
Paul Love [00:41:20] Sorry, it's 6:30 here; I'm losing it. Sorry. Go ahead.
Ira Winkler [00:41:25] What was the next one? I'm sorry.
Jason Loomis [00:41:26] Stop. What's one thing you'd want cybersecurity to stop doing?
Ira Winkler [00:41:30] Using the term "human firewall." Just expanding on that slightly: in the first place, it implies perfection, which you're never going to have. In the second place, it implies humans are your last line of defense, which they are not. For example, for ransomware to run, a user clicking, or not clicking, isn't your final defense. Ideally, anti-malware is involved. System permissions that don't allow users to download and install ransomware are another example. And even if there is a ransomware infection, you can have network controls that limit the blast radius; some zero trust implementations, for example, would limit the blast radius of users being infected, and so on. So that's it. And then there was a third one.
Paul Love [00:42:28] I think it was continue doing something.
Ira Winkler [00:42:30] Continue doing something. At the end of the day, we're kind of doing things right. I see us building a more diverse workforce, which is great; continue doing that. I see people expanding their education, expanding their training. I see us adopting from other disciplines, like machine learning and things like that, that will make our job simpler. So I want to see us continue to do that type of stuff.
Paul Love [00:42:57] All right. Well, again, thank you for taking time out. The one thing I took away from this, one among many things, is that people are just part of the process. I love that visual that I have now, because people tend to think, okay, the people are the last line, and I've heard that many times over the years. But hearing that they're part of the process, that's a great statement.
Ira Winkler [00:43:27] Yeah. And before I leave, just to reiterate that and drill it in (you could edit this out later if you want): 3 to 5% of the population are sociopaths or psychopaths. Those are people who will do you harm if given the opportunity. A sociopath does it because it's best for them. A psychopath does it because it's best for them, and they like to watch people suffer. That's the high-level difference. But 3 to 5% of people in general fall into that, and if you're relying upon those people as your last line of defense, you have a problem.
Jason Loomis [00:44:06] That's about the same as the 3 to 4% in the Verizon data breach report. Hmm.
Paul Love [00:44:10] I wonder.
Jason Loomis [00:44:11] Explains a lot.
Ira Winkler [00:44:12] Of the general population.
Paul Love [00:44:15] Oh, that's a great point, right? Yeah. Just like any other control, you don't depend on one control; you have multiple layers and so forth.
Ira Winkler [00:44:24] Exactly.
Paul Love [00:44:25] It's a great point. Well, again, thank you so much. We appreciate your time and your expertise on this. And, you know, I didn't think we'd be talking about karate and process improvement and other things. But, you know.
Jason Loomis [00:44:41] Paul, this has been amazing. Ira, you're a published author, an Emmy Award-winning media mogul, and a black belt. I need to do something.
Paul Love [00:44:48] I haven't done enough in my life. I've clearly been lazy.
Jason Loomis [00:44:50] So, Paul, you're the closest. You've written books, so you have, like, half of it. Well, you don't have an Emmy or a black belt, so, yeah.
Ira Winkler [00:44:56] I think I have one of those forms of ADD where I like jumping around, but then I get hyper-focused on things and I just have to keep deep-diving into them.
Jason Loomis [00:45:05] We are definitely going to invite you on another one. There's so much more to dig into, some of the things we talked about that I just want to get into with you, Ira. I'd love to have you on the show again. Maybe you'll hear this and never talk to us again, but if you do, and you're open to it, I'd love to talk more, man.
Ira Winkler [00:45:20] Yeah, yeah. Let me know next time. I'm happy to do it.
Paul Love [00:45:23] Great. Well, again, thank you. That has been this week's podcast. We hope you join us again soon.
Jason Loomis [00:45:30] Thanks. Thanks, everybody.
Ira Winkler [00:45:33] Is there anything in?