Episode Transcript
[00:00:03] Speaker A: Gather round, my little hackers and defenders. You must have heard of big scary terms like SOC: Save Our Careers.
[00:00:11] Speaker B: Not quite. It's actually SOC: Security Operations Center.
[00:00:14] Speaker A: I wasn't entirely wrong.
Speaking of careers, this lady who's completely confused about which cyberpath to take is Chahat Bagla.
[00:00:25] Speaker B: Hey, it's not that bad. I'm just curious.
[00:00:28] Speaker A: And so she is giving herself 12 episodes to explore 12 cyberpaths by asking professionals the right questions. Just curiosity leading the way. And if you're in your figuring-it-out era, come along for the ride. This season we're talking red teams, blue teams, AI, GRC, and all the juicy stuff. So plug in, scroll less and learn more. This is Destination Cyber, Season 2, powered by KBI.Media. Press play. Your cyber origin story starts now.
[00:01:15] Speaker C: Welcome back to Destination Cyber Season 2. To all my cyber explorers out there, we have got an incredible journey today. Joining me is Lesley Carhart, Technical Director of Industrial Incident Response at Dragos. If you imagine a journey that starts on a farm, winds through military service, and ends up in the war rooms of industrial cybersecurity, you're imagining Lesley's story.
[00:01:39] Speaker B: So, Lesley, can we start with your story? Take us back. What first pulled you into cybersecurity, and how did you discover incident response was your path?
[00:01:50] Speaker D: I grew up on a farm in the United States in the 1980s. We didn't have a lot of money, but at some point my father bought a very early personal computer to do inventory and finances for the farm. And at that point, I kind of had two choices in my life: I could learn how to farm or I could learn how to use the computer, and I chose the computer. I sunburned pretty easily. So I spent the next few years learning to program, starting when I was 7 or 8, and I got my first job as a programmer at 15.
[00:02:21] Speaker B: At 15. Wow.
[00:02:22] Speaker D: Different time. Yeah, that was the dot-com boom. You could do that back then. And of course, that bubble burst. I eventually had to join the military because I had focused so much on working and learning computers that I hadn't focused very much on school. But all through that, I was reading tons of periodicals and magazines about computer science in the 90s, and I found out about this growing space called digital forensics, or computer forensics at the time. There wasn't a lot out there on it, but I knew it was what I wanted to do: the investigative mindset and the detective work and the computer stuff and detecting computer hackers. I think I called every computer forensic investigator in the nearest five cities, and nobody would talk to me because I wasn't in the right demographic at the time, in the 1990s. They wouldn't even talk to me. There was a lot of sexism and every -ism and -phobia in the industry at that time. So it took me a long time to get into incident response. I really had to fight and work my way there over time. It took many years after university.
[00:03:33] Speaker B: Wow, times were really hard during that phase. So you mentioned you went into digital forensics. Was it during the time you were in the military that you went down that pathway?
[00:03:42] Speaker D: No, I was an aircraft mechanic. I got to work on airplane computers. So that was a totally different area of computer science. And that kind of pushed me into the space that I spent most of my career in, which is working with computers that don't look like computers. I work on industrial technology, so I work with a lot of things that look like aircraft computers today. Train computers, power plant computers, mine computers, things like that.
[00:04:07] Speaker B: So would you say that the experience you got from the military sort of shaped the way you approach security today?
[00:04:13] Speaker D: It does. It made me think about the real life consequences of everything that I do and everything that the computers around us do. It's very easy as a young computer science person to just be caught up in code and hackery stuff. And the world that I work in involves real lives and people being injured and our society functioning as a whole.
[00:04:33] Speaker B: And you mentioned that you were a programmer earlier, and that's how you later went into cybersecurity. How did that bridge the gap between coming from a programming background and getting into cybersecurity? Was it very handy, or do both of these things have a different approach?
[00:04:48] Speaker D: They have different approaches. It was a very, very different time. And I always hesitate when young people ask me for the story of how I got into it, because they want to emulate it, they want to do what I do. Something I always caution them is that my path is not a path that you see today. There was an era in the dot-com boom and the early aughts where you could just be a hobbyist and get into cybersecurity. And unfortunately, as times have changed, the academic requirements are much more stringent. There are much clearer university paths and technical training paths to getting into cybersecurity.
Computer science and software engineering are still essential parts of cybersecurity certainly, but they're very different training practices.
[00:05:32] Speaker B: You're perfectly right about that one, because now, when I or my fellow graduates finish uni, we go through graduate programs and the format is a lot different. We have to submit a resume, and then they give us an assessment, which is quantitative, based on normal math skills. After we clear that, there's a coding assignment we have to do, and that's also monitored. After we finish that, there's a possibility we'll get a video interview, and in the video interview it's normal behavioral questions; sometimes they also test you on the coding. And then after that we go to the assessment center, which is very rare to happen.
Back then, there was nothing like this. So how did you approach it? When you went for your first cybersecurity role, how did you get into that role?
[00:06:20] Speaker D: Very early 2000s. So back then, again, it was a different space. They were looking for people who had foundational knowledge, who could learn how to build the tools that we use today in cybersecurity. So they were looking for people with more of a systems administration or computer engineering background than a computer science background or a cybersecurity background, because there were no university degrees in cybersecurity back then in any country. My degrees are actually in electrical technologies and network engineering; that's what was available at the time. I also have a degree in avionics. So it was more about building the framework for what we have today. Today, the framework is very much constructed, and people need to be able to fit into it and hit the ground running, and that's what a lot of the academic programs are doing.
[00:07:09] Speaker B: You're definitely right about this one, and I've been dying to ask you this question. Do you remember one of the first major incidents you handled where you thought, wow, this is serious business? And what did you learn from it?
[00:07:22] Speaker D: Sure. All of the cases that I have are serious business. I think the first one where I realized what it was really like working with industrial technology was very early on in my career. I was working a case where there was a disruption to the Internet for a mine, a very remote mine in the Arctic Circle. Ultimately we sent a bunch of technicians out there to fix the problem, and none of them were qualified to do it. And eventually I just pled with my boss: send me, so I can fix it. I don't want to be screamed at anymore. So in the dead of winter, when it was pitch black, they flew me out to the Arctic Circle with my equipment and replacement hardware for this site, for this mine. It was extremely remote. First of all, it took two hops in tiny 1970s airplanes to get there, onto a sheet of ice. When I landed, I realized what I had gotten myself into, because it was me and maybe 12 other people on the ground at this airport, which was a shack. There were no lights, there was no mobile coverage, there were no utility services, et cetera. I was on a patch of grass and it was freezing cold. I ended up having to hitchhike across the ice sheet with a van of people going to the nearby clinic and the telecommunications center and things. And so I hitchhiked to this mine, and when I got there, they were so thrilled that I was there, because they hadn't had any way to talk to their families in weeks.
And I go down through the dust and the dirt, put my protective gear on, into the darkness of this mine in the dead of winter, and get to the equipment, and I start setting up. And I realize that their equipment is too old and I can't connect to it. And I'm alone in this room, in a mine that I've hitchhiked to. I have no mobile coverage, no way to call for help, and I can't fix it, because I don't have equipment suited for what I'm sitting in front of.
[00:09:26] Speaker B: I would have panicked badly in that situation.
[00:09:28] Speaker D: Young and stupid. Yeah. And I should have been better prepared; I'm very well prepared now when I go into places like that. But I was in my early 20s, and I was like, they're going to kill me if I don't get this working. The miners are going to kill me. Not literally, of course, not really, but that's how I felt at the time as a young person. So I kind of panicked, and then I calmed down, which is a big incident response skill there. And I looked around the room and I found the parts to build an old computer. And I sat there in the dark and I built a server so that I could connect it to my computer, and the computer to the piece of equipment I was trying to repair. So it was a very scary moment. Once I left, they never knew. I finally told somebody who worked there about it, like 20 years later, because it's funny now. But at the time I was like 22 or something, and I absolutely thought I was going to die. But, yeah, I fixed their Internet, and I took apart the computer, and they never knew, and everything worked, and then I went home across the ice sheet.
[00:10:29] Speaker B: So they just know you as a superhero. And I know that you make it sound very easy, like it was just like building Legos, but building a whole new server from computer parts, that's incredible. So did you know how to do that beforehand, or did you just sort of...
[00:10:43] Speaker D: Sure. I was that 1980s, 1990s nerd. And that's something I tell a lot of the young people I mentor today. One of the things that really concerns me, and I meet amazing students, incredibly qualified students who are super dedicated and really want to do cybersecurity, but one of the things that does concern me about curriculum today is that back in the 80s and 90s, we had to know how to do that. To have a computer, to work with a computer at all, you needed to know how to take it apart, put it together, build one from scratch, troubleshoot programming errors. That was just the necessity of having a computer at all, playing video games, anything you wanted to do. A lot of those foundations are being removed from cybersecurity curriculum today. So now, when I talk to young people about working on the very old devices that we have in the industrial space, a lot of them don't know how to do that. And it's very problematic from a hiring perspective.
[00:11:36] Speaker B: Yeah, so you mentioned that you learned that back then. So how was the process of learning it? Was it through books, or did you just sort of explore around? Did you have any mentor guiding you?
[00:11:47] Speaker D: Nobody would mentor me because nobody would mentor a girl. And some books, I think my math textbooks in primary school had some very early computer code in them and that helped.
So some magazines, some books from the library. A lot of it was trial and error, though. It was just taking things apart and seeing how to put them back together in different ways. That was what early hacking in the 90s was. There was no Google search, there was no ChatGPT. It was take things apart and see how you can make them do something interesting. And that's where most of us came from. And I hope that things like CTFs and challenges in cybersecurity today inspire young people to think the same way, because it's really important.
[00:12:30] Speaker B: I think you're very right. The times are so different, especially with ChatGPT. I don't think anyone wants to learn foundational skills anymore. It's just about doing the higher-level stuff, just finding the flag and then moving on. But yeah, you're right. They do give us motivation to try to learn what's happening behind the scenes. And pardon me, but I read your article on stories.inc where you mentioned students today are missing foundational skills, including working with legacy systems like COBOL. So could you explain what COBOL is for those who have never encountered it? And why is it still significant in today's cybersecurity landscape? And how can students start learning it?
[00:13:08] Speaker D: So when I talk about legacy systems, some of the computers that are still running society, making water come out of pipes, making the power stay on, making trains run, are 20 or 30 years old. Some of them are verging on 40 years old now, and they're still doing very important things. And the easy question for a person who's not familiar with the space is, why not just upgrade them? Well, there are a few different reasons why. First of all, it's very hard to shut down the trains to do a major system upgrade. That's something that has to be scheduled way in advance; it's incredibly expensive and incredibly disruptive, so there are limited windows to do it. It also has to be tested very far in advance by the people who produce the whole system. So if you're talking about a train or a crane or something, the people who manufacture that device have to vet an upgrade for years to make sure it's not going to make something crash or catch on fire. And finally, sometimes the companies that made those systems have gone out of business. There's no source code anymore, and nobody knows how to replace that system. If it fails, they go on eBay and they try to find another one. So the maintenance is done in a very cautious way. Those systems are doing really important things, and people have to keep them running, at least for the present and the near future.
And some of them are running things that a lot of computer science students, much less cybersecurity students, haven't been exposed to in their lifetimes, like Windows NT, and FORTRAN and COBOL, those old programming languages that came out of the early, early days of machine-to-human translation in computer science.
So they're still out there. They're doing important things in banking, they're doing important things in critical infrastructure. They're very, very hard to replace. They are very rarely replaced, and only in very scheduled windows. And they have to stay running, or there are very serious real-life consequences.
[00:15:06] Speaker B: So you talked about how these systems have been running for 30 to 40 years now. That's a very long time, and we don't have many engineers building those skills these days. So would you say that this is sort of a worldwide problem, or is it more so in Australia?
[00:15:20] Speaker D: Oh, so here's a funny bit of trivia for you. I speak, and there are probably fewer than 100 people, maybe 100 people on Earth, who do what I do. So we all know each other and we all talk; we all sit at the pub and we talk about what we see. And I've done this across a number of countries now, and I've talked to governments, I've talked to senior leadership and university curriculum leads all over the world. And everybody, at every single organization I talk to, somebody pulls me aside into a hallway and says, hey, can you tell us, are we the worst ones?
Are we the only ones who have this problem? I get this from students too; they think that they're the only ones having problems with their degrees. But everybody thinks that they're in the worst situation for legacy systems and industrial, and everybody's having the same problems right now. Everybody on Earth who's actually thinking about these problems is facing the same ones in the same situations. The legacy systems are not unique to Australia. They're not unique to the United States or Europe. They're all over the place, for those same reasons. They're very, very hard to replace, they're very expensive to replace, and replacing them poses a potential risk to life and safety and to basic services functioning. So it takes a long time, it's very hard to do, and it's very expensive for everyone. And everybody's trying to figure that out right now.
[00:16:39] Speaker B: I feel like there's now going to be another degree coming up in unis that talks about how to bridge the gap between legacy systems and modern computing. I wouldn't be surprised if there is one, because it seems like a whole different stream, and there's so much learning that needs to be done at an early stage.
[00:16:55] Speaker D: I encourage young people to get into the OT or the legacy space. Honestly, I do a ton of mentorship. Again, I run clinics, and I'm seeing how bad the market is right now. A lot of my peers, senior peers, are not seeing how bad the market is for young people right now, for recent graduates. It's very bad out there. There's a ton of saturation in entry-level roles. What you need to do in those cases is set yourself apart and give yourself some skills that are unique and not necessarily part of every university curriculum. And legacy and OT are a great way to do that. There are next to no degree programs in that globally right now. The problem is growing. The legislation around it is growing. It's a really smart space to build some skills, because all over the place there's old stuff, and the universities don't want to teach it because it's not cool. It's not something fun to advertise, it's not the newest, hottest technology, and young people are not always interested in doing it because it's not the new cool hot thing. But there are jobs in that. There are jobs in OT and legacy, and they are increasing. It's actually hard for us to hire people with the foundational skills to do them.
So there are no degrees really out there that comprehensively cover that. I'm starting to see some coursework at some computer science and computer engineering universities, but that's about it. So think about it.
[00:18:14] Speaker B: Yeah, and when you mentioned those sorts of foundational skills, what would that be? Would that be like learning COBOL, working with Windows NT? And if students actually want to do that, how could they do it? Are there any YouTube resources, or do we have to seek out a leader who is working in the OT cybersecurity space, reach out to them to get some mentorship, and then work in our own time?
[00:18:37] Speaker D: It's always good to get mentorship. No, there's not a ton of YouTube videos. There are, again, like a hundred of us who do this, and we're all trying to produce content, so there is some stuff out there. Go watch videos on YouTube from conferences about OT. There are a few of us who are trying to start, but it's nowhere near the scale of normal cybersecurity content. There are just not enough of us. We're trying. But if you want to learn this, the best thing that you can do is learn about industrial processes first. So start thinking about systems of systems. If you can get a little exposure to manufacturing or transportation or electric power in your degree program, or just as an aside, as a side job or something else you do in your life through a family member, that's really good. It's also good to learn about older technologies; they're still out there, and you can still buy old computers. And I'm not really talking about, oh, you must know COBOL or you must know Windows NT. Everything's out there; it's good to know any of it. The more problematic thing is basic computer science skills. That's what I'm seeing missing. And I spend a day a week mentoring young people. I want pipelines, I want to bring young people into this space. It's not me gatekeeping, I promise. But it's very hard for me to train people to use legacy systems or legacy tools when they don't know how file systems work, or they don't know how cloud or hard drives work, or they don't understand how Windows puts things in memory. And again, a lot of degree programs that focus more on the new cool hacker-y stuff don't have a lot of core curriculum like how packets work on a network or how BGP works. That's the type of foundations I'm talking about. You can go learn how to use Windows NT; you can buy a book, you can get a copy of it and put it in a virtual machine. That's not too hard to do. To be successful there, you have to really understand how computers work at a foundational level. And if your degree program isn't teaching you that, you have a really big problem.
[00:20:37] Speaker B: I think you're right about how we don't get to learn that sort of stuff today, because in my course, the languages they made us learn were just C, Python, Java, but we didn't go into detail. We learned the basic stuff, how to form loops and just normal if-then statements, all of that. It was just the basics of those languages. Would you say that would be helpful in OT somewhere, or...
[00:21:02] Speaker D: No, everything's helpful. The interesting thing about the OT space and the legacy space is knowing a little bit of a lot of things helps you. So we all have these really bizarre backgrounds. You know, spending time working on airplanes, spending time on oil platforms, working in tuna canning factories, you know, those types of things are all beneficial. So any knowledge is good in this space because everything's so weird.
But I'd focus less on code. Code is less important here. It's more how an operating system works, and really understanding how network protocols work. I spend a ton of time looking at packets, because there are no tools to do it. I'm looking at non-standard protocols from the 90s, from the 80s, things that used to be serial protocols that aren't documented well, and there might not be Wireshark dissectors for them. I have to understand how packets work.
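To make that point a little more concrete: when there is no Wireshark dissector, you end up decoding frames byte by byte from their raw bytes. Below is a minimal, hypothetical sketch in Python; the frame layout (a 0x7E start byte, a unit address, a function code, a big-endian 16-bit value, and a one-byte checksum) is invented purely for illustration and does not correspond to any real industrial protocol or to anything specific Lesley describes.

```python
import struct

def parse_frame(frame: bytes) -> dict:
    """Decode one made-up legacy frame:
    [0x7E start][unit addr][function code][uint16 value, big-endian][checksum]."""
    if len(frame) != 6 or frame[0] != 0x7E:
        raise ValueError("not a valid frame")
    _, unit, func, value, checksum = struct.unpack(">BBBHB", frame)
    # Checksum here is just the low byte of the sum of the preceding bytes.
    if checksum != sum(frame[:-1]) & 0xFF:
        raise ValueError("checksum mismatch")
    return {"unit": unit, "function": func, "value": value}

# Example frame: unit 0x01, function 0x03 (say, "read register"), value 500.
raw = bytes([0x7E, 0x01, 0x03, 0x01, 0xF4])
raw += bytes([sum(raw) & 0xFF])
print(parse_frame(raw))  # {'unit': 1, 'function': 3, 'value': 500}
```

The exercise, not the particular field layout, is the point: knowing how bytes, endianness, and checksums work lets you reverse an undocumented protocol from captures alone.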
Same with doing forensics on these computers. I have to understand how a hard drive from 1990 works, how it writes data, because I'm doing it all manually. So it's less about code, though that's helpful, it never hurts to know some programming, and more about how computers do what they do.
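In the same spirit, manual forensics on an old drive starts with reading on-disk structures yourself. The sketch below parses the four primary partition entries from an MBR boot sector, the partition scheme used by PC hard drives of that era; it is a generic illustration, and the fabricated test sector and any disk-image filename are hypothetical.

```python
import struct

def parse_mbr(sector: bytes) -> list[dict]:
    """Read the four primary partition entries from a 512-byte MBR boot sector."""
    if len(sector) != 512 or sector[510:512] != b"\x55\xAA":
        raise ValueError("not a valid MBR boot sector")
    partitions = []
    for i in range(4):
        entry = sector[446 + i * 16 : 446 + (i + 1) * 16]
        # Entry layout: boot flag, CHS start (skipped), type, CHS end (skipped),
        # starting LBA, sector count.
        boot_flag, ptype, lba_start, sector_count = struct.unpack("<B3xB3xII", entry)
        if ptype != 0:  # 0x00 means the slot is unused
            partitions.append({
                "bootable": boot_flag == 0x80,
                "type": hex(ptype),
                "first_lba": lba_start,
                "sectors": sector_count,
            })
    return partitions

# Tiny self-test with a fabricated sector: one bootable FAT16 partition (type 0x06).
sector = bytearray(512)
sector[446:462] = struct.pack("<B3xB3xII", 0x80, 0x06, 63, 204800)
sector[510:512] = b"\x55\xAA"
print(parse_mbr(bytes(sector)))
# Against a real acquisition you would read the first 512 bytes of a dd image instead.
```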
[00:22:14] Speaker B: So you mentioned that there's a lot of use of hard drives for storage there. Is there any use of cloud in OT environments or systems? Have we shifted storage for legacy systems to the cloud, or is it not possible to do that?
[00:22:27] Speaker D: Mostly it's not possible, because of potential latency and points of failure. Again, the consequence of cloud going down in a business is that you lose access to your documents or something. It's bad, but you just lose access to your documents. The consequence of losing access to data in OT is somebody dying, or the power going out for hundreds of thousands of people. You can't add those failure points. The systems have to be very simple and very well tested.
So no. There's virtualization; there's been some accepted risk for virtualization, not in the systems that are actually operating the process, but in the display screens, things like that, that the operators use. That's been an accepted risk, so we see some virtualization locally, on local servers. Cloud is very minimal. We're starting to see a little bit more of it, mostly for telemetry, for collecting data about system operations, because it's not mission-critical.
It's important. Yeah, we want to know how well the process is functioning and if things are breaking. But if that goes down, nobody dies.
So that's really the only place we're seeing cloud right now: telemetry, IIoT, data collection, things like that that tell us about the system after the fact and are not actually involved in things running safely, just for the sake of not having a network outage kill people. Yeah.
[00:23:43] Speaker B: So it is really a high risk environment.
[00:23:45] Speaker D: They're all high-risk environments. They're either going to impact the process, whatever they're producing and the equipment surrounding it, or life and safety. That's the bottom line in all these OT systems. And then there are tangential effects, like damage to and contamination of the environment.
So it's all very physical, real life things.
[00:24:03] Speaker B: So would you say making a mistake in these sorts of environments would cost a lot more than in other environments?
[00:24:09] Speaker D: Yeah. There's tons of redundancies in industrial environments because human beings make mistakes.
Most of the failures and outages you see in industrial environments, the accidents you see on the news, are caused by failed equipment or human error. That still happens, and that's most of the time when things fail. We are starting to see an increase in people doing this on purpose using cyber means, certainly, but those attackers have to evade the layers of redundancies and safety controls that are in place to prevent mistakes.
Now, that doesn't mean that you as a cybersecurity person can't do severe damage doing normal cybersecurity things like scanning an environment, using active scanning, doing pen tests, et cetera. Those things can potentially bring down or disable or damage industrial equipment. So we do have to do cybersecurity very differently in these environments and think very carefully about when we're going to do specific activities.
[00:25:06] Speaker B: Is there a chain of approval that works in these systems? Like, if you have to make major changes, are there levels you need to get permission from before you make any changes? And is it very strict with those constraints?
[00:25:19] Speaker D: That's a great question. It's usually not the cybersecurity person who makes those decisions, either. In an enterprise environment, in a normal IT environment, the cybersecurity team or the CISO might have the final say on an upgrade or a policy or an audit. In OT, the final decision makers are really the facility manager and the safety manager. They take advice from cybersecurity people, so you need to be able to give them good reports and good advice. But it's more of a decision that the people who are in charge of keeping people safe and alive are making.
[00:25:53] Speaker B: Yeah. Now, coming to your current role: as the Technical Director of Incident Response at Dragos, what does a day in the life actually look like for you and your team?
[00:26:04] Speaker D: Chaos. Something to tell young people, and I know you've got some students that watch your show: everything in cybersecurity will look interesting to you when you're very young. You're going to be like, I want to be a pen tester, and I want to be a forensic analyst, and I want to be a SOC analyst, and I want to be a malware analyst, because it all looks cool and it all looks like...
[00:26:23] Speaker B: That is so true.
[00:26:24] Speaker D: I know, I know. I hear it all the time. That's one of the things. People pull me aside in the room and they're like, am I the only one? You're not the only one. So the good thing about that is that there's a cybersecurity role for everybody because they have very different requirements and very different lifestyles. And what you should do is you should look at the downsides of the job. Because my job is complete and utter chaos. I have no work life balance. I'm on 24 hour notice to go anywhere in the world. I never know what I'm going to be doing the next day. And I spend a lot of my life living out of hotels and airports.
[00:26:54] Speaker B: That is a lot of spontaneity.
[00:26:58] Speaker D: Yeah, and if you're a spontaneous, adventurous person like me, who likes to travel and likes the drive and the chaos, I call it the firefighter personality, then incident response, especially consulting incident response, is a great choice for you. If you do not like those things, there are plenty of other interesting roles, even involving forensics, in cybersecurity. But that's my tip for the young people out there: think about the things you don't want to do. Because if you ask me about the cool parts of my job, I'll be like, I get to solve cases in every industrial sector and I get to save people's lives, and it's really cool. But if you ask me the things that I don't talk about in those talks, like, what is the downside of my job? It's the burnout. The median burnout age for my role is like 40 to 50. People burn out and have mental health problems and physical health problems, or have to quit the industry.
[00:27:49] Speaker B: Oh my gosh, that is very scary.
[00:27:51] Speaker D: Yeah, it's a high-stress job all the time. So there are days when I have no cases and I'm working on projects or things like this, like developing the next generation, doing training, teaching classes. And then there are days where I've got two cases simultaneously and I'm on a plane and I'm working from an airport floor and lives are on the line. So it just varies by what happens and who calls us that day.
[00:28:16] Speaker B: So you mentioned you spent a lot of time teaching and mentoring. So what are the most common misconceptions that students or early career folks have about incident response?
[00:28:27] Speaker D: Incident response in general? That's an interesting question. So first of all, understand that it really does have two halves. A lot of people think about it as one half or the other half, or the acronym DFIR, which is digital forensics and incident response. They're technically really two jobs, but to be an incident responder, especially in the consulting space or the corporate space, you do both of them. So you have to build two sets of skills. You have to be good at forensics: host forensics, including disk and memory, and network forensics, log forensics. But you also have to be good at crisis management and the human side of incident response, which is walking into spaces where people are crying or screaming at you. It's customer service. People take out their emotions on you, because you walk into the room as somebody they don't know. So be cognizant that you have to be good at both of those things to succeed at incident response. And not every program prepares you for both of those things. You might have to draw on your own life experience.
I oftentimes tell people who are immigrants, or people who are parents: hey, you've built a lot of skills that will help you be good at incident response, in things that have nothing to do with computer science or cybersecurity, because you need that broad range of skills to survive in these roles. So that's probably the biggest misconception I see out there: just the array of skills you really need to succeed in this space.
[00:29:56] Speaker B: So if you had to boil it down, what are the core skills and mindsets that make someone successful in industrial incident response? I know one is staying calm under pressure; you've got to be doing that. But what are the other skills?
[00:30:09] Speaker D: Not panicking. Don't panic is the number one skill. You have to be very adaptable and creative. You have to have good computer science foundations, like I mentioned; that's a vital part of it, understanding how computers and networks work. You have to understand how people practically, actually attack industrial environments, so things like the ICS Kill Chain and MITRE ATT&CK for ICS. You have to be able to stay calm under pressure, and you have to be able to let things go, which can be very, very hard for computer science and cybersecurity people from traditional backgrounds. I walk into environments that have 97 pieces of malware on each computer, and sometimes I have to walk away, because if I tried to clean them, it would cause more damage.
And all I can do is suggest that in five years, when they have an outage, they make a plan to clean it up. Call me then, because if I try to touch anything in operations, the computers are so old and so sensitive and they've been running with that malware for so long that me poking at them is more likely to cause somebody to get hurt. And there's also cases where I have to stop forensics. Like in an enterprise environment, I might have a month to do a full workup and catch the bad guy and find out what everybody did at every second, every nanosecond of the incident. And in these cases, sometimes I only have two hours before they have to get the systems up and running again or they're already working on restoring the systems.
So I can do very limited forensics. I have to pick and choose. I have to decide which computers matter and I might never get a full answer of what happened. I can only advise the best answer that I can.
[00:31:43] Speaker B: So it's also about making the right sort of judgment and prioritizing.
[00:31:47] Speaker D: Yeah.
[00:31:47] Speaker B: And I feel like a lot of these skills we do learn in day to day life, but it's just how to implement those in your work as well.
[00:31:56] Speaker D: Oh, absolutely.
Don't ever think that the skills that you learned in jobs before you got into computer science, before you got into cybersecurity aren't important.
They make you a better cybersecurity person. They make you a more well-rounded human being, and they are things that you should be conveying in your interviews and on your resumes, your CVs. That shows that you have a diverse set of skills and can handle situations that perhaps other candidates can't.
[00:32:24] Speaker B: That is a very good point. And so you work with a lot of life-safety environments, like you mentioned, and there's a lot of chaos going around and massive pressure. So how do you yourself stay calm and effective in those moments? What's your magic trick?
[00:32:40] Speaker D: Of it is knowing how to socially engineer people and that's something you learn over time. But when I talk about that, I mean managing people it in managing emotions, understanding and being empathetic. Yes. But also understanding how to portray what's going on and how to convey my own experience in a way that works for the audience I'm standing in front of. If somebody's panicking, there's various tactics for dealing with a panic. And different people are looking for different things. Sometimes they're looking for reassurance, sometimes they're looking for facts, sometimes they're looking for an authority figure. And I have to be able to read the situation and understand what that person's looking for. In the end, I'm always trying to come in with an air of confidence, but empathy. So I understand, but I know what we're doing and it's going to be okay because I'm here and I know what I'm doing. But how I convey that to different audiences from different demographics, different backgrounds, different teams, different roles is quite different. So being able to read the room really well, being able to read human emotions is a vital part of this. And then being able to change my demeanor and the way I convey ideas is really important as well.
[00:33:55] Speaker B: That is a very good skill. And how did you manage to develop those techniques? Was it through experience that you learned a lot of that, or did you start to build different scenarios in your head, like, oh, this is how I'm going to act in this situation?
Was it a lot of that?
[00:34:11] Speaker D: So I'll give you a couple of different tactics, and I won't speak just for myself. In particular, if you're neurodiverse, reading emotions and reading situations like that can be very hard. You might want to treat it like a game and educate yourself on it; learn to work with those situations by educating yourself on human reactions and how to calm people down in various situations. There's plenty of written psychological and sociological material on that out there that you can read; even work on team dynamics. Other than that, if you are a pretty empathetic person, it's more about controlling yourself, because you see people in horrible situations, and it can be incredibly, incredibly hard on you to see people hurt or injured. In those cases, you have to focus more on managing your own expectations and understanding what you are doing as part of the mission. And that includes stress management techniques, things like box breathing, things like meditation, whatever it takes to help you get through those situations without showing that you're stressed too. Again, the vital thing is that air of confidence and reassurance that you bring into a situation. So even if you're scared, and there have been cases where I've been scared, really dangerous situations where I've been scared too, you cannot show that. So you're going to have to learn tactics, if you're a more empathetic, feeling person, to control how you portray yourself to other people during those situations.
[00:35:39] Speaker B: That reminds me, if we apply for any customer service role these days, they ask us to do this assessment where they show various emotions and you have to pick which one is angry, which one is sad. And it's not just one question; there are like a hundred different questions, and they keep on repeating. I think it is basically to assess that skill of how you're going to manage a room full of people when you have an incident on your hands. The other thing is, what changes do you see coming in the next five to ten years in industrial cybersecurity? What should students start preparing for?
[00:36:09] Speaker D: So the problems are getting worse, but the mitigations and the defenses are getting better. People are starting to think about this problem, so they're hiring more people, and there are more programs and more legislation appearing to deal with these problems of industrial cybersecurity. But the other side of that same coin is that the adversaries have learned this as well. They have started to think about these industrial environments as very good targets for crime, for extortion, as well as the typical long-term sabotage and espionage we've been seeing for a long time from governments.
So they've become more of a target. They're still really vulnerable; people still really don't know how to deal with them. There is progress being made, and more people are getting into the space. But the attacks are definitely on the rise, especially attacks specifically targeting those environments. And the connectivity of these environments to the Internet and to each other is just increasing.
It's just continuing to increase. Like we talked about with cloud, we're starting to see those cloud connections into the environments, and the environments are tremendously vulnerable. Now there are like 8 to 10 remote access methods into them, and that vastly increases their vulnerability to these types of attacks that people are trying to conduct across a wide range of verticals. So there's a growing space there. It's not going to go away; there is no easy fix. The systems that are getting produced today, with basic cybersecurity controls built in, we'll be seeing in 30 years. So we're going to be seeing the Windows 11 systems that come out today for 30 more years, long past my retirement. But students today will definitely keep seeing them throughout their careers, so we're in for the long haul.
[00:37:47] Speaker B: Yeah. And as we know, students today use a lot of tools and AI. So if students have a laptop and AI at their disposal and they want to start building those foundational skills, do you think AI would actually be helpful? Is there a lot of public information out there that they can use to start building their foundational skills?
[00:38:06] Speaker D: AI is a tool that can be used for some of these tasks, if you understand how it works. On my side of things, in defense, I see it more as a risk than a benefit, because adversaries are starting to use AI to figure out how to do things like: hey, ChatGPT, if I am in this model of water treatment facility, how much chlorine would I need to add to kill the population?
ChatGPT will answer questions like that, and then you can say, hey, ChatGPT, write the logic for a model 5000 PLC to do that. Those used to be things where you'd have to have an engineer, or do significant research over time, to answer: how to write logic for obscure PLCs to do bad things. And now, you know, what LLMs are good at is taking a big data set, normalizing it, and finding the most common answer. So they crawl the whole Internet and they can find the most common answer for: how would we increase the chlorine in the water until we killed everybody? Or how would we write code to do that? And it will produce that answer in a lot of cases if you ask it the right question. So for us, it's more an increasing threat than something that's beneficial to use in security analysis. We certainly use machine learning to detect baseline changes in these industrial environments, so we're absolutely using AI. We use it to look for changes in protocols and traffic direction in remote access, things like that. And that's a very valid use that we've been using AI and machine learning for, for a long time.
[00:39:45] Speaker B: And as a learning tool?
[00:39:46] Speaker D: Be cautious. There's no vetting of the information that comes out of this kind of mean-average result, and we're talking about environments where people can very easily die. So if it hallucinates, if it gets something wrong, if it pulls from bad data, you could be accepting an answer that could lead to somebody's death.
So I would be very cautious, and go to subject matter experts, before you use AI for anything in the industrial space.
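For the defensive use Lesley does endorse, baselining what industrial traffic normally looks like and flagging deviations, here is a deliberately tiny sketch. It is not Dragos tooling or a real detection model, just a rolling-statistics illustration in Python over hypothetical per-minute connection counts between a workstation and a PLC.

```python
from collections import deque
from statistics import mean, stdev

class FlowBaseline:
    """Flag traffic counts that deviate sharply from a learned rolling baseline."""

    def __init__(self, window: int = 60, threshold: float = 4.0):
        self.history = deque(maxlen=window)  # recent per-minute counts
        self.threshold = threshold           # how many standard deviations is "anomalous"

    def observe(self, count: int) -> bool:
        """Return True if this minute's count looks anomalous, then learn from it."""
        anomalous = False
        if len(self.history) >= 10:  # need some history before judging
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(count - mu) > self.threshold * sigma:
                anomalous = True
        self.history.append(count)
        return anomalous

# Hypothetical per-minute connection counts from an engineering workstation to a PLC:
baseline = FlowBaseline()
for c in [4, 5, 6, 5, 4, 5, 6, 4, 5, 5, 6, 5]:
    baseline.observe(c)
print(baseline.observe(5))    # False: within the learned baseline
print(baseline.observe(300))  # True: a sudden burst, e.g. an unexpected scan
```

Real OT monitoring products model far more than counts (protocols, function codes, traffic direction, remote access sessions), but the principle is the same: learn normal, then alert on change.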
[00:40:14] Speaker B: That is a very fascinating point you make, especially about AI hallucinations. It's very important that we verify all the data that AI is giving us, because a lot of the time it's not right. Even at the bottom of ChatGPT, you can now notice the line that says ChatGPT can make mistakes, use it at your own risk. So I think that's very valid. If you are working in a high-stakes environment, or even if you just care about the information you're getting, it's important to check what links you are given and what sources any AI is pulling its information from.
[00:40:44] Speaker D: I couldn't agree more. I sound like a curmudgeon when I'm on these podcasts and I say I'm not really a fan of AI, but that's why: it's because people don't understand what you just said. You have to understand that it's a tool that does a very specific thing, and you need to understand how it works and how it's getting the data it provides you. It's not sentient, it's not conscious, it has no emotions, it doesn't care. It's just a data collection tool, which can be very useful. But the problem is people are misconstruing what it can and can't do, and that's leading to very dangerous situations.
[00:41:17] Speaker B: Yeah. And you've also mentioned mentoring a lot, and I think that's such an important point for students. Do you have a story about students who reached out to you, maybe through LinkedIn or another way, and what they said or asked that stood out to you? And since you yourself mentor a lot of students...
So how can a student find a cybersecurity mentor?
[00:41:38] Speaker D: Finding a mentor is tricky. I wish more senior people were doing it, because I'm booked out into October right now, and it's currently September. That shows the immense demand for mentorship versus the supply of mentors out there. Going to conferences is a really good tactic: going to your local meetups, your BSides conferences in every major city around the world, any cybersecurity events and organizations you can get to, having conversations and finding people organically that way. There are also some hashtags like Mentoring Monday on some social media platforms you can look into, but that can be a bit challenging. A lot of conferences also have Discords and Slacks, and those could be a tactic for finding mentors as well.
[00:42:23] Speaker B: Thank you. That's a very good piece of information. I'm going to look into some of those as well.
[00:42:27] Speaker D: Yeah. So, the things that are most commonly asked of me: like I already mentioned, everybody wants to know which niche in cybersecurity to get into, which is just amusing to me now, which is really cute and makes me happy, because I love that people still want to learn about everything. That's wonderful. Another thing that I get asked about is toxic work environments. A lot of the mentorship that I do is just validating, as an unbiased outsider, what people think is going on; being that confirmation of a bad situation, or something that they think is negative or illegal in their workplace. They just need somebody who's not invested, not biased, to tell them that they're correct. A lot of the young people I talk to are in ticket-mill-type security operations center environments where there's no progression, or they have toxic management, they aren't getting raises, things like that. And it's just being able to talk to a human being who can say, yeah, that's wrong, you should report XYZ to HR, or you should look for employment elsewhere. Sometimes they just need to hear that from somebody else, because it's scary. And I do a lot of that as well. So, yeah, if you're feeling like that, there are a lot of toxic workplaces out there in junior-level cybersecurity. Please find somebody to talk to. Please do go to your local meetups. Find a senior person who's not in your organization to talk to about this. It probably is as bad as you think it is, and you probably do need to make a change, but that can be really scary to do alone.
[00:43:57] Speaker B: I think that's a very good point you touched on. Not only will it build confidence in the person, it will also make them feel safe and heard. And once you have that sort of guidance, or you could say friendship, just having someone listen to you helps a lot in the progression of your career and in making you feel seen. Thank you, that's a very good point. The other thing is, Lesley, I've wanted to ask this question for a long time. I read that during the pandemic, you pulled together a virtual, DEF CON-style conference almost from scratch, which sounds incredible. For students listening who might not know the full story, could you share what happened? And from that experience, what are the skills or qualities you think students need to develop to handle unexpected challenges or make things happen in cybersecurity?
[00:44:44] Speaker D: I don't think it's a special set of skills. Yes, I'm an incident responder and I'm used to chaos, but I think I just have the mentality of: take a bad situation and try to make it a little better. The story behind that is, yeah, we had the second-longest lockdown in Chicago; I now live in Melbourne, the city with the longest COVID lockdown. But yeah, there were really long COVID lockdowns at the beginning. All the cybersecurity conferences around the world got canceled, and there were a lot of big ones that people were very excited about that were canceled all at once. And it was incredibly, incredibly demoralizing during a very scary, stressful time for everyone around the world. And so I did what I do, which is deal with the crisis and try to make things a little bit better. And I said, why don't I run a conference and run it online, and why don't I do that, like, next week, because everybody's stressed out right now?
So I found some friends who were also stressed out and needed an outlet for their stress, and, like, five of us put together an online conference in a week. We put out a CFP, we got 24 speakers, we got people to present talks, and the community pulled together. We did it on a shoestring with no budget. We ran it on YouTube and Twitch, and we had like 8,000 people tune in, because nobody had anything to do and nobody could go anywhere. We're about to run year six, so I guess there is a demand for a goofy virtual conference; we keep getting asked every year to run it again. So that's the story, and the moral of the story is: if you are dealing with impossible, stressful times, sometimes a solution for some people is to find an outlet that makes things just a little bit better for a few people. Just help one person. If you can only help one, you know, just help somebody.
[00:46:38] Speaker B: Wow, this is huge. And the advice that you gave at the end, that moral, really stands out. Not everyone would actually want to do this, because there's a lot of pressure. Any conference in the cybersecurity space generally takes ages to organize: getting the speakers, getting them to RSVP, getting students in, worrying about who's going to come and attend, all of that pressure. But having the ability to do it, having a can-do attitude, saying okay, I'm doing it for myself, I'm doing it for someone, I feel like it can be helpful to someone, and having that sort of confidence built in you, is what I think is really helpful, in cybersecurity or any other space.
[00:47:17] Speaker D: I think that's something that comes with age. I think that's something that comes with age and failing a lot over time and making lots of mistakes; it's just caring a little less what people think and giving things a go. It's very hard when you're young, and as you get older, if you don't get boxed into a miserable situation, you just start caring a little less about, you know, oh, what if it doesn't go well? What if people think it's funny? What if people don't like it? You know, who cares? Let's do it. Some people will have fun.
[00:47:47] Speaker B: I think coming from a professional like yourself, definitely. But for students who are just starting out, I think those sorts of questions that you mentioned are the top priority, the bigger risk: oh my God, what is that person going to say? Oh my God, how am I going to look in that situation?
I do hope it gets really, really easy as we grow older, because that would take off a lot of pressure, and we can then actually focus on and prioritize what needs to be done.
[00:48:13] Speaker D: As long as you keep being honest with yourself, you know, as long as you keep liking yourself and being honest with yourself, I promise it gets better.
[00:48:21] Speaker B: That is so reassuring to hear. So lastly, Lesley: Dragos is renowned for its expertise in industrial cybersecurity, and for students who are aspiring to enter this field, does Dragos offer opportunities for junior roles? And if so, I know that you mentioned a lot of foundational skills already, but is there any sort of experience they look for in candidates, what they have worked with, or how can students best prepare to meet those expectations?
[00:48:49] Speaker D: We do have junior roles in a variety of spaces. We have kind of three arms of our business: software development, because we make security products; threat intelligence, tracking what bad people are doing to industrial environments; and services, which is where I'm at, which is everything from incident response to pen testing. We do hire junior people in those spaces, and we also have internships. The things that we look for most typically are those really strong computer science and computer network foundations, and usually some knowledge of process environments. And that could just be a part-time job that they held when they were in school.
[00:49:27] Speaker D: Some knowledge of maybe a manufacturing or a shipping facility, just to understand that concept of there being real-life consequences. Or a hobby would do too; that's very helpful. So that's what we're looking for. Yes, we do hire junior people. We have job positions open right now in Australia, so I hope that people check us out and consider us.
[00:49:46] Speaker B: I'm so happy to hear that, because I know most of the students, and even my friends, would love to look for those opportunities. So thank you so much for sharing. And lastly, Lesley, I'd just like to end with a rapid-fire round.
So just whatever comes to your mind. One word, one sentence, go for it. You ready?
[00:50:02] Speaker D: All right.
[00:50:03] Speaker B: Favorite keyboard shortcut?
Speaker D: Control-Z.
Speaker B: One app or tool you can't live without? Besides email, of course.
[00:50:09] Speaker D: Oh, apps that I can't live without. Mastodon and Bluesky right now. But Instagram is a guilty pleasure.
[00:50:17] Speaker B: Yeah, 100%.
That's the first thing in the morning. If cyber security were a sport, which would it be? And why?
[00:50:24] Speaker D: It would be curling because there's a lot of drinking afterwards.
[00:50:26] Speaker B: That is very fascinating to hear. And that's all. Thank you so much, Lesley.
[00:50:30] Speaker D: Sure.
[00:50:33] Speaker B: Thank you for tuning into this episode of Destination Cyber Season 2.
[00:50:37] Speaker A: Knowledge is a gift but its true value is in how you use it.
[00:50:41] Speaker D: Whoa.
[00:50:42] Speaker B: Where did you come from?
[00:50:43] Speaker A: Just dropping by to remind everyone. Learning is great, but doing is even better.
[00:50:48] Speaker B: Timely advice.
[00:50:51] Speaker C: If today's episode left you with questions or sparked new ideas, feel free to connect with me on LinkedIn. And don't forget to follow the podcast so you're always ready for the next stop on our cyber journey. This is Chahat signing off, until we re-encrypt another conversation on Destination Cyber Season 2.