Q&A with Katie Moussouris, cybersecurity professional

Katie Moussouris is the founder and CEO of Luta Security, a cybersecurity firm focused on vulnerability disclosure.

So tell me a little bit about how you got started in cybersecurity?

Well, as a hobbyist — as a teenager — I was kind of growing up in this era where you dialed into a bulletin board system to talk to other folks online. There wasn't even AOL Instant Messenger back then. I happened to dial into the same bulletin board system as a bunch of hackers known as The L0pht. Some of these hackers, now that we're all grown up, have founded cybersecurity companies and worked for DARPA [Defense Advanced Research Projects Agency], etc. But these were the folks I grew up with as a teenager. So as a hobbyist, I kind of learned online with my friends.

And then professionally, after trying a few different things out, including being a molecular biologist, and being a systems administrator, and being a Linux developer, I finally realized I can write code, but I'm way better at breaking it. So, I decided to become a professional penetration tester, which is a person who is hired by, let's say, a bank to find the holes in the bank's systems and show them how to fix them. So, I did that professionally for about seven years, and then I joined Microsoft as a security strategist and helped them build better ways of interacting with hackers — better programs. Takes one to know one.

As a cybersecurity professional, what are some of the projects that you focus a lot of your time on?

My company [Luta Security] is unusual in that it focuses on building back-end processes to deal with vulnerabilities. So, it’s one thing to learn that you have a vulnerability, but then what do you do with it? How do you make the decision — especially if you’re getting report, after report, after report — how do you prioritize those? How do you build up enough engineering resources to handle that on an ongoing basis? How do you feed that knowledge back into writing more secure code the next time?

So all that back-end stuff, I like to think of it as that conveyor belt in the factory, and you need to have adequate preparation there. So, that’s what my company does. It lets you prepare to hear about vulnerabilities on a schedule that maybe isn’t your own.
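To make that conveyor-belt idea concrete, here is a minimal, purely illustrative sketch of the kind of prioritization step described above: an incoming queue of bug reports ordered by a CVSS-style severity score. The report fields, names and scores here are hypothetical, not Luta Security's actual process.

```python
# Illustrative only: triage incoming vulnerability reports worst-first.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: float                       # negative severity, so the worst bug pops first
    title: str = field(compare=False)
    component: str = field(compare=False)

def triage(reports):
    """Return reports ordered worst-first by severity (0.0-10.0 scale)."""
    queue = []
    for title, component, severity in reports:
        heapq.heappush(queue, Report(-severity, title, component))
    return [heapq.heappop(queue) for _ in range(len(queue))]

# Hypothetical incoming reports: (title, component, severity score)
incoming = [
    ("XSS in login form", "web", 6.1),
    ("Remote code execution in parser", "backend", 9.8),
    ("Verbose error message", "api", 3.1),
]
for r in triage(incoming):
    print(f"{-r.priority:>4.1f}  {r.component:8}  {r.title}")
```

Sorting worst-first like this is only the simplest possible policy; real programs also weigh exploitability, affected customers and available engineering capacity.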

And when you say “you,” you mean businesses, corporations, governments or things of that nature. Not the average person.

Well, the average person normally isn’t receiving a whole bunch of bug reports, but yeah. Absolutely. I work with governments like the U.K. government, and my company has a partnership to work on vulnerability coordination in different companies. Also, national CERTs — those are the Computer Emergency Response Teams of different countries.

And the reason why they ask me, and my company, to do this is that I was one of the co-authors and an editor of the international standard on how to do vulnerability coordination. And also, I created a lot of vulnerability disclosure programs at major software companies. So I created Symantec Vulnerability Research. I created Microsoft Vulnerability Research. And I wrote their vulnerability disclosure policies. Also, I helped the Department of Defense in creating their programs to work proactively with hackers, not only in terms of the cash reward program called a bug bounty, but also in an ongoing effort to just prepare to hear “if you see something, say something.” Right? What is the reporting mechanism, and what does the back-end process look like so that they can continue to resolve those issues?

What is a vulnerability disclosure?

Somebody tells you that you might have a bug, a security hole. And it’s the process of you hearing about it, making some decision about how to resolve it, and then communicating that resolution out to the public. So, that whole feedback loop is the process of vulnerability coordination, vulnerability disclosure.

And this can happen with a hacker trying to tell you something, a helpful hacker. It can also happen with a customer who happens to find a bug, and it turns out there’s a security or privacy issue. It can finally also happen in a supply chain situation. So let’s say you were a manufacturer of something like a car, or a phone. There are a number of different components in there, and you didn’t make them all. But, you might have to coordinate up and down your supply chain to tell your other suppliers, “Hey, there’s a bug in my code,” or you may hear about a bug in somebody else’s code, and you — the entire supply chain that produces the phone, or the car — might have to make adjustments. So, that’s also vulnerability coordination.

How likely is it that hacking would affect the average American’s life?

Well, what’s interesting about the average American’s life, and our digital dependence, is that we are building software and systems faster than our ability to secure them. Right? It is infiltrating everyday life. So, let’s say, for example, the average American decides that they’re gonna have a classic car for the rest of their lives — none of these self-driving cars or cars with any electronics in them whatsoever. Well, you’d think they’d be safe. However, in a world where there are more and more electronically enabled cars, and eventually self-driving cars, they’re going to share the road with these devices. So, even if they opt out of a particular new technology, the average American is going to find that society itself doesn’t really allow them to opt out.

What are some easy ways for the average American to protect themselves?

Accept the patches as soon as they’re available for every device you have. Turn on automatic updates. I cannot emphasize this enough. So much control is in our hands, and we give it up when we postpone updates. If you look at some of the big headlines — those big worms, the big botnet attacks and everything — those are attacks for which patches were available, and yet they weren’t applied. And so, we’ve seen this pattern of exploitation since the very beginning of the consumer internet. What I keep seeing is that the problem is patch deployment — not zero days, not things for which there is no patch while attacks are running around. Zero days account for less than 1 percent of actual breaches. And for the average consumer, the best way to protect themselves is to apply those updates.
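As one concrete illustration of turning on automatic updates (assuming a Debian or Ubuntu system; every other platform has its own toggle in its settings), the stock unattended-upgrades mechanism is enabled with a two-line apt configuration file:

```text
// /etc/apt/apt.conf.d/20auto-upgrades (Debian/Ubuntu example)
// Refresh package lists and install available upgrades automatically.
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

On most such systems, running `sudo dpkg-reconfigure -plow unattended-upgrades` writes this file for you.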

And what are some things that the general public should know that they maybe don’t know?

So, this whole time I’ve been talking about the process of vulnerability coordination and disclosure, and you’d think that big companies and big governments would have these processes in place. At least the big ones.

Well, we did a market survey of the Forbes Global 2000 — so, that’s basically the top 2,000 companies as ranked by Forbes. These are companies that make a lot of money, and they spend a lot of money on security, as they should. Like the Equifaxes of the world: they’re spending money on security, they have security staff, they are not naïve or under-resourced. However, in that population, 96 percent of them had no way for anyone to report a vulnerability to them. No front door.

If you see something, well, go ahead and try to say something, but you can’t find the front door. So that is something that I think that — most Americans would be shocked that these companies that have plenty of money and are making plenty of money and spending plenty of money on security have been avoiding this process. And I think the reason why they’re avoiding it is that they’re afraid that their back-end processes can’t handle it.
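One widely adopted way to publish exactly that kind of front door is a `security.txt` file, a convention later standardized as RFC 9116: a small text file served at a well-known URL on the company’s site so a researcher always knows where to report. The values below are placeholders:

```text
# https://example.com/.well-known/security.txt (fields per RFC 9116)
Contact: mailto:security@example.com
Expires: 2026-01-01T00:00:00.000Z
Policy: https://example.com/security-policy
Preferred-Languages: en
```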

What are some things the government should do to help regulate cybersecurity best practices?

Well, some things the government is already doing — which I really love — is regulators have started giving advice to their constituents about doing vulnerability coordination. Have some way to hear from somebody outside. So, the FDA [Food and Drug Administration] has post-market cybersecurity guidance for medical device manufacturers, meaning, have a way to hear from a helpful hacker or a customer or a doctor — anybody who might report a security vulnerability, and resolve that issue.

The other thing FDA has been doing is they have been myth busting in a really important area. So, usually when a medical device gets certified by the FDA, it has to go through [a] rigorous certification process. One of the things where some of the medical device manufacturers were saying to the security researchers was, “We’d love to fix that bug, but we’d have to go through the FDA recertification process if we did it because we’d make changes in our code.” That’s not true in most cases. What FDA has been doing — in addition to giving this guidance about saying, you have to have a way to handle these vulnerabilities — is saying, “No. In most cases you are allowed to fix those bugs, and you will not have to go through full FDA recertification for your product.”

I think those two things together are a really positive example of excellent, leading-edge regulatory posture in this area.

Currently, is there any legislation that’s in the House or the Senate that you’re following or supporting?

Well, I’m following three different bills in particular. Two of them have to do with the concept of bug bounties. So, I mentioned it earlier — it’s paying cash awards in exchange for vulnerability information. And it can be quite a helpful tool. I launched Microsoft’s first bug bounty programs. I was the adviser to help launch the Pentagon’s first bug bounty programs. So they can be quite successful.

There’s one that’s proposing the DHS [Department of Homeland Security] run a bug bounty. And that’s been under consideration for a while. And there’s another one that just dropped saying the [Department of] Treasury should run a bug bounty. Now, remember what I was saying about that conveyor belt process, on the back end? Most organizations, governments included, don’t actually know how to best prioritize the bugs they already know about. So jumping to a cash reward as the next step seems a little wild to me. And frankly, I think they’re looking at the great success of Hack the Pentagon and thinking this is a cookie-cutter, discrete thing they can just do without all that prep.

And the thing was, we had DOD [Department of Defense] prepped for quite some time to be able to receive this rush of vulnerability information. And that is a mechanism that is being overlooked in these two bills.

And then there is one other bill. And it’s basically the bill that talks about procurement of IoT [Internet of Things] devices. And there are some good provisions, sort of, at the end of that about giving protection to security researchers who want to uncover vulnerabilities in IoT devices. The thrust of the bill is good; it’s just that, the way it’s written, it’s kind of overscoped.

Part of the problem is the way it describes IoT devices. That could be anything: not just a phone, but all the way up to a complex system like a submarine. So think about what happens if the requirement is to know all the ingredients and components and not to buy anything with known software vulnerabilities. If you don’t limit the scope of what IoT means in that context, you’re gonna be polling and trying to find all of the different subcomponents of highly complex systems. You will either spend all of your time trying to identify that list of ingredients in the first place, or you’ll spend the rest of your time filing exemptions to basically just go ahead and procure that submarine after all.

So there are some serious issues with the definition and scope of that one that I’m worried about. I like the idea it has in the provision for security researchers and safe harbor. And there’s specifically an area in that bill where they’re suggesting a Digital Millennium Copyright Act (DMCA) exemption, or protection, in there. But they’ve kind of rewritten the concept. Right now, the DMCA has a security research exemption, which is a temporary provision. However, it’s being recommended to Congress to make that exemption permanent in the DMCA. Why not just cut and paste that language, which has already been approved in the DMCA, over into this bill?

So there’s a very straightforward way to carry that already-vetted language into this bill and provide some protections. There’s some additional stuff around the Computer Fraud and Abuse Act that’s — again, I like the idea of protecting researchers who want to disclose vulnerabilities and improve public safety. But I do worry about the scope of how it’s described in this bill.

Kind of piggy-backing onto that: What do you, personally, feel is the biggest problem in cybersecurity that policymakers should address?

Well, I think, honestly, we’re seeing the secondary effects of our global talent shortage. And it’s actually manifesting in a lot of these overbroad, overreaching regulations. There are very few Hill staffers who have a computer science background, let alone a cybersecurity background. Even fewer actual members of Congress have any of that technology background. So there’s a need for what I call technology day-walkers. Half vampire, half human. But essentially half deep-practitioner technologist, and half someone who’s willing to come to the Hill or go to international forums and not just tell them, “You’re wrong, and here’s why” (which is, of course, what we often do when we point out holes and flaws in things), but be open to hearing: What is the actual objective? And how can I help you get there as a technologist? And how can I help you get there and not mess everything up?

So, I think we have a global technology talent shortage, in general. Where I see it manifesting is in lawmaking and regulating, and that’s where I’ve personally had to get involved, because we have very few of what I call day-walkers. And I would like more of them. I would like to build the bench of day-walkers.

What are some common misconceptions about hackers or cybersecurity professionals that you run into?

The common misperception has been largely media-driven: hackers have become synonymous with criminals. And unfortunately, because we as hackers started as a sort of fringe of society while internet-enabled society was growing, we and our culture are largely misunderstood as being, at best, mischief-makers and, at worst, criminals. The fact of the matter is, we wouldn’t have the internet today without hackers. We wouldn’t have all of this amazing technology without people who are willing to see the technology of today and envision something different.

Joe Uchill contributed.

Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
