
At the Intersection of Data Protection and Security with NetApp – Infrastructure Matters Insider


In this episode of Infrastructure Matters – Insider Edition, Krista Macomber is joined by Mignona Cote, SVP, Chief Security Officer at NetApp. Their discussion revolves around the intersection of data protection and security – including the top customer requirements, and how NetApp is responding.

Their discussion covers:

  • How the consumerization of IT and the need for perimeter-less, zero trust security force organizations to protect data at its source
  • Given that it is a matter of when, not if, malicious actors will gain access, the value of vulnerability scanning, tracking data movement and transmission, and identifying anomalous user behavior
  • Capabilities to consider when looking to address cybersecurity requirements on a limited headcount base
  • Tactically speaking, how NetApp is helping customers to gain visibility across their data estates

You can watch the video of our conversation below, and be sure to visit our YouTube Channel and subscribe so you don’t miss an episode.

Listen to the audio here:

Or grab the audio on your streaming platform of choice here:


Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this webcast. The author does not hold any equity positions with any company mentioned in this webcast.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Transcript:

Krista Macomber: Hello everyone and welcome to this insider edition of Infrastructure Matters. I am thrilled today to be joined by Mignona Cote, who is SVP and Chief Security Officer with NetApp. Mignona, thank you so much for joining us today.

Mignona Cote: Krista, I am delighted to be here and every time I talk to you, I get excited. A little bit nervous on what you may ask, but very excited.

Krista Macomber: We promise not to throw you any hardballs today.

Mignona Cote: Well, I’m ready.

Krista Macomber: I’m sure you are. You have such a wealth of experience. We met earlier this year at a trade show, actually, and it was just so evident that you have so much expertise that we can all learn from.

Krista Macomber: So on that note, before we really kind of jump into the meat of the content today, do you want to give people maybe just a couple of the highlights from your background?

Mignona Cote: Sure. So my background, born in Central Louisiana and moved to Dallas, Texas and worked globally since then. I have worked across about six Fortune 50 companies, including AWS, Bank of America, AIG, and Aetna. Also, I am now at NetApp working across multiple cloud environments, which is super exciting, because we get to now look at how we reimagine security, and so I am thrilled to be here and delighted to talk to you.

Krista Macomber: Well, and that’s really I think a perfect segue, and I’m especially excited to have you on given your role as CSO at NetApp, and especially given everything NetApp is doing, as you’re mentioning, with these multi- and hybrid cloud environments.

So Mignona, where I thought we might start is this intersection of security, which of course really is kind of your focus area and data protection as well, which I think tends to be very inherent in an infrastructure grounded focus. Certainly something that NetApp plays quite a role in from an industry perspective. I know actually when we met, you called yourself and your team customer zero when it comes to cyber resiliency capabilities that NetApp is offering and that NetApp is developing.

So, I’d love to kind of take this from the angle of the customer and maybe kick it off with discussion around some of the key challenges that you’re seeing related to data security, data protection, again, from that customer perspective.

Mignona Cote: Yeah, so from a customer perspective, it’s overwhelming. Data is everywhere. So one of the things we do at NetApp is we protect data at the source where it’s stored, but that data’s often copied. It’s copied onto our laptops, it’s copied into our mobile devices. People are taking pictures of data with their iPhones, and so it’s everywhere. So, the question is how do we protect that data?

So we have many mechanisms by which we protect the data, but right now I’m leaning more towards how we detect when the data is being accessed, and then what is being done with that data. Leveraging behavioral analytics tools for that, using our own Cloud Data Sense to see how the data moves through an environment.

Actually, recently, since I’ve last talked to you, I’ve become very intrigued with browser security. How do we lock our devices to the source of the data? So that then I can look at using my own computer instead of my work computer, or my own iPhone instead of a work iPhone, and still actually protect that data.

So the exciting part is that we used to do a lot of blocking, and you keep blocking and people keep bypassing what’s blocked. The more we block, the more circumvention of controls happens. Now we’ve got technology to the point where we can actually start protecting at a different level.

Krista Macomber: That is exciting and so important. I mean, you’re kind of describing this Wild West almost, and it’s very true. It’s this concept of consumerization of IT, and I think we’re all guilty of doing work from our personal devices and things of that nature.

So, beginning to kind of tie that concept back to this theme of infrastructure matters that we have on this show, I did want to talk a little bit more around this idea of protecting data itself. So what you’re describing, it kind of loosely makes me think of these Zero Trust, these perimeter security approaches that are kind of a buzzword, we’re hearing quite a bit about them.

Really the premise is that security really needs to go beyond these more traditional network-based approaches. Even technologies like next generation firewalls, for example, are important. But really the controls need to shift to the data source as well, which it sounds to me is kind of what you’re describing here in your browser example.

So, I was wondering if you could maybe comment a little bit on the capabilities, access control for example, and other tools that you’re seeing be very important as we begin to maybe shift our mindset towards thinking about security at the heart of where data is being created and accessed?

Mignona Cote: Yeah, so if you look at the heart of where the data is being created, you’ve got the … I’m just kind of thinking of … I’m so fortunate to work in a company that protects data, and so you look at where it’s stored and you have all these control mechanisms, like who is accessing the data? How frequently is that data being accessed? Also, is the data changing? Because we rely on data to run our companies and to make critical management decisions.

So the world we’re in now, protection becomes more about access. Who has access to that data? For the past few years we’ve been hearing that identity is the new perimeter. So my identity, either I have an identity, or the devices, the devices that connect into a network and communicate with each other, or the IoT devices, they all have some form of identity.

We’ve got to know what identity can access that data, me as a person or an IoT device or another application. Then how do I know that I am who I am, or how does the system know I am who I am? So, it comes back to many criteria. There were 160 different elements the last time I looked, contextual features like how fast am I typing on a keyboard? What’s the device I’m coming from? What are the behaviors that I execute? What are my biometrics?

So, identity now becomes all these contextual features that say I am who I am, and then I am allowed to get to that data. Then at the source that data says yes, this is allowed to get to the data, and the behavior with that data now is what we’re measuring to see if it’s good behavior or bad behavior.
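The contextual identity check Mignona describes can be sketched roughly as a weighted comparison of observed signals against a stored profile. This is a toy illustration only; the signal names, weights, and threshold are hypothetical, not NetApp's implementation:

```python
# Toy contextual-identity risk score: each signal that deviates from the
# user's stored profile adds its weight, and access is allowed only while
# the accumulated risk stays under a threshold.
def risk_score(observed: dict, profile: dict, weights: dict) -> float:
    """Sum the weights of signals that deviate from the stored profile."""
    score = 0.0
    for signal, weight in weights.items():
        if observed.get(signal) != profile.get(signal):
            score += weight
    return score

def allow_access(observed: dict, profile: dict, weights: dict,
                 threshold: float = 0.5) -> bool:
    return risk_score(observed, profile, weights) < threshold

profile = {"device": "laptop-123", "geo": "US", "typing_wpm_band": "60-80"}
weights = {"device": 0.4, "geo": 0.3, "typing_wpm_band": 0.3}

# Known device and location, unusual typing speed: risk 0.3, still allowed.
print(allow_access({"device": "laptop-123", "geo": "US",
                    "typing_wpm_band": "20-40"}, profile, weights))  # True
# Unknown device from a new location: risk 0.7, blocked.
print(allow_access({"device": "kiosk-9", "geo": "RO",
                    "typing_wpm_band": "60-80"}, profile, weights))  # False
```

Real systems weigh far more signals (the 160-plus elements mentioned above) and score them probabilistically rather than by exact match, but the shape of the decision is the same.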

Krista Macomber: Yep. Yep, that makes a lot of sense. Wild to think about, even as you mentioned, the speed at which we type, right? There’s so many different factors and biometrics that can maybe go into access control, but at the same time, unfortunately we do see that these bad guys, these malicious actors, they’re innovative, they’re always trying to find new ways to penetrate the environment. So it’s almost like try as we might, we do have to assume that they will get in.

So I think the access control is certainly something that is an important table stakes feature, but we also have to think about, okay, what happens if they get in? Or what if we have a situation of a malicious internal actor as well? So, this leads me to think about some of our more traditional best practices more closely tied with data protection, things like the 3-2-1 rule for having that offsite backup, having sufficient recovery points for example, and validating backup integrity, not assuming that our backups are good, right?

So you’ve alluded to this a little bit, but thinking about the ability to track data movement, data transmission, and then I know you were mentioning the ability to identify anomalous user behavior. These are being built into data protection tools, and I know that when I was lucky enough to spend some time with NetApp earlier this summer, there was a lot of talk around things like vulnerability scanning for example, and like you mentioned, the analysis for anomalous behavior. So, maybe you could share a little bit more about that and what you’re seeing from that standpoint.

Mignona Cote: Yeah, so first of all, we have great controls in place, but to your point, bad things happen. So bad things happen, let’s think about it, because people are still setting up the environments and people will make mistakes. We’ve got upgrades happening to our technology and there will be vulnerabilities in those upgrades, we have legacy stuff that still has old vulnerabilities in it. So, you just kind of look at all of this and so you have back doors in the environment and the threat actors look for those back doors. So if they can’t compromise my identity, they’re going to look for the back doors.

So traditional controls, we’re scanning the environment. Especially if you look at the cloud, are there any holes into a cloud environment because it’s internet-facing? Also scanning internally, are there any of those key ways that a threat actor wants to try to get through and then traverse through the network?

So you’ve got lots of scanning going on, but the cool thing is you have all these controls, you’re scanning and you’re trying to get it all fixed, but augmented to that is still what’s happening on the network, the pulse of what’s happening in the environment. So, that’s where we get into our behavior analytics. So you’ve got the traditional controls, do we have any holes in the environment? Let’s fix those holes. What’s the pulse of the environment through the behavior?

But then I always call it the ultimate control. The ultimate control is the data where it’s stored. So the data where it’s stored, one of the things that we’re doing is we have snapshots that are continuously taking pictures of our data, so should something ever get to the data or we see some type of weird behavior, we can go back a certain time period, let’s say 10 minutes or whatever, and recover that data to that point in time.

It is that snapshot that helps us know that if any of these things fail, we’ve got this still, and so this is a form of backup and recovery. You’ll hear all companies always talk about backup and recovery, and so you do have to have that control in place.

Krista Macomber: Absolutely. I think I’d have maybe just a couple comments on that, Mignona, which is that certainly I’m seeing heightened interest in identification of vulnerabilities as well, especially again, as all these approaches to attacks continuously change.

You bring up an important comment regarding the ability to roll back to a previous snapshot. Can you maybe talk really quickly about … one of the big challenges there that we see is the ability to understand which is the last known good backup, right? So, which is the best recovery point to recover to before the attack occurred? So is NetApp doing anything around this, maybe using some of this anomalous behavior identification?

Mignona Cote: So, I think the critical point on the backups is looking through them. When you say known good, you have to compare, let’s say, snapshot number one, number two, and number three. You look to see what type of changes have happened, and are those changes what I call the known good changes, or is there something suspicious in those changes? So, it does require a bit of analytics to do that.
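The snapshot comparison Mignona outlines can be sketched in miniature. Here snapshots are represented as hypothetical manifests mapping file path to content hash, and a snapshot is flagged as suspicious when an unusually large share of files changed, a common ransomware signature; the manifest format and change-ratio threshold are illustrative assumptions, not a NetApp API:

```python
# Compare two snapshot manifests (path -> content hash) and flag the newer
# snapshot when too large a fraction of the data changed at once.
def snapshot_diff(old: dict, new: dict) -> dict:
    changed = [p for p in old if p in new and old[p] != new[p]]
    added = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    return {"changed": changed, "added": added, "deleted": deleted}

def looks_suspicious(old: dict, new: dict, max_change_ratio: float = 0.25) -> bool:
    diff = snapshot_diff(old, new)
    touched = len(diff["changed"]) + len(diff["deleted"])
    return bool(old) and touched / len(old) > max_change_ratio

snap1 = {"a.doc": "h1", "b.xls": "h2", "c.txt": "h3", "d.pdf": "h4"}
snap2 = {"a.doc": "h1", "b.xls": "h2", "c.txt": "h9", "d.pdf": "h4"}  # one edit
snap3 = {"a.doc": "x1", "b.xls": "x2", "c.txt": "x3", "d.pdf": "x4"}  # mass rewrite

print(looks_suspicious(snap1, snap2))  # False: a "known good" candidate
print(looks_suspicious(snap1, snap3))  # True: recover from a point before snap3
```

Walking backward through snapshots until this check stops firing is one simple way to locate the last known good recovery point.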

Krista Macomber: Mm-hmm. Yep, yep, certainly, certainly. All right, so I know another big issue for customers that we are seeing, and I think you alluded to this in your previous comments, Mignona, is this idea of limited IT staff, limited headcount. It really comes up pervasively. It’s among the top challenges that we’re seeing customers face, probably second only to cost and budget-related issues.

So when we think about this growing plethora of cybersecurity and data protection capabilities, from my standpoint what this means in tandem with these headcount pressures is that these security capabilities need to be able to be applied and overseen not only consistently across these environments, but also in a way that’s as streamlined as possible for IT operations, so that the demand on headcount is limited from their perspective.

I would say it also means that there is a premium that’s placed on automated accelerated recovery capabilities as well. Of course this feeds into limiting downtime for the business coming out of attacks, which is I think certainly a top of mind concern for the business. So, can you maybe add some commentary around that? From a customer perspective, do you see that as well? Maybe what are some of the key capabilities from your perspective tying in here?

Mignona Cote: So first of all, I’m so glad you brought up the issue about technology shortages. You read all these statistics about technology shortages, and I tell you what works on that is having a grapevine. You have to have contact, you have to have a network, you must know people.

In addition to that, we’re at a great point in technology where we can actually shift and rely more heavily on technology instead of some of the manual methods that we’ve dealt with in the past. So Krista, I keep thinking, every time I count, there always seems to be this magical number: we have 4,000 vendors. So okay, we’ve got 4,000 vendors, I’ve got one brain. How do I keep up with what’s going on?

The thing I like the most with the cloud is the ability to actually build out our controls now in what I call reusable code. You’ll hear security as code, policy as code, but in just standard JSON, you can set the configurations of how you want your AWS environment to work. Then also, you can reuse this design across the multiple hyperscalers. So what you do is you declare, this is how I want my settings to be, and then that reusable code starts driving consistency of protection across my environment.

So I no longer am having to log on, set some type of setting and then hope someone else doesn’t log on and change the setting. Instead, I put the code in and then I set the code so no one can change it. That’s what we call an immutable environment. So we set it one time, use it many times, then make sure it can’t change, and then the control now becomes testing to make sure that that code is actually in there. So it’s kind of exciting, because now we can rely on the actual code of the environment to start driving and protecting our security.
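The workflow Mignona describes, declaring desired settings once in JSON and then testing that the environment still matches, can be sketched as a simple drift check. The policy keys below (`s3_block_public_access`, `encryption_at_rest`, `mfa_required`) are hypothetical examples, not a specific hyperscaler schema:

```python
import json

# A hypothetical "policy as code" document: the settings we want, declared
# once in plain JSON and reusable across environments.
POLICY = json.loads("""
{
  "s3_block_public_access": true,
  "encryption_at_rest": "AES256",
  "mfa_required": true
}
""")

def find_drift(live_config: dict, policy: dict) -> dict:
    """Return every setting whose live value differs from the declared policy."""
    return {k: {"expected": v, "actual": live_config.get(k)}
            for k, v in policy.items() if live_config.get(k) != v}

# A compliant environment produces no findings...
print(find_drift({"s3_block_public_access": True,
                  "encryption_at_rest": "AES256",
                  "mfa_required": True}, POLICY))  # {}

# ...while a hand-changed environment is flagged for remediation.
drift = find_drift({"s3_block_public_access": False,
                    "encryption_at_rest": "AES256",
                    "mfa_required": True}, POLICY)
print(drift)
```

Running a check like this continuously is the "testing to make sure that code is actually in there" control: the policy document is set once, used many times, and any manual change shows up as drift.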

Krista Macomber: That makes sense. It’s almost these guardrails, that it can be overseen by IT, yep.

Mignona Cote: Yep. Yeah, the automated guardrails.

Krista Macomber: Yep, I love it.

Mignona Cote: I hear many words. Having come from AWS, I know the AWS language. Now, having spent a lot of time with Google and Azure, I’m learning their languages. So we still have the opportunity to standardize on our language, but the concepts are there and they’re active in our production environments.

Krista Macomber: Certainly, certainly. It’s a little bit of learning the language, as you mentioned, of different service providers and vendors and things like that, but that’s a topic for another conversation probably.

Mignona Cote: Okay.

Krista Macomber: Yeah, so I think this makes me think of maybe just the final key area that I don’t know that we’ve fully touched on yet, and that would be visibility. So, that’s something that we’ve seen on our side for many years. I was actually brought on to Evaluator Group prior to the acquisition by The Futurum Group to start looking into multi-cloud data management and visibility technologies, which data storage and data protection certainly fall into that category as well.

So when we think about the context of this conversation here, we’ve been talking about this myriad of devices, we’ve been talking about the idea as well for IT and security teams to have visibility across the entire data estate as well. Not only having that visibility, but the understanding and the context of where the most critical data is. For example, uncovering privacy risks.

We were talking about vulnerabilities earlier for example, and really the ability to direct where the efforts really need to go during the recovery process to meet the service level requirements for that downtime, that data loss, for example. So, maybe you could give some commentary regarding how we see this being implemented from a technical perspective.

Mignona Cote: One thing popped in my head when you were saying that, it’s kind of like making a good soup. You’re just throwing everything into the pot.

Krista Macomber: Yep.

Mignona Cote: So we have our data prevention tools, our data monitoring tools. We have all of our cloud services, we have our behavior analytics tools, we have our endpoint security tools. So we’ve got all these things, all these tools that are monitoring the environment. So one could easily get overwhelmed by logging in and looking at the dashboards, because we’ve got a lot of dashboards, on how all this stuff looks.

So, what we’ve done is we’re making our soup. We’re throwing all the ingredients into one big AI engine and having that AI engine actually normalize the data and let us know what we’re really seeing. Does what we’re seeing in this environment look similar to what we’re seeing in these other environments? Also, is it measuring as normal? Are we seeing things that are not normal?

So with that, we let the technology stir it for us, and then all we’ve got to do is take that last little bite when there’s something that we really need to pay attention to and just focus on that one component. So again, that brings up the coolness of where we are from technology, and as well it lets us start merging multiple types of analytic tools together in order to see what’s happening in the areas that we have to manage.

Krista Macomber: That’s really interesting, and I know in my role as an analyst in the industry, I do get questions. Okay, we’re hearing a lot of buzz around analytics, artificial intelligence, as you mentioned, in the role that it’s going to play when we think about cyber resiliency, ransomware recovery. I would certainly agree with your commentary that I think it can play an important role. Do you think it still needs a little bit of development to get there? I’d love if you maybe have any comments on that.

Mignona Cote: So, it’s going to keep needing development when you talk about generative AI. We’re going to keep seeing things, we’re going to keep going, but we are at a place where we’re using it as well. So, we’ve got it implemented here and every Saturday morning I am going through all the data to see, okay, well, what are we actually seeing?

So, we’re seeing a lot of attempts to download malicious code or just other things that seem very normal, because we see these attempts all the time. The thing is, if something ever got through, then that would be where the question is. So, we are stopping stuff and the good thing is we are stopping stuff. However, there’s going to be creative minds out there trying to find out ways to keep breaking in, and so we’ve got to stay in tune to what’s happening. That’s where we go and analyze what’s happening out in the hacker chatter community. So yeah, it’ll keep changing, but we are at a point where we can analyze.

Krista Macomber: Awesome. That makes sense. Certainly I think it’s going to be continual refinement, right? Especially looking at this space, attacks are going to be changing all the time, the business environment is going to be changing all the time on top of it, so that certainly makes sense.

So, we have covered a ton of great themes today. We’ve talked about data discovery, we’ve talked about being able to address these attacks, respond and recover more quickly. We’re approaching the end of our time unfortunately, but I was wondering: if you had a couple of key takeaways from a customer perspective, what would they be?

Mignona Cote: So from a customer perspective, I always go back to a couple of main controls you should make sure you’re doing. One of them: we’re all moving to the cloud, so make sure you’re doing cloud security posture monitoring, so that those holes that a threat actor can get in through are not there.

Then the cool thing where we are within the cloud is that we get to have virtual private cloud, so now we can automatically deploy virtual private clouds to keep our network segmented. Actually, that enables rapid recovery within the operating environments that we’re all moving towards. So, those are the key things that I would want people to know is that yes, you can be secured in the cloud, and there’s just a couple of really basic controls that can protect that front door.

Krista Macomber: That makes sense. That makes a lot of sense. Well, on that note, Mignona, I want to thank you so much for joining today. This was definitely very insightful, very enjoyable as always, and we want to thank everyone for joining our conversation today. As always, please like, subscribe, do all of those great YouTube things.

This is again, the Infrastructure Matters Podcast through The Futurum Group. We do have weekly conversations that will be coming to you on that regular cadence, as well as these incredibly insightful insider editions where we have the likes of these very intelligent folks such as Mignona on to share their insights on anything and everything infrastructure. So with that, thank you all so much and we will see you on the next one.

Mignona Cote: Thank you.

Author Information

With a focus on data security, protection, and management, Krista has a particular focus on how these strategies play out in multi-cloud environments. She brings approximately a decade of experience providing research and advisory services and creating thought leadership content, with a focus on IT infrastructure and data management and protection. Her vantage point spans technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her work has appeared in major publications including eWeek, TechTarget and The Register.

Prior to joining The Futurum Group, Krista led the data center practice for Evaluator Group and the data center practice of analyst firm Technology Business Research. She also created articles, product analyses, and blogs on all things storage and data protection and management for analyst firm Storage Switzerland and led market intelligence initiatives for media company TechTarget.

Krista holds a Bachelor of Arts in English Journalism with a minor in Business Administration from the University of New Hampshire.

