Don: Good morning and good afternoon, and welcome to On Your Radar. I’m your host, Don India, and I have the distinct pleasure of speaking with industry experts tackling critical topics in the areas of data privacy, corporate compliance, cybersecurity, and artificial intelligence. And our guest today spans the gamut of all the topics we’ll be talking about.

Don: So let me set the stage for you and introduce our guest. Early in his career, he was a healthcare litigator. After a successful six-year run, he transitioned to practicing as a privacy attorney, representing hospitals, healthcare systems, and healthcare providers. He then spent the next 11 years at GE Healthcare out of Chicago, Illinois, one of my favorite cities and my hometown.

Don: Ultimately concluding his final six years there as the Chief Privacy Officer and Data Protection Counsel. Now, he’s the Senior Vice President and Deputy General Counsel of Trust and Privacy at Hewlett Packard, where he leads HP’s Trust and Privacy organization, defining and driving the global data privacy and governance legal and compliance strategy for the company.

Don: Nestor Rivera, welcome to On Your Radar.

Nestor: Thank you, Don. It’s great to be here. Thank you for having me.

Don: No, I appreciate it. I appreciate the conversation we’ll have today. But before we jump into the critical topics we’re going to be discussing, our audience loves to hear about our guests’ journeys. Can you take our audience through how you arrived at where you are today?

The Path to Privacy

Nestor: So it’s interesting. I went to law school thinking I wanted to be a labor and employment lawyer. And so, when I went to my second summer associate role, I was specifically targeting the labor and employment department for an offer right after law school. And, you know, much like at any law firm in the summer associate program, you have to go through the different rotations, and one of them happened to be the health law section of the firm. There was a focus that summer on HIPAA, because it had just been passed a couple of years prior, and companies were now starting to figure out, do we need to come into compliance with this thing, and how do we do so?

Nestor: And so some of my assignments that summer were in that space. When I went back to law school for my third year, I actually ended up doing a seminar in health law where my final paper was on HIPAA. And so when I went back to the law firm, you know, I started doing more of that work. And at the same time, as you mentioned, being a litigator, that was, you know, taking up most of my work because privacy just wasn’t a thing back then.

Nestor: This was in the year 2000. So we’re talking 24 years. But as things would have it, you know, there was a breach that came into the company, not healthcare, but I was the only person that had, you know, some sort of, you know, knowledge of, of what privacy was all about. And so that’s really how privacy started for me was dealing with that data breach.

Nestor: And then over time, other companies experienced them. By the time I went to my second firm, as you mentioned, it was primarily the focus of my practice, expanded it beyond healthcare, you know, to include financial services and other industries. And, you know, by that point, privacy was really catching on, right?

Nestor: Because, you know, the EU starting to contemplate a future after the EU directive and transitioning to GDPR. And so by the time I went to GE Healthcare, it was, it was all I did. And, over time, you know, we started in privacy, but then cybersecurity became a big thing. And then, you know, most recently in the last five years, AI became a big thing.

Nestor: So, you know, being the person that had experience with data, those other areas started getting added to my role. And, I would say though, that really my current role at HP is the culmination of all of those at a very high level, because, you know, I would tell you that 50 percent of my time right now is spent on AI. Now the interesting piece of that is that privacy has a huge intersect with that, so it’s not like I’m not doing privacy, but the AI component and the AI governance component of it is taking an awful lot of my time right now.

Don: I can absolutely imagine that. And many of our guests have taken a similar path into privacy, where they were the only ones with the privacy expertise to analyze a critical event. And as they progressed, much like you, other things evolved into their overall repertoire of what they have to accomplish and what they have to be responsible for.

Don: Well, that’s actually a great opportunity to segue, Nestor, into our topic of conversation, which is really around evolving regulations, and how technology has and will continue to play a role in reducing enterprise risk as those regulations evolve. So my question for you as we start, Nestor, is this: tell us about your experience with these evolving regulations and your history with technology. How has it bridged gaps and created efficiencies? And then we’ll get into a little bit of future-state conversation.

Nestor: So let me approach this from kind of a broad perspective, in two buckets. One is just the evolution of the laws over the years. And then I think the other piece is responding to incidents in the face of those laws that now exist that maybe didn’t 20 years ago, and they’re very related, I think. So, not surprisingly, when I first started this two decades ago, everything was super manual, right? You found out about these proposed laws just because they got printed in some magazine or some bar newspaper or something of that nature, right? And you ended up doing a lot of reading, and I even recall reaching out to a bunch of legislators when I was in Florida.

Nestor: Yeah. Wanting to know kind of what the discussions were behind the scenes, so I could get some sense of the intent, of how they meant these things to apply. And so, you know, everything was super manual. You always felt like you were missing something, because there was just way too much going on and you had to search for it.

Nestor: And, you know, similarly, when you had a breach, you know, particularly if it was a breach that entailed, you know, multiple states, right, where you had individuals at issue that were in all these states. You literally had to pull all these laws one by one, read them, you know, research them, right? For those of you that are used to just clicking a button now, you literally had to pull books from the library and follow case law and see if it was still valid and what have you.

Nestor: And so, you know, it was a very time-consuming process. Now, of course, in private practice, that was great for me because that meant a bunch of billable hours that I could, you

Don: of course, of course.

Nestor: But if you think about that in terms of the very short timeframes that we have currently to respond to incidents and provide notification to regulators and otherwise, I mean, that’s a crazy world to be thinking about living in, right?

Nestor: And so I think now with the technology that we have available to us, both in terms of monitoring the, you know, the proposed and existing laws that go into effect. Plus, if you have a breach, being able to, you know, use a service like RadarFirst, right, and input everything, click a button, and it very quickly gives you, you know, an analysis, right, of how something, a particular incident, you know, looks from a, you know, regulatory or legal standpoint, that’s, that’s amazing, right, compared to the way we used to have to do work before.

Nestor: And it’s funny because a lot of people think of that in terms of, well, you’re taking work away from, from lawyers. I said, well, no, I think the way that I look at it is, is that sure you take away some of the initial, you know, I don’t want to call it low-level work, but you know, very much just fundamental, you know, basic work.

Nestor: And then. allow the time and effort to be spent on kind of the more strategic and important work that we oftentimes don’t get to because we’re mired in these things that take up a lot of time and we just need to do.

Don: And what you’re talking about in that particular thread, Nestor, is actually scale, right? What you’re describing is technology providing the efficiency of scale in this particular example. How do you scale when incidents and events are occurring at a more rapid pace and your volume of FTEs is maintaining a steady state?

Don: So you have to leverage technology or someone has to create technology to allow for the efficiency of scale to actually happen.

Nestor: Absolutely.

Don: So continue down the path. We’ve been talking about privacy, about how it was manual, and about how the evolution of privacy technology has created an opportunity for efficiencies of scale and a whole other suite of benefits.


Evolving Regulations: Cyber and AI

Don: Let’s go bridge it into more current evolving regulations, cyber and artificial intelligence. Let’s go deeper into that because as you stated earlier, 50 percent of your time is in the AI world. It has compliance ramifications. It has potentially cyber ramifications. It definitely has privacy ramifications.

Don: So let’s go dig deep into that.

Nestor: Yeah, I mean, I think, well, a couple of things. It’s really interesting to have conversations about AI nowadays, because it’s such a hot topic that people who maybe have absolutely no prior exposure to data, and what you can do with data, and large language models, which have existed, you know, for decades, are coming at this as a new thing, right?

Nestor: And so there’s a lot of, I think, confusion and breathlessness associated with, oh my gosh, we need to come into compliance and all that. Right. But I think for those of us that have been in this space for a while, we recognize that the laws that are being proposed in the AI space are really the culmination of all of this work that’s happened over the years in the privacy, the cyber, you know, even like right to repair laws and, you know, that sort of thing.

Nestor: So I, when I look at them, it’s, it’s twofold. One, I don’t really am shocked by them because a lot of the aspects of them we’ve been dealing with for many years, but what I’m most concerned about though, are the unintended consequences of these proposed laws. And so, candidly, that’s where we spend a lot of our energy because a lot of the, you know, compliance aspects, if you will, of the cyber security and AI laws that are coming, you know, they’re coming into effect now or being proposed.

Nestor: We already have a lot of those things already from a control standpoint set up in our systems, right, to be able to comply with that. But I think the more difficult thing is, is the unintended consequences of these laws on the business operations. In other words, what you’re able to do with the data, who you’re able to do it with and for what purposes.

Nestor: In spite of having, you know, proper consents required by applicable laws. And so that’s really where we’re spending a lot of our time and energy is just thinking through from an advocacy standpoint. to the extent that we can be able to educate policymakers and regulators that, you know, there might be these unintended consequences that they’re not really, you know, thinking through because they’re looking at it, you know, from a purely academic standpoint, right?

Nestor: And so, you know, obviously there’s a selfish intent in that on some level because you’re a business and you want to maximize, you know, the use of the data, but by the same standpoint, I also think there are a lot of benefits that the individual data subjects can, receive from allowing companies to be able to use AI in certain places like healthcare, you know, super critical area where AI plays a big role in figuring out what’s the best treatment outcome or course of treatment for someone, right?

Nestor: So, the difficult thing about this, though, Don, honestly, is that, you know, there aren’t a lot of laws in the AI space. I mean, the EU AI Act passed, you know, earlier this year, and, you know, there are a bunch of laws that, that are in the works. My prediction is you’ll likely see AI laws catch fire globally, like GDPR, you know, triggered all these laws in other parts of the world.

Nestor: But the reality is, we’re just at the infancy, I think, of the AI-specific laws.

Don: Agreed.

Nestor: But on the, but like I said earlier, I’m not super concerned about it because I feel like a lot of the components of AI law have already been, you know, thought through in the privacy realm in particular, but also with, you know, cyber laws and other data affiliate, you know, associated laws that are in existence.

Don: I’m going to go deeper on one piece you said. If you think about where artificial intelligence came from, it’s been around for quite some time. Large enterprises have been leveraging artificial intelligence in some way, so their corporate governance processes exist, or should exist, in a very robust way.

Don: Whether they’re compliance, regulatory compliance obligations or not. But if you go backwards to just, what, two years ago on the emergence of Gen AI, the reason why it’s heightened is because everyone that has access to the Internet now has access to AI in a way that was never shaped before. So that’s why this rap, at least my view is.

Don: That’s why we’re seeing the heightened level of scrutiny. We’re seeing the heightened level of diligence, and then we’re seeing almost a scramble to say, What do we do and how do we not allow for unintended consequences to the benefit of business? Because I want to unpack that a little bit. You started talking about unintended consequences.

Don: But what you didn’t give us as examples of what those unintended consequences could be. I have a lot in the back of my mind of regulatory legislation and what it could do to constrict organizations. So talk to me a little bit about that because it’s not just an organizational level. It could actually be at an at a countrywide level too in terms of constrictions of the ability to innovate.

Don: Love to hear your thoughts on unintended consequences.

Nestor: Yeah, I mean, I think it’s just unpacking the statement I made regarding certain really positive uses or possibilities with AI. And the reality is that the lifeblood of AI is data. And so, going back to the privacy world, it’s critical to try to balance the rights that data subjects have with the potential to innovate and create real beneficial solutions for society, right?

Nestor: And so I think the unintended consequences I’m referring to primarily exist with limiting the data, right? And the uses of that data through AI. That would then lead to these innovations and positive effects on society for fear of all of the bad things or the risks, right, of things that could go wrong.

Nestor: So, I think there’s a bit of a knee-jerk reaction sometimes to focus on the negative. I think that’s just a human thing, right? We can do nine things right, but we’re going to focus on the one that we feel like

Don: It’s the one that gets the biggest headline though.

Nestor: Correct. And so when you think about AI, I think you hear a lot about the very valid risks that are associated with that, like hallucinations, like potential discrimination or bias in models and that sort of thing.

Nestor: Right. Or, you know, the apocalyptic view that we’re going to turn into a real live version of the West world, right? If

Don: Fun. Yeah. Good show, by the way.

Nestor: Right, it is, I, I’m so upset they canceled the last season before they, the whole story was

Don: Totally agree. Yeah. It kind of left us all hanging. Yeah.

Nestor: I keep hoping that they’ll bring it back at least for a movie.

Nestor: So, so we’ll see. But you know, we focus on those negative things without really thinking about all the positive stuff. So it’s really, I mean, I don’t have any specific examples other than to say that if you limit the type and the use of data on the front end that you feed into these models, then I think the output, the innovations, the benefits to society are, are very much, you know, held up if you will.

Don: Sure. No, they’re kind of muted if you don’t allow it to run. On the flip side, I can understand the argument that if you let it run, then how are you governing any sort of drift in this AI model that sways responses in a very one-sided direction? So it’s a lot of work, a lot of work, but fascinating to watch how this is going to evolve over time.

Don: Let’s go a little bit deeper. Let’s, let’s shift, shift the questions away from artificial intelligence and more into the future of technology, right? Just across the gamut of privacy, corporate compliance, cyber and artificial intelligence. Technology, obviously, artificial intelligence is a technology of itself.

Don: But how do you see technology playing in a role of this regulations as they evolve? Because you talked about radar privacy, and I appreciate you plugging that one in. Where are holes in technology today that you can see in Nestor’s experience that organizations are going to need to create and fill?

The Future of AI Compliance

Nestor: Yeah. I mean, the reality is, technology is playing a more and more important role every single day within companies. Just think about every single department that could use some form of technology to make more effective and quicker decisions and identify pain points or areas for improvement a lot quicker.

Nestor: So, I would say that on the whole, the technology that exists is doing a pretty good job of doing that at a high level. I think where there’s still opportunity for improvement or innovation is in specific areas. Right. So like, for example, when you see drill down beyond kind of like the high level of how is my department running from a financial standpoint to things like, you know, how can I make sure that I am tracking and complying with all obligations in a certain space?

Nestor: And I know we’re talking privacy, but the same exists in things like trade compliance or, you know, just general compliance. If you happen to be in a highly regulatory and heavily regulated industry, like, you know, financial services or health care, what have you. So I think the broad, if you will, approaches through technology are fairly well covered, but I think it’s those, you know, deeper, more subject matter expertise type areas that I feel like still need to be fleshed out a bit more. 

Nestor: It’s a hard proposition, right? Because particularly if you’re a company that’s focusing on that area, you’re taking a risk that there’s actually going to be an appetite and need for that that’s long-term and sustainable. And so that might be part of the reason why we don’t see that in a lot of spaces per se,

Nestor: but I think that’s one area that I think would, would behoove some companies to really look into and target doing more of,

Don: Now, I appreciate that. If you think about where technology can go, and I’ll use the term niche areas, because that’s really what you’re talking about with the subject matter expertise, there’s a lot. There’s NIST compliance in terms of security frameworks. There’s ESG reporting that every organization has to include in its 10-Qs and 10-Ks.

Don: How does that get flowed, distributed upwards into the audit committees and allow for consistent decision and documentation? There’s a lot that’s there that is very focused on the expanded regular regulations that are out in the world. And AI is going to be the next layer of this thing. So all this exists.

Don: And if you go backwards to SOX compliance, financial compliance, that’s been around for a long time. That’s pretty locked. They understand it. But now these new regulations are forthcoming that there are gaps in what technology can allow for and provide to. So there’s documentation, there’s consistency and decisioning.

Don: There’s all those facets that, that technology can open the door for. And there are, there’s technology companies that will find and create solutions that will solve that. I can guarantee it. Well Nestor, one last question that I ask all my guests on On Your Radar. Nestor, what is On Your Radar for the future?

Nestor: Yeah. So, you know, for me, it really is about getting our arms around all of these evolving laws and where things are going. I think if you get too bogged down in trying to track every single thing that’s out there, you will just never have enough people and time to do that. What I do with my team, and I think we do this more broadly within the company, is ask: what are the trends, right?

Nestor: Almost trying to be, you know, a bit clairvoyant, if you will, of What is this environment going to look like five or ten years from now and start making whatever moves we feel are necessary now in order to best position ourselves, right? Not necessarily knowing, if you will, the details of what that will look like.

Nestor: And so I think there are a lot of general things, you know, both from a, you know, part process policy control standpoint that, you know, we start to do when we feel like things are headed in a certain direction. So I think for me, it’s about synthesizing everything that’s going on in the environment and trying to draw inferences from where things are headed and start advocating to make those moves before they become an obligation that’s thrust upon us with a very tight timetable for compliance.

Nestor: And I liken this to, you know, several years ago when, you know, China passed their various laws and cyber security, data security and, and, and personal information protection. And a lot of companies struggled with, well, what do we do? You know, cause it’s not super clear how each of these different definitions, you know, are meant to apply or what have you.

Nestor: And you know, I had to make an argument that we should invest, you know, a certain amount of money in getting our systems ready, not knowing what the full extent was of how things would play out. Right. And so on some level you have to, you know, advocate for something that almost doesn’t exist on some level.

Nestor: And it’s difficult, but I think those, those conversations that you end up having with the leadership team, are super critical because this stuff is evolving at the speed of light. And I feel like if you just wait to see what happens and then react, you’re going to be so behind things that it would be very difficult to catch up.

Nestor: So, I guess I just need to put on, you know, my car neck hat, right. And, and, and try to figure out what’s in the envelope for, for five or 10 years from now.

Don: Now, you let me know who’s going to win the Super Bowl in 10 years, and then we’ll figure that one out, and maybe we’ll retire younger than...

Nestor: Yes, that’d be great.

Don: One of the things you said, it, it, it’s fascinating, is you’re, you’re preparing for the future. Now, your history allows for you to actually have good insight into what’s forthcoming.

Don: But if you think about what regulators want,

Don: no matter where they are in the world, they want consistency. They want a process. So you can demonstrate you have that consistent process in place, you’re going to be afforded a significant amount of grace from the regulators because you’re ahead of the curve.

Don: And that is significant for all organizations to understand. And I know everyone of our, of our listeners understands that, but sometimes we get caught in the weeds. So elevating yourself higher and saying in 10 years, this could happen. Let me put the plumbing and infrastructure in place to future-proof what I need to future-proof for any evolving regulations and new laws that are going to come forward.

Don: Nestor, thank you for being on, on your radar podcast today. I appreciate your time.

Nestor: Yeah, no, thank you for having me. It was great. And hopefully we can do this again in the future.

Don: I plan on having you back. Absolutely. And to our audience, thank you for listening to On Your Radar podcast. On Your Radar is made possible by RadarFirst. RadarFirst automates intelligent decisions for your privacy and compliance regulatory obligations. If you’d like to learn more, please head to RadarFirst.com. And if you enjoyed listening to today’s podcast, please follow On Your Radar podcast. Our next episode will be available next month. Thank you and enjoy the rest of your day.
