Description:
In this episode, Bill Tolson and Chris Cronin, Partner, Governance and Engineering Practice at HALOCK Security Labs, try to define "reasonable data security" - a term that continually appears in every state's privacy law or proposed legislation. But what is "reasonable data security"? Today, there is no prescriptive definition for "reasonable data security". Chris Cronin will share what he tells his clients and the best policies and procedures for staying compliant.
Speakers
Chris Cronin
ISO 27001 Auditor
Chris Cronin is an ISO 27001 Auditor and has over 15 years of experience helping organizations with policy design, security controls, audit, risk assessment, and information security management systems within a cohesive risk management process. Chris is Chair of The DoCRA Council and the principal author of the CIS Risk Assessment Method (CIS RAM). Chris is also a member of The Sedona Conference, Data Security and Privacy Liability – Working Group 11 (WG11).
He is a frequent speaker and presenter at information security conferences and events. Chris earned his Master of Arts from Case Western Reserve University.
Bill Tolson
VP of Global Compliance & eDiscovery
Archive360
Bill is the Vice President of Global Compliance for Archive360. Bill brings more than 29 years of experience with multinational corporations and technology start-ups, including 19-plus years in the archiving, information governance, and eDiscovery markets. Bill is a frequent speaker at legal and information governance industry events and has authored numerous eBooks, articles and blogs.
Transcript:
Bill Tolson:
Greetings. My name is Bill Tolson, and I'm the vice president of compliance and e-discovery at Archive360. Joining me today is Chris Cronin, partner with the governance and engineering practice at HALOCK Security Labs. Welcome Chris, and thanks again for joining me today to discuss this really interesting topic on what is reasonable security as it relates to privacy laws. And one of the things I forgot to do was mention what the topic is today: really, Chris and I are going to be talking about what is reasonable security, or reasonable data security, as it, in my mind, really focuses on privacy laws and things like that. I think Chris has a somewhat wider view of it as well, but I'll give you some more data around that. So Chris, welcome, and do you want to give us a little rundown as to what HALOCK Security Labs does?
Chris Cronin:
Yeah. Yeah. Thanks for having me, Bill. It's good talking to you again. HALOCK Security Labs is a company that's about 27 years old now. We're a consulting firm, and we help our clients understand where they are in terms of their cybersecurity preparedness, and then we help them get better at it. So we've got pen testers and incident response teams, engineers. The group that I run focuses most strongly on governance, risk, and compliance, helping people get to a defensibly reasonable position with their cybersecurity, and then to demonstrate through measurement that they're improving over time. That's basically us.
Bill Tolson:
Wow. Really, obviously, timely with all of the headlines we see on a daily basis, so great to have you here. In fact, to set the stage, Chris and I first met at the MER Conference in Indianapolis this year. I was giving a presentation on data security basics for information managers, and, I didn't know this, but Chris was scheduled in the same room to give a presentation right after me titled Defining Reasonable Security Measures, which I thought was funny. Once I got done with my presentation, Chris came up and said, "I was hoping you were going to bring up reasonable security."
Chris Cronin:
Yeah. It was one of those delightful moments. For those who do public speaking, you want to know that the audience is going to understand what you're saying, and they're going to receive it well, and there'll be something useful that comes out of the talk. And you raised a complaint that has driven a lot of my career to answer. You said, "Geez, when you look at these privacy and security regulations, what does every state tell you to do? They tell you to get to reasonable security, but they never tell you what that is." And I thought, what a great setup. It's like Bill knew. Bill took care of me getting set up correctly, and ladies and gentlemen, it was perfect.
Bill Tolson:
It was wild, and a lot of fun. Following up on what Chris was just saying, at one point during my presentation, I described all of the new and emerging state privacy bills that were becoming law. Now, I'm a technology guy first and foremost, and I wondered out loud during my presentation why all of the privacy bills and laws used exactly the same terminology about data security, namely that data collectors must use reasonable security practices. Now, ever since the GDPR, but beginning in the states with California's CCPA, my complaint with this terminology is that, in my mind, it's not prescriptive enough to ensure a minimum data security baseline, and again, little did I know Chris was going to be addressing that as well.
Bill Tolson:
To also set the stage further, Chris is one of the authors of a paper from the Sedona Conference, the well-known legal think tank type of organization, titled The Sedona Conference Commentary on a Reasonable Security Test. And I've read through it, and much of it, Chris, I can't understand, but hopefully we can talk a little bit about it. But with that setup, let's start with that Sedona Conference paper that you and many others put together to address reasonable security. And as I mentioned at the top of the podcast, you were one of the authors. Can you explain, again, obviously in general terms, what the reasonable security test entails?
Chris Cronin:
Yeah, I will. It's basically a balancing test. It says anyone who has personal information about, say, consumers is going to need to take safeguards to protect that information. And if they break that covenant, if there's a breach and personal information gets out, and there's a lawsuit, an adjudicator, it might be a judge, it might be a regulator, is going to need to make a decision. Did this organization apply reasonable security controls? And they've got a decision to make. And what the paper does is it gives that adjudicator the analytical model to determine whether security was reasonable or not.
Chris Cronin:
And it basically says if you were to put a safeguard in place that would've prevented that breach, and the safeguard would've been more burdensome than the benefit of reduced risk to the public, then it would not have been reasonable. We can't expect you to go to greater lengths to protect someone than the harm they would have suffered. The easiest way to put it is the cure cannot be worse than the disease. That's really how it's done. But what's really critical about this is it's a balancing test between the efforts, the costs at an organization, and the harm that can be caused to somebody else. So that's a quick background of what that is. I can go into more depth about how it's to be used and all that, but that's a quick summary to answer your question.
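Stated as a rough formula (this is a paraphrase of the balancing idea Chris describes here, not notation taken from the Sedona paper itself), the test looks something like this:

```latex
% B(S): burden/cost of implementing safeguard S
% P, L: probability and magnitude of the harm to others without S
% P_S, L_S: probability and magnitude of the harm with S in place
% S would have been a reasonable expectation only if:
\[
  B(S) \;\le\; P \cdot L \;-\; P_S \cdot L_S
\]
% i.e., the burden of the safeguard must not exceed the reduction in
% expected harm to others: the cure cannot be worse than the disease.
```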
Bill Tolson:
Well, that was also one of my follow up questions. But just interestingly, when I first cracked the paper and started reading through it, I noticed that the group boiled down a reasonable security test to what amounts to an algorithm, and that really surprised me. It's like wow, how did they get to that? Not that it was wrong, I thought it was a really interesting way to lay it out. I thought it was pretty wild. What kind of feedback have you gotten on that paper and what your group came up with?
Chris Cronin:
Well, the feedback has been pretty varied. I'm going to set a little backdrop for this, because there's something really, really important about that paper, and it's within one of the first pages. When you look at who the co-authors are, and you understand who they are in cybersecurity law, you realize what a miracle this paper is, because you've got regulators, defense attorneys, plaintiff's attorneys, and a couple of cybersecurity folks who came to agreement on what reasonable means. Now, think about what this means. I was just talking to someone a couple of days ago about this. They could not believe that there has not been a clear definition for reasonableness up until now. And they said, "But it seems simple to me. Why wouldn't we have come to a conclusion that there's just this balancing test? You can't expect me to hurt myself more than you would've been harmed when I put a protection in place to protect you."
Chris Cronin:
And that seems simple enough. Why didn't we come up with that solution earlier? And I said, "Look, in law, the people who were involved in this are adversarial. The whole model of law is an adversarial situation. Imagine how difficult it is to tell natural adversaries that they're going to find a way to agree on something that gives them the thing that they argue about and earn their livings from." They're in a much better situation to have an adversarial relationship, so they can earn their money as they argue with each other. But this group of people at the Sedona Conference, and this is the nature of the Sedona Conference, it's a think tank, they say there is a big problem in law that people haven't cracked, and it's not good enough for us to just fight about it and earn our fees. We have to solve this problem so that we can function, and to use the Sedona Conference's phrase, to move the law forward in a fair and just way.
Chris Cronin:
And the spirit of the people who came together, the regulators, defense attorneys, plaintiff's attorneys who very often see each other in court, and two of whom actually broke the word reasonable at the federal level with the LabMD case, came together and said, "We've got to actually fix this problem. We've got to define reasonable so that an adjudicator knows what this is, because we're spending too many resources to get it done." So the first, most miraculous thing about that paper is the people who authored it. Natural adversaries, who have fixed a problem that's been very bad for law. This is one of the most important things about this. The other thing is that we had to work really hard to create a rule that the regulators, the defense attorneys, the plaintiff's attorneys and the security people would say, "Yeah, that looks like a good way to apply a rule."
Chris Cronin:
So the way we did that was we said, "Let's take on the tradition. We're not going to invent anything." There's this really great line, it says we have invented nothing. Essentially, we have invented nothing. All we did was reach into the history of litigation, regulation, tools from the cybersecurity world, and we said, "You know what we all have in common? We all have a risk test, where we look at risk in terms of what Judge Learned Hand called liability and probability," or what we say in cybersecurity is impact and likelihood. I need to know what my risk is, the impact and likelihood of a problem, if I'm going to make a decision about cybersecurity. And in law, we say that the impact and likelihood of a harm to someone else shouldn't require a control that's more burdensome than that likelihood and harm.
Chris Cronin:
And in cybersecurity, we say when I look at likelihood and harm together, I look at this thing called risk. And if I have it in a risk register, and there are risk assessment methods that do this, it says I can compare my current risk in terms of my burden and the harm to others with the risk that would be in place with the harm and burden to others if a safeguard was put in place. So we said, "Well, look, if we all have this common language of risk, let's just put it together in a way that when we look at that algorithm, we all recognize it. It's mutually understandable to everybody." So that's why we have that little piece of math in there, so that everyone from all these communities looks and say, "I know what that means. I know how to apply it." So that's what that little thing is.
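As a rough illustration of how that shared language of risk can be put side by side in a risk register, here is a minimal Python sketch; the field names and numbers are hypothetical, not taken from CIS RAM or the Sedona paper:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    likelihood: float   # estimated probability that the harmful event occurs
    impact: float       # estimated magnitude of harm to consumers/others if it does

    @property
    def score(self) -> float:
        # In this simple model, "risk" is impact weighted by likelihood.
        return self.likelihood * self.impact

def safeguard_is_reasonable(current: Risk, with_safeguard: Risk, burden: float) -> bool:
    """A safeguard is a reasonable expectation if its burden does not exceed
    the risk reduction it delivers (the cure can't be worse than the disease)."""
    risk_reduction = current.score - with_safeguard.score
    return burden <= risk_reduction

# Hypothetical numbers purely for illustration.
current = Risk(likelihood=0.30, impact=1_000_000)    # expected harm: 300,000
improved = Risk(likelihood=0.05, impact=1_000_000)   # expected harm:  50,000
print(safeguard_is_reasonable(current, improved, burden=100_000))  # True: 100k <= 250k
print(safeguard_is_reasonable(current, improved, burden=400_000))  # False: burden exceeds the reduction
```

The point of the sketch is only the comparison itself: current risk versus post-safeguard risk, with the safeguard's burden weighed against the reduction in harm to others.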
Bill Tolson:
Oh, interesting. And I've talked to you about this, but on my other podcasts, I've actually done episodes with several state legislators that have authored bills, and some of them have become laws in Connecticut, Colorado, Utah, and so forth. And I've asked all of them about the reasonable security practices line in their bills and laws, and not one of them has mentioned the Sedona Conference or that paper. And basically, I asked them, isn't reasonable security practices kind of squishy when it comes to the legal side? And this was before I knew about you and the Sedona Conference, and surprisingly, not one of them mentioned it. Does the Sedona Conference reach out to legislators to maybe push this stuff?
Chris Cronin:
They do, but understand, the Sedona Conference paper was just published last year, and the reasonable person test has been in law for 150 years, so you're looking at a last-minute-of-the-last-day-of-the-last-calendar-year type of situation with that Sedona Conference paper. But what's been interesting is the uptake in law has been really quick. There have been two cybersecurity settlements that have used the Sedona Conference, either cited the paper or used the construct, since that was published, so the uptake has actually been pretty quick in law.
Bill Tolson:
Wow. Wow.
Chris Cronin:
Yeah. So there's a decision out of Pennsylvania regarding Hanna Andersson. Hanna Andersson is a clothing retailer.
Bill Tolson:
Well, let me get to that in a minute, and then we can get into those decisions, because you sent both of them to me, and I found them really, really interesting. But back on this, and really you did address some of my issues, but when looking at security of PII and trying to write that into the law, and you mentioned the risk, I would think that the risk of PII being breached, no matter the size of the company or what state they're in or anything like that, is basically going to be the same.
Bill Tolson:
It has the potential of harming the data subjects just as much, no matter what the size of the company is. And that brings up my question that's really been the hot button issue for me, and again, I said before, I'm more of a technology focused individual and a person that has received my fair share of corporate apology letters about how they suffered a breach of my PII, the last one being the gigantic T-Mobile breach, and I hadn't done anything with T-Mobile in 20 years. My first question was what are they doing with my data after 20 years?
Chris Cronin:
Yes, sir. Good question.
Bill Tolson:
But my question has always been, why can't these legislators and the public and corporate advisors that advise them at least write some basic data security requirements into the bills? Not get all weird about technology and stuff like that, but such as: all PII must be encrypted while in transit and at rest. That's not a reach, and that technology has been around for 30, 40, 50 years. Or: PII shall never leave the enterprise, i.e., loading it onto a laptop to take home to work on over the weekend. I've had a breach letter that didn't mention that, but later on in the news, I found out some lower-level individual had loaded a gigantic database of PII onto a laptop in their trunk, and their car was stolen. Or: encryption keys shall never be stored in the same location as the encrypted data, and so on. And I've had other subject matter experts try to explain this to me. Any thoughts on my stance there, Chris?
Chris Cronin:
Yeah, because in principle, what you're saying is perfectly sensible. It's the exceptions that need to be taken into account. So encrypting PII always in transit and at rest sounds great, until you're trying to get customer service and you're talking to the person on the support line and they say, "I actually can't help you because I can't read your record." The information has to be unencrypted at some point.
Bill Tolson:
Yeah, but nowadays, at least the technology that we and others offer would be, number one, based on role-based access controls. Is the person trying to access it authorized to do it? If they are, the system automatically decrypts those pieces of it that they need to see. So I would think in most cases, there should not be a holdup just because they have to encrypt or decrypt, I hope. I can't say that flat out.
Chris Cronin:
We've done risk assessments of hundreds of companies, and we see so many cases where information needs to be unencrypted in order for it to do the thing that it's there to do. You'll see in the Sedona Conference paper, and you'll see in actually the proposed California and federal regulations, well, the CCPA is including this, but the federal regulation is also doing this, it references the business purpose. You've got a legitimate business purpose for using this data. And the idea here is that in the Sedona Conference paper, we call it the utility of the data. We want to be sure that the use of this is appropriate for the context that you're using it in. People look at it first and they say, "Oh, this is Bill's T-Mobile problem. He was a customer 20 years ago." They shouldn't have had his data. It should never have been breached.
Chris Cronin:
There's zero utility for them having Bill's data, and they should be 100% liable for that. And if you put that into this equation, you'd say if there's zero utility, then any kind of safeguard was appropriate because there's no benefit. There's no way to improve the benefit to the client, other than just getting rid of their data. Now, in the case of rules like MFA or encryption, where we say you just have to use it all the time, we then come up to situations in medical practice or in customer support or other issues where someone says, you have to understand that if it's in this state, it's unusable. Here's a fun example. My parents are part of a community where they live that gets senior people out and exercising. And the way you get out and exercising is you do it in a community.
Chris Cronin:
"Hey, we're having an event. We're all going to go to this park and we're going to gather, and we're going to make this a two hour walk. Bring your water, et cetera, but sign up." And they know that when we get someone to sign up for the event, they're feeling committed and they're actually going to go, and it's going to encourage them to do the healthy thing. Well, someone in the village said, "This is actually health information because they're saying they're senior citizens, and they're saying they're going to be doing exercise. We're supposed to have multifactor authentication on anything where someone signs something in having something to do with their health, so you must apply MFA to this signup sheet."
Chris Cronin:
And so now there's this question, but if we apply MFA, are senior citizens going to be equipping their phone to do this multifactor authentication thing so they can say, "Yeah, I'll be there on Saturday for the park walk," and if they don't do it, are they less encouraged to do something that's good for their health, and good for their social lives? So did the utility of having this sign up get hurt by the strictures of a strong security program? Is MFA good? Absolutely. Can MFA supersede the benefit of someone using a system? Absolutely. So we need to not say, "You must use MFA everywhere, and it's a strict liability situation. If you violate that rule, you're in trouble," when all you're trying to do is get my mom and dad to go for a walk with their friends in the park.
Bill Tolson:
And that's an interesting example, and I like it, and it helps me understand it more. I have noticed, and I've done some discussions and podcasts and stuff on President Biden's cybersecurity executive order last year, that basically said all federal agencies and offices, within a year, and actually, that time has passed now, shall use multifactor authentication, shall use zero trust designs, and shall use encryption, amongst some other things as well. But I noticed that the government is starting to say, "This is what you will do. This is the minimum." And I know with these privacy laws, and I've had legislators tell me this, that they get a lot of help from the industries, from the private sector, and many of those companies, with good intentions, are potentially trying to do what's better for them, in my personal opinion.
Bill Tolson:
But in fact, two of the senators I talked to, when I brought this up, they said, "We've had that comment back, and we're thinking about doing amendments in the following years and stuff like that." But, like the whole legislative process, those could be shot down and things like that, and that's the way government works, and that's fine. But again, I really liked the way that you explained that. It helps me internalize this more. [inaudible 00:19:30] do this, do this, do this, but it helps. But you had mentioned a couple of minutes ago these decisions that have come out, and when you first emailed me, you said that six states were going to submit court filings defining reasonable security using a three-part test. What is that three-part test? Can you explain that?
Chris Cronin:
Yeah. Yeah, and I'll add it's six states and the District of Columbia, so it's seven jurisdictions, so we're excited about it. The three-part test is actually a restatement of what we have in that Sedona Conference paper. In this case, the breached company was Wawa Inc. For those who aren't familiar with it, on the East Coast, they're a convenience store and gas station chain, and very popular and very well-loved. And they've got a very strong reputation for taking care of their customers and employees, and they're great, but they had a challenge in actually managing some of their cybersecurity organization, and they had fairly sophisticated attackers attack their environment and grab millions of credit cards.
Chris Cronin:
So what the states were trying to say was, "We're not going to tell you that you have to use MFA and encryption everywhere. We need you to make decisions." And this is what's really helpful about this construct for law. "We're not just going to tell you that you ought to do this and you've got no choice." The government shouldn't be in the position of telling a company how to run itself, for all the right reasons. We won't need to go into that. But it's a three-step process, and the first step in your risk analysis is to be sure that the risk you're targeting would not require some kind of repair to the consumers of the states and DC who signed on to the injunction against Wawa. Here's what this means.
Chris Cronin:
A lot of risk assessments will say, "Well, how would this affect my profit? If I had a breach, how would it affect my profit? How would it affect my reputation? Would this be high, medium or low?" They're saying you can ask that question, that's fine, but you really need to be asking how badly could this hurt consumers? You need to switch your focus. So anyone doing risk assessments out there who are familiar with this question, how does this hurt my profitability? It's a good question to ask, but you're forgetting to add harm to consumers. That's the first test requirement. The second is that when you put a safeguard in place, the safeguard should not hurt a legitimate business purpose.
Chris Cronin:
That's really interesting, because that goes to this MFA issue that we talked about, where we want the senior citizens to sign up to do the hike in the park, and if we put a control in place that prevents that benefit, it's the wrong control. We've got to find another way to protect the way we need to protect. So that's what that second rule is. The safeguard cannot unduly burden a legitimate business purpose. And the third is that the safeguard cannot be more costly than the risk that someone might suffer without that safeguard, so the cure cannot be worse than the disease. And think about the beauty of that construct. You're a company that needs to run your risk management program, and someone says, "It has to be reasonable."
Chris Cronin:
A security rule, state regulations, the Gramm-Leach-Bliley Act Safeguards Rule, they're all telling you to be reasonable. This is the exact point you made at the MER Conference. But what does that mean? And then seven jurisdictions, six states and DC, say, "I've got this answer for you. You're going to do your risk assessment, you had to do one anyway. But when you do your risk analysis, look at the harm you could cause to other people, and make sure you're targeting a harm that would not need any kind of repair. Second, make sure that the safeguard does not hurt your legitimate business purpose, and make sure that the safeguard is not more burdensome than the risk you're reducing."
Chris Cronin:
What you're going to be able to do as a risk management practitioner is take a risk assessment method, and there's a case that gives you a few references for doing that, and make sure that your risk analysis is automatically demonstrating that you're applying due care, whether or not you've applied all the controls. It doesn't say you'll make sure that all the controls from a security standard are implemented and operating perfectly, and that they're going to block every hack. They say you're going to demonstrate that you thought through this, you had a process where you're thinking through everybody, to make sure everyone was okay. And if someone hurts you when you're doing your best to protect others, while not overextending yourself in the process, we're not going to blame you for that. That's a fantastic construct.
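To make that three-part test a little more concrete, here is a minimal sketch of the three questions as they might be captured in a risk register; the field names and structure are hypothetical, not the courts' or CIS RAM's actual scoring:

```python
from dataclasses import dataclass

@dataclass
class SafeguardEvaluation:
    residual_consumer_harm: float     # expected harm to consumers with the safeguard in place
    acceptable_consumer_harm: float   # harm level low enough that no repair/remedy would be needed
    burdens_business_purpose: bool    # would the safeguard block a legitimate business purpose?
    safeguard_burden: float           # cost/effort of implementing the safeguard
    risk_reduction: float             # reduction in expected harm the safeguard delivers

def passes_three_part_test(e: SafeguardEvaluation) -> bool:
    # 1. The risk you accept must not leave consumers needing some kind of repair.
    protects_consumers = e.residual_consumer_harm <= e.acceptable_consumer_harm
    # 2. The safeguard must not unduly burden a legitimate business purpose.
    preserves_business_purpose = not e.burdens_business_purpose
    # 3. The safeguard must not be more costly than the risk it reduces.
    proportionate = e.safeguard_burden <= e.risk_reduction
    return protects_consumers and preserves_business_purpose and proportionate
```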
Bill Tolson:
Yeah. That makes a lot of sense. This same framework that you laid out, I think some of the laws right now, the California law, the CCPA and CPRA, the federal ADPPA bill that's in Congress right now, as well as, by the way, the CPPA up in Canada, which hasn't become law yet, all talk about a private right of action. So would this framework also be usable, obviously, probably within a data subject suing a company under the state law?
Chris Cronin:
Yes. Yeah, it can be, and it has been. And the Sedona Conference paper sets it up, any adjudicator, whether they're a judge in a private lawsuit, or a regulator in a public matter, or someone mediating, can all use the same test for the same reason.
Bill Tolson:
So these six state filings, are they going to be utilized as precedent for reasonable security and privacy laws going into the future?
Chris Cronin:
I certainly believe that, because people are clamoring for it. There's a first-mover situation in law. You have some vanguards who want to go out and change the world by pushing novel legal theories, but most people in public law, regulators at the federal or state level, are not those vanguards. They say, "Look, we have to operate within a constitutional construct. We're in the executive branch here. We're not allowed to make law, so we're not going to try to be very innovative in the way we pursue regulatory matters."
Chris Cronin:
Which makes Pennsylvania versus Wawa so important, because now you have these six states and the District saying, "This is actually a good construct. The Sedona Conference brought us something useful." There are some good risk analysis methods available to the public. CIS RAM is one of them, from the Center for Internet Security, that maps exactly to this three-step rule. The community's ready for this, so let's state that this is what you're going to do to demonstrate reasonableness. And as soon as someone makes that statement, and when they see the public goes, "Oh good, we can go with this one," it'll be easier for these other regulators to say, "Let's go with this definition."
Bill Tolson:
Well, no, that's fantastic. Actually, a brief side comment about one of my earlier points about encryption and stuff. Even the GDPR does not get into prescriptive demands around technology, but they do, in some of their commentary, say that if the data collector utilizes encryption, with the safeguards around encryption, and there's a breach, then there is no breach response triggered because the data was encrypted. Do you agree with that?
Chris Cronin:
Yeah, with the limits that we apply from cybersecurity, that the key is also not-
Bill Tolson:
The key is held in a different repository.
Chris Cronin:
Yeah, exactly. Right, right. And the encryption protocols are strong enough that they're going to withstand the time it would take to decrypt, so people are safe, et cetera. Yeah. The idea makes perfect sense, because really what they're saying is there's no impact. That's the concept of a risk assessment. You're looking at likelihood and impact. And if someone says, "Hey, there's a likelihood this encrypted data, the data itself, without a key, with this very strong protocol, will be gotten," then what's the impact? Zero. Okay, so the risk now is zero, because zero times the likelihood is zero. What kind of burden should we take on to additionally protect the encrypted data? Well, zero, because if I apply a single dollar worth of effort, or if I impact the speed of my application in any way to improve on a zero risk, then I've already overspent. I've already taken on too great a burden. That's how you would apply that math.
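Worked through as arithmetic, which is simply a restatement of what Chris describes here rather than a formal rule:

```latex
% With strong encryption and a separately held key, the impact of losing
% the ciphertext alone is effectively zero, so the expected risk is zero:
\[
  \text{Risk} = \text{Likelihood} \times \text{Impact}
              = \text{Likelihood} \times 0 = 0
\]
% Any burden B > 0 spent further protecting that already-encrypted data
% exceeds the risk reduction it could possibly buy, so no additional
% safeguard there is a reasonable expectation.
```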
Bill Tolson:
But the zero risk is based on the fact that they've encrypted, right?
Chris Cronin:
Right, because the impact is zero.
Bill Tolson:
Right, right. So what I'm thinking is that seems like a carrot for some companies to say, "Well, let's encrypt the data," because the cost of a breach includes the hit to a brand's PR, shareholder equity, all kinds of stuff that is not always measured, and you don't have to respond with a breach response that always shows up in the news, which I think is interesting. I think we'll eventually get there. And that's one of my arguments as we look at this whole idea of digital transformation from on-prem up into the cloud. The cloud has this ability to dynamically scale when needed, and it might take hours, or it might take milliseconds. Obviously, you get charged for the extra CPU.
Bill Tolson:
But now, at least companies can say, "At least for my semi-active and inactive data that I'm going to keep, I'm going to encrypt it, basically to lower my overall risk." And I've talked to, actually, some cyber liability insurance providers, and they all say, "Absolutely. The more you do, the lower your cyber liability insurance rates will be," within reason. And you've got to do a break-even time or an ROI on those to make sure, like you say, does it make sense? Are you going to spend $100 to save $10 on cyber liability insurance? That makes a lot of sense. But on the other thing, Chris, you mentioned Pennsylvania and Maryland. You sent me those two papers. Tell me if I'm mixing up things here, but wasn't the Pennsylvania one, you mentioned Hanna Andersson, and I think Salesforce was involved with it as well. And that was the same case, right?
Chris Cronin:
Yeah. Salesforce was involved because Hanna Andersson was using Salesforce as their application backend. Right.
Bill Tolson:
So it was mostly Hanna Andersson, not-
Chris Cronin:
Right. Salesforce was not party to the suit. In Hanna Andersson, there was just an inappropriate, insecure way of providing people access to the Salesforce backend. They could have used more robust access controls and they did not, and it would not have cost them anything additional in comparison to the risk of harm to their consumers, and that was a fairly simple case. Now, that was just the Commonwealth of Pennsylvania themselves, it wasn't joined by other states, but that was the first time we saw the Sedona Conference paper used. What's relevant there is that the deputy AG we worked with there is in a position, like any AG, of saying, if I'm going to take on a case, look, I'm operating on tax dollars. I've got to be very thoughtful with how I use those tax dollars. I've got to take on a case where I know I've got a relatively good way of doing public law and I can make good law here.
Chris Cronin:
I can send a signal to people doing business with Pennsylvania residents' data, and I've got a fair way to talk to businesses about how I'm going to treat them. So that injunction that they issued for Hanna Andersson said, "Look, the Sedona Conference gave us this mathematical equation that says that you're going to take care of people by reducing their risk, but using safeguards that are not going to be more burdensome than the benefit of the risk reduction," but as well, "I know the cybersecurity community also has tools that I can reference you to, Hanna Andersson, because you didn't get this right, but here are some tools that are free for you." The Center for Internet Security, which is a nonprofit, puts a lot of time and effort into creating free tools for the public, and one of them is their risk assessment method, CIS RAM.
Chris Cronin:
And CIS RAM says you're going to look at three impacts, and the three impacts are the three-part test in the Hanna Andersson injunction. "So we're handing over to you, Hanna Andersson, everything you need to do well, and at the end of it, if you get hacked again, we're not going to give you a hard time because you were hacked, if you showed that you did your best to manage risk to consumers, you were focused on them, and you took care of your legitimate business purpose when you put your safeguards in. You didn't hurt that, and you didn't spend any more on a safeguard than the risk would've cost someone else. And as long as you've done that, we're going to be okay." That's a fantastic thing to say to a company, because you've given them a fighting chance to operate their business conscientiously, and you won't punish them if they get hurt while they were acting conscientiously.
Bill Tolson:
Yeah. Yeah. Actually, no, that's a great point. And in that Pennsylvania document you sent me about Hanna Andersson, didn't it lay out a series of processes and procedures that they must do to ensure security?
Chris Cronin:
Yeah. This is also very interesting. You'll notice a shift in the way that Pennsylvania has started talking about the requirements. They talk about a control objective, and they don't say you must apply MFA. They say, "You're going to evaluate, in your risk analysis method, the best way to determine that a user is who they say they are when they log in, and you can do that through things like MFA, but your risk analysis is going to tell you the way to do that." So Pennsylvania says, "Our consumers are protected when you've validated who the users of your system are, but as a business, you have to figure out how to do that. I'm just telling you what the goal is; you find the right way to get to that goal." That's excellent, because again, it's government saying, "I don't run a clothing retailer. I don't run yours either, but I can tell you I know the outcome I'm looking for for the consumers," so that's a fantastic construct, I think.
Bill Tolson:
Yeah. And I thought a lot of the processes and procedures that were actually listed made a lot of sense. Like you say, they weren't necessarily prescriptive, but you could basically determine what needed to be done, so I thought that was interesting. And then in the Maryland decision you sent me, who was involved with that?
Chris Cronin:
You mean which states?
Bill Tolson:
Well, it had to do with a company that had lost credit card data, right?
Chris Cronin:
Yeah. So is this the one that I sent you a couple days ago?
Bill Tolson:
Yeah, yeah.
Chris Cronin:
Pennsylvania versus Wawa. So this one's making the news because this is-
Bill Tolson:
Well no, I'm talking about the Maryland one. Didn't you send me a Maryland one too? Maybe you didn't.
Chris Cronin:
Maryland was party to the Wawa case.
Bill Tolson:
Okay. I apologize, I'm getting it mixed up. Go ahead, you were talking about Pennsylvania.
Chris Cronin:
Yeah, yeah. That's fine. But you see the same construct. The way an injunction is typically set up after a breach is the regulator says, "When I look at the complaint and then look at the cause of the breach, I see that..." What I'm about to say doesn't apply specifically to Hanna Andersson or Wawa, this is just a walkthrough of a typical thing... "Your people weren't trained, and you didn't have an email filter that prevented them from falling for a phishing scam, and the users' permissions on the network allowed them to see more systems than they needed to see." What they're going to say in the injunction is they want remedies for those three things. Your people need to be trained, you need a way to handle phishing email automatically, and you need to restrict people's access to just the systems they need, because what an injunction typically won't do is tell you to do more than you would need to do to have prevented the thing that they're talking to you about.
Bill Tolson:
Well, it also has to do with industry best practices too, doesn't it?
Chris Cronin:
Right, right. Exactly. Now, if they find that it's credit card data and the organization wasn't following rules from PCI DSS, they'll say, "Follow PCI DSS." If the company was under contract to work with NIST 800-53 or the CIS Controls, they'd say, "You have to actually do what you say you're going to do in your contracts and follow those standards." But again, going back to the Wawa case, this is constructed the same way as the Hanna Andersson one: you're going to do your risk analysis, you're going to apply that three-part test. But we're also telling you that we noticed that these sets of controls were not in place, which allowed the breach to happen, so you're going to meet these control objectives, and you can achieve them by finding the right risk remedy, by doing something like MFA, by doing something like whatever, but you're going to figure it out by risk. So they're both using that same construct: start with the risk, and then meet the control objectives that we'll tell you to meet. You find the best controls for them, based on risk.
Bill Tolson:
Okay. One of the things that jumped out at me in the Maryland decision, I think having to do with Wawa, it said all cardholder data processing will be protected through the use of commercially available and reasonable encryption, tokenization, and other similar solutions approved by the PCI DSS. And I think I looked this up, but I think the PCI DSS basically says this kind of data needs to be encrypted anyway, so that was not something new out of the blue.
Chris Cronin:
Right. If it's telling an organization to comply with a standard, then they're pretty much going to say what the standard is. Understand that the injunction is not solely written by the regulator, where they just hand it over to the defense and say, "Live with this"; it's negotiated. And when you have a good defense attorney, they're going to say, "Wait a minute. You can't tell us to do any more than the standard would've told us to do anyway, unless there's some other interest that you're trying to represent on the part of your consumers, your residents."
Bill Tolson:
Wow. Interesting. All right. Well, that's been fantastic, Chris, and I think we can now wrap up this edition of the Information Management 360 Podcast. I really want to thank you for this interesting discussion today. You've really addressed some of my questions that I hadn't had answered before, especially those hot button issues, but I think I understand it more now, so hopefully I won't keep bringing up stupid questions.
Chris Cronin:
This is hard stuff. This is hard stuff, and we've had to dig deep into legal history, and cybersecurity methods, and regulatory history, so no, everyone's trying to figure this out, but I'm just glad that we have real, practical advice to help people through this now.
Bill Tolson:
Well, and just so everybody knows, the paper you coauthored on the Sedona Conference talking about the reasonable security test is actually downloadable from the Sedona Conference website, and I think it's free, so I think everybody who really wants to look at this deeper should download that white paper. Basically, it's a report, and it's very, very detailed. I really enjoyed going through it. I still have lots of questions, but I'll save that for another time. So on that, if anyone has questions on this topic or would like to talk to a subject matter expert further, please send an email mentioning this podcast to info, I-N-F-O, at Archive360, all one word, dot com, or my email. You can send me an inquiry, bill.tolson@Archive360.com, and we'll get back to you as soon as possible. You can also contact Chris at C-C-R-O-N-I-N @halock, H-A-L-O-C-K, dot com.
Bill Tolson:
Also, check back on the Archive360 resource page for new podcasts with leading industry experts like Chris on a regular basis. I also have several podcasts lined up that I'm going to be recording in the near future, and some that I have recorded with state representatives, and I have a few more of those coming. I'm talking to the Uniform Law Commission, I think the week after next, on the podcast, and we just published a podcast with Jordan Crenshaw at the US Chamber of Commerce in DC, a really interesting discussion. But keep looking back, and we're working with all kinds of folks to stay within this data security, data privacy, information governance type of focus. So with that, Chris, I very much appreciate you taking the time for this. Really, really enjoyed our talk. And with that, Chris, thank you.
Chris Cronin:
It's great being here. Thanks, Bill.