Podcast Description:
In this episode Priya Keshav, CEO of Meru Data, discusses the effects of the explosion of privacy legislation on corporations. This episode also discusses:
- Whether the approach to regulatory compliance can be simplified by using one set of regulations as the "golden standard"
- Reasonable data security: how is it defined and who enforces those requirements
- The proposed federal privacy legislation and if it will pre-empt the current state privacy laws
Blog
The "Reasonable Security" Standard For Data Privacy Revisited, Again
The bottom line has been and remains that data privacy laws must become more prescriptive in defining the data security technologies and practices companies are required to use when collecting, using, and storing sensitive PII.
Speakers
Priya Keshav
Founder and CEO
Meru Data LLC
Marie Patterson
Chief Marketing Officer
Archive360
Marie-Charlotte Patterson is the CMO of Archive360. A pioneer of measurable, ROI-based marketing automation, digital marketing and content marketing, Marie has spent the past 20+ years helping international and US software companies in the digital archiving, governance, risk management and compliance sectors transform, grow, and define and dominate their markets.
Transcript
Bill Tolson:
Welcome to Archive360's Information Management 360 Podcast. This week's episode is titled, Data Privacy and Security From a CEO's Perspective. My name is Bill Tolson and I'm the Vice President of Global Compliance and eDiscovery at Archive360. Joining me today is Priya Keshav, CEO of Meru Data. Welcome, Priya, appreciate you being here.
Priya Keshav:
Glad to be here, Bill.
Bill Tolson:
I mentioned you're the CEO of Meru Data. Can you give us a brief description of what Meru Data is, what you do, those kinds of things?
Priya Keshav:
Sure. We are a privacy managed services firm, so we help companies with privacy compliance efforts and with global initiatives around privacy. It could be as simple as managing their DSARs (data subject access requests), all the way to helping them improve their processes so they're able to do true privacy by design.
Bill Tolson:
DSARs are really an interesting topic for me and we'll get into it a little bit here. I think a lot of people don't understand what they are and what kind of complexity they can add to a company. I'm sure you've seen it all, probably a lot more than I have, but really appreciate it, Priya. And everyone listening, Priya and I will be discussing this really interesting topic today, of data privacy and security. Many of the podcasts that we've done in the past, we look at these, especially data privacy and data security, from various points of view. Priya and her company actually being very focused on this and helping clients, I think, is going to give us a really interesting view into this. Let's go ahead and kick it off.
It seems as though we're witnessing an explosion of privacy legislation in the US and around the world, including Canada. In fact, January is when state legislatures typically start a new session, and so far there's been a rush to introduce new data privacy bills in state legislatures around the US. And again, we're only in, what are we in, February. That's on top of the five data privacy bills that became law over the last couple years, and I'm sure most of the listeners have heard about those: California, which has been around for a couple years, not the state, but the privacy law, plus Connecticut, Utah, Colorado, and Virginia. And then in the first two or three weeks of this year, 2023, in January, an additional eight or nine states introduced new data privacy bills. And obviously there are going to be a lot more, there always are.
The question is, of those bills, which ones will actually make it into law? Last year a very large number were introduced in many of the state legislatures, and I think three became law. So as I've mentioned in several podcasts and blogs and webinars, these new data privacy laws are similar, but different enough that organizations trying to comply will not really be able to pick a high watermark privacy law, such that if an organization meets that particular law, they'll automatically be in compliance with all the others.
When I bring that up, the first thing I hear people say is, "Well, we're going to concentrate on complying with California." Which is good, but... It's interesting, Priya, all of the laws look somewhat similar. California's is a little different, but the other states I talked about all use a lot of the same terminology, some of the same definitions and so on. In fact, in talking to the various state senators who've authored these laws, they all admit, "Oh yeah, we copied Washington State's from several years ago and then we added to it." The State Senator from Virginia said, "Yeah, we did the same thing," and so did Utah and Colorado.
So they're all sort of similar, but I think the problem, and tell me if I'm wrong on this, is that a lot of the laws are just different enough to make it impossible to have that high watermark law. I mean, the definitions can differ, the exemptions, the timing, all kinds of things like that. Am I off base in saying that with all of these laws, and more to follow, this is obviously going to be a complex environment for companies, companies like your clients? How do they meet those laws? And because the laws are slightly different from law to law, is it going to make tracking these things and complying with them much more difficult?
Priya Keshav:
Yeah, absolutely. So I have clients who have taken the approach that they would offer the same or similar rights to all customers, and then I have clients who have taken a very state-specific approach, which means Colorado provides you this, California provides you this, Virginia provides you this, and that's what I offer. And in both cases, we've looked at implementations, and it's pretty much exactly what you talked about. When I say, "Okay, I'm going to offer the same rights," it's not that simple. Yes, it looks like I've taken the simple answer, so it should be, but no, because California expects you to provide details around deletion, for example. Once I've completed a deletion, I have to talk about the exceptions I have used if there is data I've possibly not deleted. I have to be able to reference it. And there are specific requirements around what that closure email should look like.
Colorado has a completely different expectation around what that should look like, and Virginia probably has no details around it, but then what I'm expected to provide is different. So if someone from Texas applies and I delete their data, I couldn't say, "Hey, there are some exceptions available to you under California, and so I have not deleted the data." I couldn't say that, so I have to come up with something more generic. So by the time I'm looking at the implementation for all the customers across different states, and I've made sure that we've complied with all the regulations, the solution is like a cobweb, literally.
And even if you said, okay, maybe the problem is that you've generalized it, you're trying to fold multiple rules into one generic process and maybe that's what complicated it. So if you take the approach of providing exactly what the law requires within that state, only to the citizens of that state, again, you have the same issue: now I have to know which state you belong to, and I have to be able to give 15 different sets of rules to the person who is processing the request and be exact and specific. So it's a no-win situation. Yes, to some extent, California is different and all the other laws are sort of similar in terms of terminology, but I don't think it's that simple. I do believe some aspects of the Colorado regulations are pretty strict, probably stricter than California, for example. There is no way to simplify. I wish there was, but the more you peel the onion, the more layers there are. So that's the current situation with pretty much all privacy compliance.
Bill Tolson:
You brought up a couple of key points there, and excuse me for kind of jumping around, but you got me excited on some of these things. Two things, and you mentioned this and I referred to it as well: one of the things I find with many companies, potential clients, that I've spoken to and worked with, is they don't quite understand the reach of the laws. You mentioned the Colorado law, and I'm in Colorado and have spoken to the state senators who sponsored it.
But I think companies in general, and this is the same with the GDPR as well, don't understand that the reach of these laws extends beyond the borders of Colorado. I mean, they'll say, "Well, gee, I'm not a Colorado company. I don't have to care about Colorado." It's like, no, that's wrong. Basically it comes down to the data subject's residence, not where the company collecting the data is. So if you're collecting my PII and I live in Colorado and you're in New Jersey, you're still susceptible and must comply with the Colorado law on how to treat my PII, correct?
Priya Keshav:
Yes, absolutely. And again, not providing a legal opinion here, it's much more complex than that. For example, you could have an employee or a customer who was in, let's say, Texas before but now lives in Colorado. I don't know when exactly the jurisdiction starts applying, but yes, it's technically around citizens of Colorado. So the reach is not about where you are located, it's about where your customers are located and where your employees are located.
Bill Tolson:
Like you mentioned with consent, but with all of these laws and more coming, and the differences between them, for example whether consent is an opt-in or an opt-out, and the various other things, I think data collectors are going to have to track each individual data subject's PII very granularly. Did it come from a Colorado data subject? How was consent given? Was the consent given for a very specific use of their PII for some short period of time, and after, say, two years, do you have to delete it, but only that data subject's PII? I'm looking at this and thinking the complexity for a company with a database of millions of data subjects' PII, and how it will track them all to be compliant with the various state laws, is just going to be a massive undertaking.
Priya Keshav:
So here's what I would like to say to that. The idea that you could provide a privacy notice that is clear, concise and informative, and where somebody could actually provide informed consent... maybe I should back off a little bit and note that more and more data is being generated, faster than ever. I was doing a lunch and learn a month ago, and one of the things I talked about is this: take a person who has just woken up, grabbed a cup of coffee, and reached the office, maybe a one-hour time span for an individual in the morning. Think about all the data collections that are involved and the number of companies to whom you have provided data through just that process of simply waking up, grabbing a cup of coffee, getting in a car, driving to the office and maybe logging into the computer.
That's about all you have done, nothing significant, nothing out of the ordinary, but you've probably given lots and lots of information to many companies, probably 10, 15, 20 companies, in the process of just that one hour. So the amount of data collection is huge, and when you look at it from a company standpoint, this applies to all the data that anybody is collecting and processing and holding as part of the process. So to think that you could just provide a privacy notice, and that someone completely understands all of this, is able to make the correct choices and is able to inform you with a yay or nay. It sounds very good in theory, but because of the complexity of the world we live in, if you look at the practical implementation, it's just very, very tough. What you described is an impossibility, just being honest here.
So imagine, in that one hour, if 20 companies are collecting all kinds of different pieces of information from you, and every one of them popped up a screen somewhere, or there was a voice somewhere just asking you, "Do you consent? Do you consent? Do you consent?" and then gave you some details. Do you think you could comprehend, process, and be able to say okay, or are you going to be overwhelmed by that process, right?
Bill Tolson:
Oh yeah, overwhelmed.
Priya Keshav:
Both from a consumer standpoint, as well as for the companies. I am a privacy geek and I believe in privacy, so as an individual whose data is being collected, I do want to make sure that I have choice, and I also want to make sure my data is secure and protected. But at the same time, as a person on the other side trying to implement this, I can see the complexity in trying to implement a data privacy law and make it user-friendly to me as a customer. It's a complex problem that will take time and effort, and I don't know how much of that is possible just through regulations. Some of it will require a lot of innovation, a lot of cultural change. It's a big shift from where we are.
Bill Tolson:
I absolutely agree with all of that. I think most companies, even a lot of GCs and CISOs, really haven't taken the time to comprehend what this new privacy landscape is going to mean, even just in the United States or North America. You mentioned DSARs, and I want to get into those, but one thing you brought up a couple minutes ago is a tiny bit off-topic, yet it's a question I've been following for quite a while now, especially since GDPR came out. With the GDPR it's the right to be forgotten; in many of the privacy laws it's just the right to erasure, the right to deletion, assuming there are no overriding reasons they can't delete, like eDiscovery or regulatory compliance in financial services. But my question is this: do the privacy laws around the right to deletion assume an unrecoverable deletion? And you know what I mean by that?
Priya Keshav:
Yeah.
Bill Tolson:
I've asked that of lawyers in Europe around GDPR and some of the data privacy people there, I've asked people here, and they all kind of look at me and say, "I don't know." You would assume the right to deletion would mean that a 15-year-old with a Norton rescue disc couldn't recover your information in 10 seconds. But most computer systems, including archiving systems and information management systems and so on, don't necessarily do an unrecoverable deletion. They do a standard computer delete. And I've asked attorneys in the EU, where I think they're much more advanced, at least strategy-wise, around this stuff than we are, and they couldn't tell me for sure either whether an unrecoverable deletion was required.
Priya Keshav:
The data should be deleted or anonymized, that's what the law requires. So if the data has been deleted such that it cannot be attributed back to that individual, which also includes a way to anonymize it, then that would be reasonable under the regulation. But when you're talking about unrecoverable, there are multiple aspects to deletion. One is not being able to pull the data back from the hard drive when I'm decommissioning the hard drive. I may not be decommissioning the hard drive, nor am I going to subject it to that sort of requirement. But when I am decommissioning the hard drive, if it did contain personal information, then I would say reasonable security means ensuring that you are properly disposing of it, so that nobody could recover it.
But if something is live and part of a database, then you're not talking about the magnetic disc and decommissioning it, you're talking about deletion in the context of a database. And there, you're looking at whether it's deleted to a point where you will not be able to re-identify the person. That could be a deletion, that could be an anonymization, that could be something else. It could be just an encryption that cannot be decrypted anymore. There are so many different ways to do it, and you take the approach that is appropriate for that particular situation.
Bill Tolson:
People talk about, well, you need to do a digital shred, DOD 5015.02, and that's not that easy to do. But if Bill Tolson calls up Archive360 via a data subject access request and says, "Delete my information," and the applications managing the data don't have a digital shred capability, my thought, and I'm sure it's been many other people's thought, is encrypt that specific PII and then throw the encryption keys away. That's unrecoverable, mostly, I would think, except maybe a couple of governments could do something with it.
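This crypto-shredding idea, encrypting a data subject's record and then destroying only the key, can be sketched in a few lines. The following is a minimal illustration, not a production design: the class name, the in-memory key store, and the one-time-pad XOR cipher are all illustrative stand-ins (a real system would use a vetted encryption library and a hardened key vault).

```python
import secrets

class CryptoShredStore:
    """Toy per-subject encrypted store: deleting the key makes the record unrecoverable."""
    def __init__(self):
        self._keys = {}      # subject_id -> one-time-pad key
        self._records = {}   # subject_id -> ciphertext

    def put(self, subject_id: str, pii: bytes) -> None:
        key = secrets.token_bytes(len(pii))          # random pad, same length as the data
        self._keys[subject_id] = key
        self._records[subject_id] = bytes(a ^ b for a, b in zip(pii, key))

    def get(self, subject_id: str) -> bytes:
        key = self._keys[subject_id]                 # raises KeyError once the key is shredded
        return bytes(a ^ b for a, b in zip(self._records[subject_id], key))

    def shred(self, subject_id: str) -> None:
        del self._keys[subject_id]                   # ciphertext may remain, but it is now noise

store = CryptoShredStore()
store.put("bill", b"bill@example.com")
assert store.get("bill") == b"bill@example.com"
store.shred("bill")
# The ciphertext still exists on disk, but without the key it can no longer
# be attributed to the individual, which is the point Bill is making.
```

The design choice worth noting is that the delete operation touches only the small key record, not every copy of the data, which is why this pattern comes up for systems that cannot easily overwrite storage in place.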
But I thought that was a really interesting question; when I first started asking it a year or two ago, it kind of surprised people, they hadn't thought of it. In fact, that leads to another question, and I'm sorry to keep doing this to you: the right to deletion means that, given no other barriers, you need to delete information, but what about my PII on backups? Backup tapes or spinning disc or whatever, that's not necessarily addressed in the regulations either. Do you have any opinion on the right to deletion and backup data?
Priya Keshav:
No, it's actually addressed.
Bill Tolson:
Oh, really?
Priya Keshav:
It depends on the circumstances and the technical mechanisms that are available to you. So that's what they would say. In most cases, you don't have to delete the backup if it is not going to be restored. You would only have to worry about the PII when it's being restored. If it's not being restored, you could leave it as it is and let it be overwritten per the normal cycle. But that again would depend on the personal information that is part of it, because I don't like to use the word PII. PII has got a very specific definition; personal information is a better way to describe it for privacy purposes.
It depends on whether the personal information that is present in the backup will pose a specific risk, and that is very context specific. You couldn't answer some of these questions generically, you have to look at the particular situation. But if it is not going to pose a lot of risk, then it makes sense to let it sit till it gets overwritten. But if for some reason it's going to get restored, at that point you have to go back and fix it.
Bill Tolson:
Yeah, that's what I was told by some privacy lawyers in the EU as well. Their thought was that if I request a deletion, then somebody somewhere keeps track of all those deletion requests, and the next time a tape or a backup is restored, those particular data subject requests are then applied and the data deleted. But having to go through all your backup tapes each time somebody asks, maybe 10 or 15 or 20 times a month, just to address that one request, obviously that doesn't make sense. I thought it was pretty interesting.
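The pattern described above, keep a ledger of erasure requests and re-apply it at restore time rather than rewriting every backup, can be sketched roughly as follows. This is a simplified model with hypothetical names and record shapes; a real implementation would persist the ledger durably and hash or pseudonymize the subject identifiers it stores.

```python
class DeletionLedger:
    """Records erasure requests and re-applies them whenever backup data is restored."""
    def __init__(self):
        self.deleted_subjects = set()

    def record_request(self, subject_id: str) -> None:
        # Called when a data subject exercises the right to deletion.
        self.deleted_subjects.add(subject_id)

    def apply_on_restore(self, restored_records: list) -> list:
        # Filter out records for subjects who requested deletion after the
        # backup was taken, instead of rewriting every tape up front.
        return [r for r in restored_records
                if r["subject_id"] not in self.deleted_subjects]

ledger = DeletionLedger()
ledger.record_request("alice")

backup = [{"subject_id": "alice", "email": "a@example.com"},
          {"subject_id": "bob",   "email": "b@example.com"}]
restored = ledger.apply_on_restore(backup)   # only bob's record survives the restore
```

The trade-off is exactly the one discussed in the conversation: the cost of a deletion request stays constant regardless of how many backup generations exist, at the price of maintaining the ledger until every affected backup has aged out.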
I don't think we've talked about this yet, but beyond the various state privacy laws, we also have the federal bill in the House of Representatives, the ADPPA, and it actually made it out of the Energy and Commerce Committee last year, I think on a vote of 52 to four or 52 to two, something like that. But it was being held up because of a couple of clauses; the main clause that California didn't like was the idea of preemption. A federal bill would preempt all the state privacy laws, and that was an issue for the state of California. What are your thoughts on the need for a federal law to preempt all the state laws, and do you think that could happen with this one?
Priya Keshav:
I think it's probably tough, but that's just my point of view. I think it's more likely to be a minimum. But I also believe that the longer it takes for us to get a federal privacy law, and the more states that actually have comprehensive privacy laws, the more difficult it's going to be to preempt.
Bill Tolson:
So your opinion is that potentially that preemption clause may be weakened or taken out?
Priya Keshav:
I think it's going to be harder to get the preemption, but that's my opinion. We can all look at the crystal ball and say, "This is going to happen or that's going to happen." We have no idea. But I also believe that the longer it takes for us to get a federal privacy law, and if every state puts in the effort to draft what's appropriate for them and their citizens, it's going to be very difficult for the federal government to preempt all of that. It's going to be easier for them to just say, we set a minimum, and if some states want to go above and beyond, they can. But again, who knows?
Bill Tolson:
Right. And I believe with the ADPPA, the enforcing authority would be the FTC, is that correct?
Priya Keshav:
I don't know what-
Bill Tolson:
Potentially.
Priya Keshav:
Exactly, yeah. The FTC broadly has taken a bigger role in enforcement, and privacy has been on their agenda, or one of their agendas, for a while now. You can see that in some of the cases, with Twitter paying $150 million in civil penalties for using information for a different purpose. In that case, they'd collected phone numbers for the purpose of MFA, but then decided to use them for targeted advertising and marketing.
And you see that with Kochava: the FTC filed a lawsuit against Kochava for selling location data that tracks people's movements to reproductive health clinics, places of worship and other sensitive locations. You see that with Epic, which ended up with a $275 million fine for violating children's privacy law, because there was a whole bunch of information embedded inside the game, and it made it so much easier for kids to do things they shouldn't be capable of doing, or at least things that require parental consent before you collect children's data, and so the FTC felt they were violating the children's privacy laws.
You see a lot of enforcement from the FTC, and they have been very clear in stating that privacy will remain a focus area for them. So obviously, if we have a federal privacy law, I can see how the FTC plays a role in enforcing it, but we don't even have one yet, so it's kind of hard to say what enforcement will look like until we get to a more solid bill that looks like it'll pass, or it passes.
Bill Tolson:
Yeah. One of the other controversial provisions within the ADPPA is the private right of action. I think it's California that includes a private right of action?
Priya Keshav:
Mm-hmm.
Bill Tolson:
But I don't think the other states do. In fact, in talking to the various state senators who authored the other state bills, they all said, "Yeah, we all tried it, but we had to negotiate it out. It never would've passed." The private right of action, for those listening, is the idea that a data subject, an individual, under certain circumstances can sue a company for misuse of their data or a breach or whatever it happens to be. But in the four other state bills, it's really up to the state attorney general to do the enforcement; they don't allow, at least yet, individuals to sue companies, that all has to be done by the attorney general. A private right of action is incorporated into the ADPPA as well, and I thought that would probably be the provision that would get negotiated out pretty quickly.
Not that I care one way or another, but I know it's been a major issue, at least at the state level, in all of the various privacy laws as well. So that was really interesting. I'm hoping the feds do do something; we obviously need to do something about privacy at the federal level. Another thing, and this is across all the state bills, and I think it's also in GDPR and in Canada's privacy bill C-27, is this idea of including data security within the data privacy bill. Now my problem, and a lot of people tell me I'm wrong, but I'm going to go ahead and state it anyway, my problem with the state privacy bills is they use the same data security nomenclature and definition: plus or minus a couple of words, they all say that the data collector or processor must use reasonable security to secure data.
My issue has been: what is reasonable security? That could differ pretty wildly, and I've asked all of the state senators, as well as lots and lots of subject matter experts, wouldn't it be a better idea, at least at the state privacy bill level, to incorporate slightly more prescriptive data security requirements? I'm not saying, "Well, you must use vendor X's perimeter security and vendor Y's this and so forth." But the federal cybersecurity executive order that President Biden put out in 2021 named some simple stuff. Number one, you've got to move to the cloud; you must use multifactor authentication; you must encrypt data; and they also say applications must be designed on zero trust. My question to many of the state senators has been, why couldn't you have said all PII in transit and at rest must be encrypted? And nothing against the state senators, but they're not technical folks and they've taken input from other people. But what's your thought on maybe adding slightly more prescriptive requirements around data security?
Priya Keshav:
The reason why reasonable security, to me, makes sense is because you can't secure an IoT system the same way you secure a computer. There is no way; if they came up with three things that you're supposed to do, there's going to be a fourth thing that is necessary to actually secure that data, and it's not going to be very helpful to say you have to do zero trust, you have to do MFA, you have to do data encryption, because sometimes MFA might not be the solution, or data encryption looks different in certain environments. So I think it makes sense that they say reasonable security, and there are a lot of things to look to for guidance on what is reasonable security. If you are part of the payment card industry, there is a data security standard, PCI DSS, that you can rely on for what that should look like in the payment card industry.
You can look at the NIST security framework as guidance. You can look at the ISO framework for guidance. For example, in California, the attorney general has listed the 20 controls in the Center for Internet Security (CIS) Critical Security Controls as the minimum level of controls that might be necessary. So there is a lot of guidance out there when it comes to security, and most companies are already following some of these things. A lot of companies are probably following NIST or ISO as their framework and standard; there are audits that they usually go through, and there are certifications that they may have obtained. So it makes absolutely no sense to put a fixed list of security controls into the law, and the fact that they just said what is reasonably secure probably makes more sense to me, because it means what is necessary given the context of the data, the type of environment, the type of technology. And I think from a consumer standpoint, you want it to be contextual, because if I told you to do X and X doesn't work really well for that particular situation, I haven't really secured that data. That's my opinion.
Bill Tolson:
Okay. And a lot of people have said the same thing you just did, and it's hard to argue with that, but I think there was a case decided last year, maybe it was the Wawa convenience stores. I'm in Colorado, I've never seen a Wawa, but I guess they're pretty big on the East Coast. Part of the decision was the judge handing the company a list of security requirements it now needed to adopt to equal reasonable security, and one of them was encryption. That was a very specific case, but I thought it was really interesting. The other thing, for those of you who know who the Sedona Conference is, a large legal think tank, and I used to belong to them, but they put out some really great material: in one of my podcasts, I interviewed one of the Sedona Conference authors who created the Sedona paper on defining the legal definition of reasonable security when it comes to cyber.
Really interesting stuff, but it comes down to this: reasonable security, legally, comes down to an algorithm, and it gets much more complex than that. The fact is Sedona had already addressed this and written a really interesting paper; part of it I could understand and parts I couldn't. I think they were getting input from others as well, since we need to nail this down a little bit. For those of you listening, go look for the Sedona Conference paper on reasonable security, I think you'll find it really interesting. I believe it was from 2021. You've mentioned, and I've mentioned it several times now as well, the idea of DSARs, or data subject access requests, and there's a lot around this, but it goes back to most bills, including Canada's, and the idea of data subject rights.
You have the right to query a company: what kind of data do you have on me, how long have you had it, how is it being used, has it ever been sold? And oh, by the way, I want it deleted. For a data subject to ask a company, say I'm asking Archive360, what data do you have on me and have you sold it, those requests are put into the company, usually through their website, via a data subject access request. And I know that in the EU, companies have been experiencing DSARs for a while now, since GDPR came out. In fact, I think it was in 2021 that IDC and Gartner both estimated that, just based on GDPR requests, the average number of data subject access requests a company in the EU was receiving on a monthly basis was 147, and that the average cost to respond to a DSAR was $1,400. Times 147, that's like 200 grand a month.
I thought, gee, imagine the United States, as more and more states pass laws that give people the ability to submit DSARs. How are companies going to respond to that potentially rising number of requests, and what's that going to cost? How is a company going to manage that? You mentioned, when you were introducing your company, that you work with companies around DSARs; I'm just wondering what you think the future holds for companies around DSARs?
Priya Keshav:
Like you said, yes, I have been supporting companies in responding to DSARs for a very long time now. You see a trend where the number of requests is constantly going up; it's not constant, it's growing. Part of it is what you said, the number of states that have introduced rights, which essentially allows more customers to submit these requests, and the other part is more awareness. As people realize they have these rights and can exercise them, and when they hear about privacy issues in the news, it builds awareness and a habit of asking for a deletion, an opt-out or an access request. As for the number you stated, I'd note that not all requests are the same; a request for deletion is not the same as a request for access.
Depending on the company, the level of automation present, and the data environment, fulfilling a DSAR can be pretty complicated or pretty simple. So I do agree, 147 does not surprise me; there are companies that receive a lot more than that on a regular basis. But if you have the right tools and the right process, you learn to manage some of this in a more automated fashion, one that scales. That doesn't mean there is no burden; it's a big burden on companies to fulfill these requests. And part of managing it requires all the proactive work we're talking about: privacy impact assessments, privacy by design, understanding data minimization, all the concepts that are part of every privacy law. If you've understood and implemented some of those things, then the backend piece, the reactive piece of actually responding to the DSAR, becomes much easier.
Bill Tolson:
That's really great. I have two questions on that. I've been doing a lot of writing on ransomware and extortionware and those kinds of things. Do you think DSARs could be weaponized by ransomware and extortionware actors? For example, deluging a company with thousands of DSARs under made-up names, where the company still has to at least respond to determine: is this real or not? Does that data subject actually exist? Do we have to go search? I'm just wondering if part of a ransomware or extortionware attempt could be simply overloading a company with DSARs that it has to make some attempt to respond to.
Priya Keshav:
I would say most companies, I'd expect, have reasonable checks and balances against something like that. There are ways to detect whether a bot is sending you requests and to stop the bot from submitting them. Most companies also have a multi-step validation process that helps identify spam and the like. So there are checks and balances built into the process that should help. Not to say that will completely eliminate the risk, but one of the biggest risks with DSARs is giving the wrong information to the wrong person, or deleting data based on a request that shouldn't be honored. So there's a lot of thought put into making sure we're honoring the request while also ensuring that it's a valid request.
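The multi-step validation described above might be sketched as a simple pipeline: bot detection, rate limiting, identity verification, and a check that the subject exists in company records before any data is touched. This is a minimal, hypothetical illustration; the function, field names, and thresholds are assumptions, not taken from any real DSAR product:

```python
from dataclasses import dataclass

@dataclass
class DsarRequest:
    email: str
    source_ip: str
    captcha_passed: bool      # bot check at the web form (assumed mechanism)
    identity_verified: bool   # e.g. emailed confirmation link was clicked

MAX_REQUESTS_PER_IP = 5  # per day; this threshold is an assumption

def validate(req: DsarRequest, requests_today_from_ip: int,
             known_subjects: set[str]) -> tuple[bool, str]:
    """Return (accepted, reason) for a DSAR before any data search begins."""
    if not req.captcha_passed:
        return False, "failed bot check"
    if requests_today_from_ip > MAX_REQUESTS_PER_IP:
        return False, "rate limit exceeded for source IP"
    if not req.identity_verified:
        return False, "identity not verified"
    if req.email not in known_subjects:
        # Still warrants a response, but no data search is needed.
        return True, "no matching data subject on file"
    return True, "valid request; proceed to fulfillment"
```

The ordering reflects the risk Priya raises: identity is verified before any data is located or released, so a flood of fabricated requests is rejected cheaply without exposing or deleting anyone's data.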
Bill Tolson:
Eventually, I would assume there will be some sort of automation able to respond, some AI or something. But the other question is along the same vein, and I know we're running out of time, so I think this will be our last discussion point. I've made a point in a lot of my writing over the last year or so that if a DSAR is submitted to a company, then that company has to make an attempt to respond to it. Actually, I shouldn't say attempt; they need to respond to it. My question is this: in the companies I've been around and consulted with, PII can be spread around the company like crazy. Marketing people could have spreadsheets of potential clients sitting on their laptops, sales could have something else. Usually PII is not in a single place within an organization; it can permeate the whole place.
So my question, and I've put this to many information management people as well, is: for companies to be able to respond to a DSAR, how would they know what I have on my laptop, or what other people within the company have on their workstations, that might pertain to the DSAR but that they have no visibility into? My supposition is this: I think we're reaching an inflection point where companies are going to have to manage all of their electronic data, not just the five or 10% considered regulatory records, but everything, so they can respond when these DSARs start coming through. Am I way off base there?
Priya Keshav:
I absolutely agree. When it comes to data minimization and retention, one of the biggest pivots companies have to make is to stop focusing on records and focus on all of the data; personal information especially tends to be in places that are probably not corporate records. So you have to look at everything: are you collecting the minimum necessary and nothing more? Who has access to it? Where is it being stored? Who is it being shared with? Are you oversharing or over-accessing the data? Is it only processed for the purpose for which it was collected, and are you promptly disposing of it afterward? It's the entire life cycle of the information. But for privacy, what's relevant is just the personal information, whereas corporate information could be much larger than that; there may be other pieces of information that don't fall under the purview of privacy regulations. So when it comes to privacy, there is some overlap with records, but a very small one. You're looking at managing all of the information, and the overlap with records is somewhat coincidental rather than relevant.
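The lifecycle questions listed above (what is collected, where it lives, who can access it, who it is shared with, and when it should be disposed of) are essentially the fields of a data-inventory entry. Here is one hypothetical way such an entry might look; all field names, values, and the retention default are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class DataAsset:
    name: str                  # e.g. "marketing prospect spreadsheet"
    location: str              # system or share where it is stored
    contains_pii: bool
    purpose: str               # why it was collected
    access_groups: list[str]   # who can read it
    shared_with: list[str] = field(default_factory=list)
    collected_on: date = field(default_factory=date.today)
    retention_days: int = 365  # assumed default retention period

    def disposal_due(self) -> date:
        """Date by which the data should be promptly disposed of."""
        return self.collected_on + timedelta(days=self.retention_days)

# Example: the kind of non-record PII Bill describes living on a laptop.
asset = DataAsset(
    name="marketing prospect spreadsheet",
    location="sales laptop / shared drive",
    contains_pii=True,
    purpose="lead generation",
    access_groups=["marketing"],
    collected_on=date(2022, 1, 1),
    retention_days=180,
)
print(asset.disposal_due())  # 2022-06-30
```

An inventory of entries like this is what lets a company answer a DSAR across data that was never a formal corporate record.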
Bill Tolson:
And that goes against corporate culture, at least in the United States, of: all the data I have on my company laptop I kind of consider mine; the records are the company's. There's a cultural shift coming where, yes, IT is going to have to manage everything, whether by syncing to a central location or through other technologies. I think that's necessary to respond to these DSARs and also to manage this data, because like you say, based on consent and other factors, this stuff can't be kept forever. I think we're reaching the point, and I've told information management people this, where you're going to have to manage a lot more data, not just the five or 10% that are records, but everything in the company, because a decent amount of that data is going to carry some sort of risk and liability, so it has to be known and managed. So at least on the records management and information management side, they're looking at a growth in their responsibilities.
Priya Keshav:
Yeah, makes sense.
Bill Tolson:
Well, Priya, this has been very enjoyable and I think it's been great. Hopefully you had fun as well. Your comments and discussion points I thought were fantastic. I really want to thank you for making the time today. And for those listeners, if anyone has questions on this topic or would like to talk to a subject matter expert, please send an email mentioning this podcast to information, I-N-F-O, @archive360.com and we'll get back to you as soon as possible. You can also directly email Priya at pkeshav@meru, M-E-R-U, data.com. And also check back at the Archive360 resources page for new podcasts with leading industry experts, like Priya, on diverse subjects, including data security, data privacy, information management, and archiving, records management, eDiscovery and regulatory compliance. Again, thank you very much, Priya, really appreciate it.
Priya Keshav:
Thank you for letting me be part of it.