Description:
In our latest episode, Bill Tolson and special guest Rep. Steve Elkins (DFL), District 49B, Minnesota House of Representatives, discuss consumer privacy legislation. Steve is the co-author of the Minnesota Consumer Data Privacy Act bill. During this podcast, Bill and Steve take a deep dive into the details of the bill and Rep. Elkins' views on consumer privacy in general.
Blog
State Data Privacy Laws Leave More Questions than Answers
Read more on the challenges organizations will face with the myriad of different privacy regulations emerging from multiple states. How will organizations manage each set of regulatory requirements?
Speakers
Steve Elkins
Steve was first elected in 2018 to represent Minnesota House District 49B and served on five committees during his first term: Transportation, Local Government, Commerce, State Government Finance and Government Operations. Steve has a degree in economics from the University of California at Berkeley. He worked as a transportation economist in the airline industry for the first 15 years of his career. When he left the airline industry in the early 90s, he wanted to stay involved in transportation policy, so he began his public service career by volunteering to serve on Bloomington’s Traffic and Transportation Advisory Committee in 1995 and also served on the Bloomington School District’s Transportation Task Force. About 25 years ago, Steve changed careers and mastered the discipline of IT Information Architecture. Over the last decade of his private sector career, his work was focused on Health Information Technology, and he worked as an Information Architect at OptumHealth in Eden Prairie for 7 years until his retirement in 2018.
Bill Tolson
VP of Global Compliance & eDiscovery
Archive360
Bill is the Vice President of Global Compliance for Archive360. Bill brings more than 29 years of experience with multinational corporations and technology start-ups, including 19-plus years in the archiving, information governance, and eDiscovery markets. Bill is a frequent speaker at legal and information governance industry events and has authored numerous eBooks, articles and blogs.
Transcript:
Bill Tolson:
Welcome to the Information Management 360 podcast. This week's episode is titled The State of Consumer Privacy Legislation, a conversation with Minnesota State Representative Steve Elkins. My name is Bill Tolson, and I'm the Vice President of Global Compliance and eDiscovery at Archive360. Joining me today is Minnesota State Representative Steve Elkins of District 49B. I think I have that right. Correct, Steve?
Steve Elkins:
That's correct.
Bill Tolson:
All right. And Representative Elkins, or Steve, is the coauthor of the Minnesota Consumer Data Privacy Act bill. We'll spend a lot of time talking about it and about Steve's thoughts around privacy and consumer privacy in general. But with that, Steve, again, thanks for taking the time out of your busy schedule to join me today. I know that you have something lined up right after this, so we will keep it within the guidelines. Quickly, how long have you been serving in the Minnesota legislature?
Steve Elkins:
I'm in my second term. I was first elected in 2018.
Bill Tolson:
Great. All right. So, you coauthored the Minnesota Consumer Privacy Bill in, I believe, February of this year.
Steve Elkins:
Yeah, it's the third version of the bill. What I've been doing is taking my lead from Washington State Senator Reuven Carlyle. This is the third version of his bill that I've introduced in Minnesota. The first one was in May of 2019, at the very end of my first session in the House. And so, I've been working on this for about two and a half years now.
Bill Tolson:
You began working on it shortly after you joined the legislature, right?
Steve Elkins:
Yeah. I am an information architect, an IT information architect by vocation, with about 25 years of experience doing data management and data architecture type work. At the end of the 2019 session, I introduced it just to get it out there into the wild for discussion during the interim between sessions. It was actually a lobbyist for Microsoft named Amos Briggs who brought me the first version of this bill.
Steve Elkins:
And I didn't really know what its provenance was at the time. But I agreed that this was an important and interesting topic. And that given my background in information architecture, I would probably be the most logical person in our legislature to take this on.
Steve Elkins:
But I made it very clear with Amos when I agreed to take the bill that I would be taking it on as the convener of the discussion and not necessarily as an advocate for the bill in that form. As I read through it (this was Senator Carlyle's Washington bill, though I only learned that later), I realized it was his first effort at crafting such a bill, and it had a lot of gaps. So, the first time I actually read it through in detail, it was like, oh, god, it doesn't cover this, it doesn't cover that.
But by the time his second bill was introduced at the beginning of 2020, it was dramatically improved. And it's like, yeah, he covered this, he covered this, he covered this. So, the 2020 version of the bill was dramatically improved from the first draft.
And then, for this year's version, I didn't get his bill until the beginning of this year. I know it was delayed being distributed in draft form because his staff put a lot of work in between the November elections last year and the beginning of this year to try to pick up as much as possible in the way of useful definitions out of the California privacy law that California voters passed last November.
So, the basic thrust of his bills is that he's trying to provide the same basic data privacy protections that are contained in the California laws, the CCPA and now the CPRA. But a lot of the California laws were hurriedly drafted or, in the case of the most recent one, the CPRA, came through the initiative and referendum process in California; they haven't been through the normal legislative process.
And speaking with Assemblywoman Jacqui Irwin in California, who has been in the middle of California's efforts, she will freely admit that, especially with the CCPA, they had about a week to throw that together to head off a ballot initiative about three years ago.
And so, that California language, it's commonly acknowledged, just isn't all that well drafted. I think the reason why legislators in other states are gravitating towards the Washington bill as a template rather than the California bill is that it's just better drafted. The bill that I've introduced this year, House File 1492, is basically a clone of the Washington Privacy Act as it was originally introduced by Senator Carlyle last January.
And so, it's been Minnesota [inaudible 00:05:08]. Washington has a tradition of having long preambles on their bills that explain the objectives. We don't do that in Minnesota, so my bill doesn't have that preamble. And of course, all of the references to Washington state laws had to be changed to refer to the equivalent Minnesota state laws as well.
The only substantive change between the bill I introduced and Senator Carlyle's bill this year is that I somewhat generalized the definition of location data privacy, the "specific location" concept. A lot of the legislation that's out there, in California as well, will draw these arbitrary quarter-mile or third-of-a-mile circles around a place and say you have to report on a person's location within this arbitrary circle.
And my problem is, as a 25-year information architect who spent a lot of time doing transportation work (I'm actually a transportation economist by education), I realized that there's actually no practical way to have anything you can enter into a database field that could be guaranteed to comply with any of those arbitrary standards. And so, I generalized the definition of specific location. Otherwise, if you've read the Washington Privacy Act and are familiar with it, you're pretty much going to be familiar with my bill as it was introduced [inaudible 00:06:31].
Bill Tolson:
Well, that was a question I was actually wondering about yesterday. With the ongoing failure of our federal government to really do much on privacy, the states are starting to do their own thing. Like you say, Virginia, California, and Colorado, where I'm at, passed bills. And then Connecticut, New York, and others that I've had a little interaction with are all trying to bring up their own bills.
Bill Tolson:
And in talking to industry people, everybody's worried that we're going to have 50-plus state bills that are wildly different, or even slightly different, and that it's going to be extremely difficult for industry to keep track of all the various differences. I was wondering, and it sounds like you are to a certain extent, but are there any attempts by state legislators to work across state lines to take the best of the various bills and make them so that they're not wildly different, to maybe make it easier for industry to follow?
Steve Elkins:
Yeah. Literally everybody I talk to about this issue agrees that, ultimately, this is an issue that Congress needs to resolve. The problem is that everyone I talk to about this issue also agrees that there's not a snowball's chance in hell [inaudible 00:07:50] to do so anytime soon. As a result, the states are serving their traditional role as the laboratories of democracy.
But we are all very cognizant that the greatest fear in the business community is that we are going to end up with this complete hodgepodge of 50 completely different state-level bills, and that it will be hard for national businesses to conform to all of them. I think that's why so many of us are gravitating towards Senator Carlyle's bills as a basic framework; the bills that passed in California and Colorado this year are both based on the Washington Privacy Act as a foundation.
So, the National Conference of State Legislatures has put together a comparison, about a 45-page, very detailed comparison of definitions and provisions, and a significant number of the individual bills being introduced in the individual states are built on the Washington Privacy Act.
That's the reason why I asked the lobbyist from Microsoft, why are you bringing me this bill? Why is Microsoft interested in having a state-level bill? And he said, "We're trying to peddle this Washington bill as a state template so that we don't have to comply with completely different bills in every state."
Bill Tolson:
Well, that makes a lot of sense.
Steve Elkins:
A lot of us are speaking with each other at this point. The National Conference of State Legislatures had a data privacy summit in Alexandria, Virginia in September, and it provided an opportunity for many of us who are working on state-level legislation to meet each other in person for the first time and collaborate.
Steve Elkins:
You mentioned the state of Connecticut, which has a bill that is pursuing a different approach, but I'm in contact with Senator James Maroney there; he's taking the lead on that. We have a conference call that he's organized for Monday the 20th, where a bunch of us who are working on this legislation are going to get together and compare notes on what we're doing.
Bill Tolson:
Speaking of that conference you mentioned, I actually had a call with Colorado State Senator Lundeen yesterday, who's one of the coauthors of the Colorado Privacy Act. I mentioned that I was going to be recording a podcast with you today, and he said, "Yes, I remember meeting Steve." I think it was the same conference you mentioned.
Steve Elkins:
Yeah, he was there. And Senator Bob Rodriguez, who's the Democratic coauthor on that, was also present. I've got his business card right in front of me. And I think he's actually part of the group that will be getting together on Monday.
Bill Tolson:
I have an agreement from Senator Lundeen to record a podcast with me next month around the Colorado Privacy Act, and I'll follow up with Senator Rodriguez as well. And I have worked, at arm's length, with the Connecticut and New York authors as well, just offering a little opinion from a vendor's point of view.
Bill Tolson:
But one of the things I wanted to ask: yes, obviously the states, or some of the states, are working together for all the reasons you already talked about. I've been following the CCPA and CPRA since they emerged as something to pay attention to, and the CCPA has some interesting provisions that I don't think were in your bill, the Washington bill, or even the Virginia bill: the idea of a private right of action, the look-back period in the California bill, and the idea of presumed damages. Did you think about any of those as to whether they would make sense for Minnesota?
Steve Elkins:
The private right of action is probably the single most controversial issue related to these bills. It's not included in the Washington Privacy Act. And it's not been included in my bill because I haven't started amending it at all yet.
Bill Tolson:
Well, it wasn't included in Colorado.
Steve Elkins:
It wasn't. No, I don't think it was in either the Virginia or the Colorado bill. So, it hasn't made it into any of the bills that have actually passed. I feel kind of ambivalent about it myself. What I'm planning to do, since I know it will come up during the discussions, is work with our local ACLU and trial lawyers to draft an amendment to have it included. And in the Minnesota State Legislature, my bill will have to make two stops: it will start in the commerce committee, and then it will have to move from there to judiciary and civil law.
Steve Elkins:
So, I'm going to have an amendment drafted that would add a private right of action, and that will be discussed when the bill arrives in the judiciary and civil law committee, which would have jurisdiction over that particular issue, and we'll chew it over there and see what happens. But I'm not going to fall on my sword on this either way. I want to get a bill passed. If it turns out to be a huge obstacle to getting the bill passed, it's an issue on which I'm willing to compromise either way.
Steve Elkins:
I know this issue is the main reason why the bill hasn't passed in the state of Washington itself. The advocates on both sides of the issue, in the house versus the senate, are dug in on their positions, and it's preventing the bill from actually passing in the state of Washington, which is unfortunate, because a lot of the seminal work providing our bill template originated out of the work that has been done there.
Bill Tolson:
Does the potential private right of action provision kind of match up with this California idea of presumed damages? If a system was breached, you don't have to show actual damages; the fact that it was breached and the data could have been taken is basically the point where fines start being assessed, and so forth?
Steve Elkins:
I really haven't dug in to that part of the California bill that closely.
Bill Tolson:
That hasn't shown up anywhere else, including in the EU's GDPR, which is kind of the granddaddy of them all. But this idea of presumed damages, you don't have to show actual damages, just the fact that a breach occurred means that there probably will be damages, therefore, you're going to start paying right away.
Steve Elkins:
Most states already have breach laws under which that kind of thing is considered. One of the things I'm starting to look at is whether the existing breach law should be integrated with this bill. So, that's one of the things I'm going to be looking at. Since I took on this project, and as soon as you introduce one of these bills in any state, industry immediately descends upon you. At this point, I have an email notification list related to this bill that has over 70 stakeholders. But nobody has really lobbied me that much on this particular issue.
Steve Elkins:
One issue I think was raised in the Virginia bill is about providing an ability to cure, whether it's a breach or another violation of the privacy bill. I'm disinclined to include an ability to cure provision in the bill, but it is one issue that I'm studying more closely.
Bill Tolson:
Yeah. I obviously did a little research on you because I knew I was going to be talking to you. And seeing your 25 years in the IT related field, I was kind of excited, because I could maybe bring up slightly more technical things that you would have some opinion on.
Bill Tolson:
One of those topics that I've been wondering about for several years now, ever since the GDPR came out, is the wording around what's expected of organizations in how they're supposed to protect data. We get into security and those kinds of things, but the language is usually pretty high level, like "an organization should take all reasonable measures to ensure that the data is secure."
Steve Elkins:
Glad you're bringing this up, actually, because that is, in fact, the one area where I think that, as a long-time information architect, I can bring some unique value to the table. There aren't a whole lot of experienced information architects around in state legislatures. In the Washington framework, there is a section that talks about companies having to write up an annual assessment of their efforts to protect data and protect its privacy.
Steve Elkins:
And that's probably going to be the one area that I put a lot of effort into. I should say, man, I'm at a point where I've been collecting all of this feedback for two and a half years now, and I still have people reaching out to me. I'm going to be meeting with some folks from the Electronic Frontier Foundation on Friday. But after that, I've called a timeout.
Steve Elkins:
I've let my collaborators in terms of drafting the bill, House Research and our Revisor's Office here, know that if they can clear time on their calendars for me from this Friday on through the end of the year, I'm pretty much going to retreat to my man cave in the basement and just be working on amendments to the bill. And a lot of them are going to be related to this topic. So, in that annual assessment section, I intend to ask companies to describe, in much more detail than any of the other legislation I've seen so far, what are you doing to include data privacy by design principles in your policies?
Bill Tolson:
That's exactly what I've kind of been complaining about. I'm a member of the Cybersecurity Tech Accord, which is a worldwide organization of companies that works with the UN and the EU and others around privacy and cybersecurity and things like that. I've brought this up with them as well. And I realize that at the legislation level, you don't want to be seen as advocating specific vendor technology; you just can't do that. All the other vendors wouldn't like it.
Bill Tolson:
I mean, the SEC Rule 17 requirements for financial services get a little close to that with WORM storage and all that other kind of stuff. And this goes for the EU GDPR as well, but we've reached the point in technology availability and cost where, instead of saying controllers shall establish, implement, and maintain reasonable administrative, technical, and physical data security practices, could we soon take the next step and say all personally identifiable information shall be encrypted, maybe both while at rest and while in transit?
Bill Tolson:
This is the biggest thing I don't understand. You hear about these breaches all the time, and all this data was taken, whether it's a cyber theft, a ransomware attack, or an extortionware attack. If the data was encrypted, then a lot of that wouldn't happen. I mean, the breach notification provisions in the GDPR allude to the idea that if the data was encrypted, and the encryption keys were not kept with the encrypted data, then a breach effectively did not happen and breach notification does not kick in.
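What Bill describes here, field-level encryption with the keys held apart from the data, might look something like the following sketch. It assumes the Python cryptography package and a key service that lives outside the data store; it is illustrative only, not a prescription from any of the bills or the GDPR.

```python
# Minimal sketch: field-level encryption of PII, with the key held
# outside the data store (e.g., in a KMS or vault). Assumes the
# third-party "cryptography" package. Illustrative only.
from cryptography.fernet import Fernet

# In practice this key would come from a key management service and
# would never be stored alongside the encrypted records.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_pii(value: str) -> bytes:
    """Encrypt a single PII field before it is written to the database."""
    return cipher.encrypt(value.encode("utf-8"))

def decrypt_pii(token: bytes) -> str:
    """Decrypt a field for an authorized read; useless without the key."""
    return cipher.decrypt(token).decode("utf-8")

record = {"name": encrypt_pii("Jane Doe"), "ssn": encrypt_pii("123-45-6789")}
# A breached copy of `record` is ciphertext; without access to the key
# service, the attacker learns nothing usable.
print(decrypt_pii(record["ssn"]))
```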
Steve Elkins:
Yeah. You may very well see language like that being used in my amended bill. The last 10 years of my career were spent in health informatics, and the last seven of those were spent as a senior information architect at Optum, part of UnitedHealth Group. One of the projects I worked on there was an initiative to identify and eradicate any last vestiges of social security numbers in the Optum application databases.
Bill Tolson:
Oh, nice.
Steve Elkins:
Yeah. And UnitedHealth Group has long used the Rochade metadata repository on all of its databases across all of its subsidiary companies. And so, we were able to scan the metadata repository and find all of the columns in any application that might potentially contain a social security number. Anytime we found one, we sent a notification to that subsidiary and said, "You have 60 days to either erase any social security numbers in that field or encrypt it. You have two choices." And I think more companies should be doing stuff like that.
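The kind of metadata sweep Steve describes could be sketched roughly as below, assuming a SQL database whose schema metadata can be queried (SQLite's PRAGMA interface is used here as a stand-in) and a hypothetical name pattern for spotting SSN columns. This is not UHG's actual tooling.

```python
# Rough sketch of a metadata sweep for likely SSN columns.
# Column-name heuristics and the SQLite stand-in are assumptions.
import re
import sqlite3

SSN_NAME_HINT = re.compile(r"ssn|social.?security", re.IGNORECASE)

def find_ssn_columns(conn):
    """Return (table, column) pairs whose names suggest they hold SSNs."""
    hits = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
        for _, col, *_ in conn.execute(f"PRAGMA table_info({table})"):
            if SSN_NAME_HINT.search(col):
                hits.append((table, col))
    return hits

# Toy usage: in the real process, each hit triggered a notice giving the
# owning subsidiary 60 days to erase or encrypt the field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (id INTEGER, ssn TEXT, name TEXT)")
print(find_ssn_columns(conn))  # [('members', 'ssn')]
```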
Bill Tolson:
Oh, absolutely. And I've talked to insurance providers as well, cybersecurity insurance providers. I asked them, "Why don't you tell your policyholders that they must encrypt, that they must adopt role-based access controls, that they must begin adopting zero trust architectures?" I mean, these are all minimum requirements that are not vendor specific. The technology is out there now; many, many companies provide it. And I asked the insurers, "Why don't you do that?" And they say, "Well, we do. But we have to leave somewhat of an opening."
Bill Tolson:
So, basically, the choice for the people who want the cyber liability insurance is either you adopt these things or your premium is going to be 3x, or 10x, or whatever it happens to be. It's amazing that a lot of companies will still say, we'll pay more; we don't want to get into this other technology stuff.
Steve Elkins:
I know that when I start writing this stuff into the amended version of my bill, people in the business community are going to ask me, "Well, who will be the audience for all this information?" And I'm going to tell them that it's going to be investors and insurance companies.
Bill Tolson:
Yeah, exactly. Even now, the cyber liability insurance policies only cover a small portion of any potential losses, and you're paying huge amounts for that. I'm really glad to hear you talk about adding some more particular kinds of requirements. I mean, for years, I've always wondered, why does a company allow some junior intern to download an entire database full of PII, put it on their laptop to work on it over the weekend, and the laptop gets stolen out of the trunk? There should be a rule in the company that says you cannot download databases with sensitive information in them, period.
Steve Elkins:
Yeah. Another policy or practice that went into effect at Optum after the Anthem breach, when UHG really ramped up its security efforts, is that every UnitedHealth Group laptop is now encrypted when you log off. It's kind of a pain, because every time you would log on to your computer in the morning, it would take about eight minutes to boot up. [inaudible 00:21:49] without encrypting the data on the laptop's disk, but it's kind of necessary these days.
Bill Tolson:
Those are probably base level requirements that like you say, they should be absolutely required. Even a company should say we must do this to lower our overall liability in case something happens.
Steve Elkins:
That's exactly why UHG was doing it. They saw what happened to Anthem after that huge breach. And they concluded we don't want to be that company.
Bill Tolson:
Yes. Don't want to be above the fold and on the front page of the Wall Street Journal talking about the 100 million records you just lost.
Steve Elkins:
Yeah.
Bill Tolson:
That's no fun. We talked about the lack of federal guidance. I did notice that both Senator Gillibrand of New York and Senator Moran of Kansas have introduced privacy bills in 2021 in the US Senate. Do you know anything about those? Have you had a chance to look at them?
Steve Elkins:
Yeah. The International Association of Privacy Professionals, and I think probably a couple of other groups as well, has been tracking the federal legislation. And I've looked at some of the bill summaries that they and other organizations have provided. So far, I'm not seeing any bills that have even been introduced in Congress that are anywhere near as comprehensive as either the California or the Washington frameworks.
Bill Tolson:
Yeah. I agreed.
Steve Elkins:
They've just been crude compared to what's coming up out of the states right now.
Bill Tolson:
I noticed just five days ago, I read a story, or it was a press release, that basically said the FTC, the Federal Trade Commission, filed an advance notice of proposed rulemaking with the Office of Management and Budget that initiates consideration of a rulemaking process on privacy and artificial intelligence.
Bill Tolson:
And it goes on to say the filing lays out the FTC's intent as seeking to curb lax security practices, limit privacy abuses, and ensure that algorithmic decision making does not result in unlawful discrimination. So, it's interesting that the FTC is doing it. Obviously, the downside is that a future administration can basically undo it. But in the meantime, if Congress isn't going to do anything, I thought this was pretty interesting.
Steve Elkins:
There's one provision in the Washington framework, a very good definition of, I think it's termed something like, decisions that have important legal consequences, or something like that. Basically, it's intended to address things like your credit reports, or the use of your credit score or other kinds of algorithms by companies to make decisions about whether they're going to rent you an apartment or what your car insurance is going to cost you. That's one area that I'm looking at more carefully and that I may beef up a little bit.
Steve Elkins:
Because, whether you characterize it as artificial intelligence or not, companies are going to use these black box algorithms. And actually, as an information architect, I think the term artificial intelligence is currently the most widely used, misused, and abused term in business right now.
Steve Elkins:
But I think that if you're turned down for a rental, or your insurance rates are exorbitant, you ought to have rights similar to what you have under the Fair Credit Reporting Act right now: to look at the data that was used to make that decision and have the reasons for the decision spelled out to you in plain English. A company shouldn't just be coming back to you and saying, "Well, we made this decision because the black box told us to."
Steve Elkins:
So, that's also on my list of things to beef up a little bit more, to give consumers more rights to question the decisions that are being made about them by black boxes.
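As a purely hypothetical illustration of the plain-English explanation Steve is arguing for, the sketch below maps a scoring model's most damaging factors to human-readable reasons; the feature names, reason text, and contribution scores are all made up for illustration.

```python
# Hypothetical sketch: turning a model's top negative factors into
# plain-English "adverse action" reasons, instead of answering
# "the black box told us to." All names and numbers are made up.
REASON_TEXT = {
    "debt_to_income": "Your debt is high relative to your income.",
    "late_payments_12m": "You had late payments in the last 12 months.",
    "short_credit_history": "Your credit history is relatively short.",
}

def explain_denial(feature_contributions: dict, top_n: int = 2) -> list:
    """Return plain-English reasons for the factors that hurt the score most.

    `feature_contributions` maps a feature name to how much it pushed the
    decision toward denial (higher = worse), e.g. from SHAP values or a
    linear model's weighted inputs.
    """
    worst = sorted(feature_contributions, key=feature_contributions.get,
                   reverse=True)[:top_n]
    return [REASON_TEXT.get(name, f"Factor '{name}' hurt the decision.")
            for name in worst]

print(explain_denial({"debt_to_income": 0.41, "late_payments_12m": 0.18,
                      "short_credit_history": 0.05}))
```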
Bill Tolson:
I know some of the draft laws, and I think California and Colorado, I'm not sure about Virginia, do have provisions in there, especially the California bills, that say data subjects can basically request that their PII not be utilized with machine learning or AI capabilities within the company. You were sort of talking about that.
Steve Elkins:
Well, I think about that. I mean, I think that's one thing. But in many cases, these black boxes are being used to make decisions that you really don't have any meaningful choice to opt out of. If you in fact took that option and opted out of having your PII used to make those kinds of decisions, there are many services available in the market that you would just not be able to avail yourself of.
Bill Tolson:
That was going to be my follow-up question. Companies now, including mine... if you've ever run across the term predictive coding for litigation and eDiscovery, those kinds of things, those were the early uses of machine learning for looking at huge document datasets, millions upon millions of documents, and they were very successful. I actually got the US courts to adopt the use of this machine learning for doing automated review versus hiring contract attorneys for 2,000 hours to do it.
Bill Tolson:
But what I was going to say is, now, especially because most companies are relying on cloud platforms like Microsoft Azure, AWS, Google, and Oracle, they have these huge technology stacks available. For example, Azure has various technology capabilities, including very robust machine learning and AI capabilities, which a lot of companies use now.
Bill Tolson:
But my question is this: soon, companies are going to be utilizing machine learning and AI to do a lot of their automation around their data. Is it realistic to be able to tell, for example, a company that is utilizing California data subjects' PII, "Well, no, you've got to take these people out of that"? I'm not sure that's going to be doable, technology-wise, in the near future. What do you think about that?
Steve Elkins:
Yeah. I think there's a lot of discussion about how meaningful opt out provisions in these bills really are, just in terms of the practicality of their implementation. Very few people are actually opting out very often under the California laws, which have now been in effect for a year or two.
Steve Elkins:
And that's causing some amount of consternation among state legislators who are working in this field. California has given California consumers some of these opt-in or opt-out rights concerning their personal data, but not that many people seem to be taking advantage of them, which suggests that there are practical obstacles to the effective implementation of these laws the way they've been written so far. So, that's an object of quite a bit of discussion right now. And the Colorado law, I think, is trying to get at some sort of universal opt-out mechanism.
Bill Tolson:
Right. I read last week or the week before, it was an IDC report, that was talking about DSARs, or data subject access requests, as part of "gee, I want the right to be forgotten; you've got to tell me what kind of data you're holding on me." It said the average US company is receiving around 157 of those per month, and that the average cost of responding to a DSAR is around $1,400 each, just in time and all that kind of stuff.
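Taking those quoted IDC figures at face value (they are assumptions here, not independently verified), a quick back-of-the-envelope calculation shows how fast that adds up:

```python
# Back-of-the-envelope cost of DSAR handling, using the figures quoted
# above: 157 requests/month at roughly $1,400 each. Illustrative only.
requests_per_month = 157
cost_per_request = 1_400  # USD

monthly_cost = requests_per_month * cost_per_request  # 219,800
annual_cost = monthly_cost * 12                       # 2,637,600

print(f"~${monthly_cost:,}/month, ~${annual_cost:,}/year")
```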
Bill Tolson:
Obviously, that's going to keep going up, and I look at that as a cost of doing business. I mean, to protect people's privacy, you're going to have to spend money to do it. You're going to have to pass costs along; that's the way it goes, but you have to secure that data. And I think companies and organizations hopefully are getting a wake-up call that says you've had since the GDPR came out to start thinking about how you protect data, transfer data, and all of those kinds of things.
Bill Tolson:
And companies are now starting to offer some pretty good solutions around that. And it all comes down to a simple rule when you're talking about information governance or information management. You can't manage your information if you don't know what you have.
Steve Elkins:
Did you read the story in Wired a couple of weeks ago about Amazon?
Bill Tolson:
Yes.
Steve Elkins:
I thought that was... Again, from the perspective of a longtime information manager, I don't know whether the allegations in that story are true or not. But the point is that, whether they're true or not, the kinds of things described in that story were appalling and should never happen. For our listeners, basically that story said that Amazon, on the sales side, not the AWS side, really didn't know what data it had or where it was stored.
Steve Elkins:
Having come out of an organization like UnitedHealth Group, where they have this universal global metadata repository, where the company clearly does know all of the data it has and has the tools that allow you to find it, that ought to be a basic best practice for any company: you've got to know what data you have and how to find it.
Bill Tolson:
How do you comply with regulations, or any of the regulations from federal, state, and local governments, if you don't know what you have?
Steve Elkins:
Yeah.
Bill Tolson:
And I've been saying this for several years now: 80% of all data within an organization is sitting on individual laptops and workstations. The IT group, the company centrally, has no idea what that data is, whether it has PII in it, whether it has IP, whatever. And I think, and I've been saying this for a couple of years now, we're eventually going to get to where a company is going to have to move toward managing all data, not just records kept for regulatory retention requirements or whatever.
Bill Tolson:
But because of eDiscovery, because of the privacy regulations, and all kinds of other things, you have to know all of that data so that you can act on it. If you get a right to be forgotten request, "delete my information," and you go to the central repositories and say, "Yeah, we deleted it off Salesforce and we deleted it off the file servers," but there are numerous copies sitting on laptops and such, you're in violation.
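A minimal sketch of what acting on such a request can look like in code, assuming a hypothetical data inventory in which every store holding a person's data is registered; the point Bill makes is that the request is only satisfied when every registered copy is erased, not just the central one, and anything missing from the inventory (laptops, shadow copies) is still exposure.

```python
# Minimal sketch: propagating a right-to-be-forgotten request across
# every registered data store. The DataStore protocol, store names,
# and data are hypothetical, purely for illustration.
from dataclasses import dataclass, field
from typing import Protocol

class DataStore(Protocol):
    name: str
    def delete_subject(self, subject_id: str) -> bool: ...

@dataclass
class InMemoryStore:
    name: str
    rows: dict
    def delete_subject(self, subject_id: str) -> bool:
        return self.rows.pop(subject_id, None) is not None

@dataclass
class ErasureResult:
    subject_id: str
    deleted_from: list = field(default_factory=list)
    failed: list = field(default_factory=list)

def handle_erasure_request(subject_id: str, inventory: list) -> ErasureResult:
    """Fan the deletion out to every store in the data inventory."""
    result = ErasureResult(subject_id)
    for store in inventory:
        if store.delete_subject(subject_id):
            result.deleted_from.append(store.name)
        else:
            result.failed.append(store.name)
    return result

# Toy usage with two registered stores.
crm = InMemoryStore("crm", {"subject-42": {"name": "Jane Doe"}})
files = InMemoryStore("file-server", {"subject-42": {"doc": "w2.pdf"}})
print(handle_erasure_request("subject-42", [crm, files]))
```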
Steve Elkins:
Yeah. At UnitedHealth Group, there was such a culture of good data management, but the Optum side of the business in particular was growing rapidly through acquisition, and most of the companies that Optum was acquiring did not have such rigorous practices. I can remember working on healthcare projects at OptumHealth where employees of a new acquisition who were involved in the project would be sending around spreadsheets with patient data, with PII and protected health information on them.
Steve Elkins:
Every time there was a new acquisition, there always had to be an "orange-washing" process to make sure that the people who worked for these companies being acquired were properly acculturated to the culture of data protection: you don't send people's PHI around on spreadsheets.
Bill Tolson:
Especially with the fines involved with HIPAA and HITECH. And it's like, wow, that's where security really needs to be absolutely rigorous because that PHI is the most sensitive data that a human can be associated with.
Steve Elkins:
I once worked on a project for a financial services company where all of the programmers who were developing software, and the product management people, had memorized the social security number of the company's largest insurance agent, because he had such a diverse client base that the data for his brokerage firm had the most interesting and useful test cases.
Bill Tolson:
Wow. Getting back to some of these new provisions that could be added to privacy bills, one that I was really interested in, I think it was the 2020 privacy bill that was put forth in New York State. Nothing really came of it; it didn't make it out of the legislature that year. But one of the interesting things, and I hadn't seen this before, was that it had a data fiduciary provision in it. Have you run across that? Do you have any thoughts on that concept?
Steve Elkins:
I have not heard it referenced in that way. But I think that that's potentially a useful construct.
Bill Tolson:
It said the data fiduciary obligation prohibits the use, processing, or transfer of a citizen's personal data without their documented consent and requires the business to exercise the duty of care, loyalty, and confidentiality expected of a fiduciary.
Steve Elkins:
I like that.
Bill Tolson:
I thought that was really interesting, and difficult for companies, because all of a sudden you could be asking yourself, "Well, gee, if we're going to use all this PII in a marketing campaign or a sales campaign, how does it benefit the individual data subjects, and does it?" I think trial lawyers would have a field day with something like that, but it does put companies on notice.
Steve Elkins:
Yeah. No, I'm on the commerce committee here. And anytime the word "fiduciary" appears in any legislation, the business community goes berserk.
Bill Tolson:
I could see that. But I think in certain cases, with some of these unnamed companies that have been misusing data for a long time, they have gigantic legal staffs and they could fight lawsuits forever. Still, I think it's worth putting in, or at least talking about, a provision like that. And I expect the EU, especially around the GDPR, to eventually add something like that, because they are so far out in front of this it's almost scary. But I think that would shut down so many businesses, or at least their revenue streams, that it would send a really interesting message, right?
Steve Elkins:
Yeah. Clearly, the online advertising industry does have two main standards for how online advertisers treat the data that comes into their possession. But clearly, these are voluntary standards, and there are many companies involved in the online advertising business that are not adhering to either of them. I thought it was alarming that, for The New York Times story after the January 6th insurrection, somebody leaked to them a large trove of location data that only had the advertising industry IDs on it, so it was hypothetically pseudonymized.
Steve Elkins:
But they were still able to find vendors who, within a couple of days, were able to reassociate those advertising industry IDs with actual people and do some fairly detailed reporting on the comings and goings of a number of the people who participated in the January 6th insurrection. So, on the one hand, I'm happy that some of those people are being called to account, but thinking about how the Times obtained the data, and more importantly was able to deanonymize it, was alarming to me.
Clearly, there are people who are in the business of taking pseudonymized data that only contains the advertising industry ID and reidentifying it. There's a business for that out there, and that shouldn't exist. I'm probably going to put something in the bill that makes it explicitly illegal to try to reidentify or depseudonymize data that has been intentionally deidentified or pseudonymized.
Bill Tolson:
That brings up an interesting related topic; this happened two years ago. Basically, I think there are anywhere from 250 to 300 personal identifiers that organizations and even cybercriminals use to identify somebody.
And two years ago, a group of scientists slash programmers developed a code base that, with only, I think it was 14 personal identifiers, really random stuff: brown eyes, lives in the 80528 ZIP code, drives an Xterra, really general stuff, would have a 95% accuracy rate in actually identifying you, down to your name, address, and social security number, using any 14 randomized personal identifiers. By the way, those scientists released the code base on the internet for free.
Steve Elkins:
Boy. There's a famous paper out there that has been circulating for over a decade, where a researcher found that with just three attributes, date of birth, gender, and ZIP code, and that's all you knew about a person, you could reidentify the person with something like an 80% probability.
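A rough sketch of why so few attributes are identifying, assuming a pandas DataFrame with hypothetical column names; it simply counts how many records share each (birth date, gender, ZIP) combination, which is the core of a k-anonymity check.

```python
# Rough sketch of a quasi-identifier uniqueness check (the idea behind
# k-anonymity). Column names and the toy data are assumptions; the
# input would be whatever "anonymized" extract is being assessed.
import pandas as pd

QUASI_IDENTIFIERS = ["birth_date", "gender", "zip_code"]

def reidentification_risk(df: pd.DataFrame) -> float:
    """Fraction of rows whose quasi-identifier combination is unique.

    A unique combination means anyone holding an outside dataset with
    those same attributes can pin the record to one person.
    """
    sizes = df.groupby(QUASI_IDENTIFIERS)[QUASI_IDENTIFIERS[0]].transform("size")
    return float((sizes == 1).mean())

toy = pd.DataFrame({
    "birth_date": ["1970-01-01", "1970-01-01", "1985-06-15"],
    "gender": ["F", "F", "M"],
    "zip_code": ["55431", "55431", "80528"],
})
print(reidentification_risk(toy))  # 1 of 3 rows is unique -> ~0.33
```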
Bill Tolson:
Wow. That is amazing; the fact is that you cannot be anonymous anymore. One of the things about this software, and you probably know this, is that you can redact data or anonymize it, which means even at the field encryption level you can go in, take a piece of PII, encrypt it, and then throw away the encryption key. With pseudonymization, by contrast, you can always recover that data. But it's to the point now where you don't want to give anybody any information.
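A minimal sketch of the distinction Bill is drawing, assuming the Python cryptography package: encrypting a field with its own key keeps it pseudonymous (recoverable while the key exists), and destroying that key, sometimes called crypto-shredding, is what makes it effectively anonymous. The key vault here is a stand-in for a real key management service.

```python
# Minimal sketch: pseudonymization vs. anonymization by key destruction
# ("crypto-shredding"), assuming the "cryptography" package.
from cryptography.fernet import Fernet

key_vault = {}  # stand-in for a real key management service

def pseudonymize(record_id: str, value: str) -> bytes:
    """Encrypt a field with its own key; reversible while the key exists."""
    key = Fernet.generate_key()
    key_vault[record_id] = key
    return Fernet(key).encrypt(value.encode())

def recover(record_id: str, token: bytes) -> str:
    """Pseudonymized data can always be recovered with the stored key."""
    return Fernet(key_vault[record_id]).decrypt(token).decode()

def anonymize(record_id: str) -> None:
    """Destroy the key; the remaining ciphertext is effectively anonymous."""
    key_vault.pop(record_id, None)

token = pseudonymize("subject-42", "123-45-6789")
print(recover("subject-42", token))  # works: still pseudonymous
anonymize("subject-42")              # throw the key away
# recover("subject-42", token) would now raise KeyError: unrecoverable.
```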
Steve Elkins:
Most people would be shocked that there's a ton of information about us that has always been public, a lot of information, except that if you wanted to find any of it, you had to go down to the county courthouse and dig through manila file folders for hours.
Bill Tolson:
Yeah.
Steve Elkins:
And almost all of that information has been digitized now and is widely available. I mean, just think about your property tax records, property records of the county, business registrations with a secretary of state's office. And there are companies out there that are hoovering all of this information up and integrating it and publishing it. And I tell people, if you think you can be anonymous, just go out and Google your own name plus the name of your city and you will be shocked at what pops up.
Bill Tolson:
Yeah, it's amazing. Related: I probably get, at least once a week, an email from the company that monitors my identity and makes sure it's not being misused. I think I get a warning from them once a week that says your email and password were found on the dark web, change your passwords again. It's like you're changing your passwords almost every week now.
Bill Tolson:
The scary thing is that these breaches can happen, they could take 200 million records, and the company doesn't even know it happened for nine months. So, imagine what's going on in all of that time. It is truly scary.
Bill Tolson:
Now we're running out of time, but one of the things I wanted to at least ask you: the GDPR was the first major privacy and security law that brought up this idea of privacy impact assessments. Should companies that hold personally identifiable information be looked at once a year, or once every three years, or whatever it happens to be, to make sure that their processes, their security procedures, and so forth are at a certain level?
Bill Tolson:
And with the fall of the Privacy Shield, the GDPR now says, "Well, you've got to conduct a privacy impact assessment" anytime you're going to transfer data from the EU to, in this case, the US, and it has to be done by a third party and so forth. I'm wondering if you think something like that within a state bill, a requirement to conduct privacy impact assessments on a regular basis, whatever that timeframe is, would make sense.
Steve Elkins:
Yeah, I think so. There is an article in the Washington Privacy Act that speaks to that, and as we discussed earlier, that's an area that I'm probably going to beef up in my version of the bill. I think there are seven principles of privacy by design that are commonly accepted at this point, and I'm looking at those seven principles and figuring out how to include the responsibility to abide by them in my bill.
Bill Tolson:
Yeah, that's perfect. And again, I've been referring back to the GDPR a lot here lately, but one question: do you see the need for a requirement, for example, in a Minnesota privacy bill, to either limit or have more controls over the transfer of Minnesota citizens' data outside the United States?
Steve Elkins:
Well, I think one of the big challenges of trying to do this at the state level is this general issue that the business community is so concerned about, just in terms of how do I comply with bills in 50 different states? That one may have to wait until Congress finally looks at all of our collective body of work and acts on it themselves.
Bill Tolson:
Yeah. That's a great point. I hadn't thought of it that way. But that's absolutely perfect.
Steve Elkins:
And there are definitely things that we can try to legislate at the state level that aren't really going to be very effective until there is national legislation. And there does need to be national legislation. I've told people, "Look, I'm under no illusions that anything I'm working on right now is going to exist in the long term in the form in which I get it passed." But all of us who are working on this issue at the state level hope that we're influencing the future form of the legislation that Congress will eventually pass.
Bill Tolson:
That's absolutely right on. Absolutely. So, Representative Elkins, or Steve, sorry, I think that about wraps up this edition of the Information Management 360 podcast. I really want to thank you for this insightful and genuinely enjoyable discussion today around the importance of data privacy and what the states, and state representatives like you, are doing. I think you're leading Congress on this, and I think in the long term, it's going to help.
Bill Tolson:
So, if anyone listening has questions on this topic or would like to talk to subject matter experts, please send an email mentioning this podcast to info I-N-F-O at Archive 360 dot com. And we'll get back to you as soon as we can. Also check back on the Archive 360 resources page for new podcasts with leading industry experts on a regular basis. And also, you can search iTunes, Spotify, Google for these podcasts as well.
Bill Tolson:
In fact, and I think I mentioned this, but I'll mention it again, next month I'll have Colorado State Senator Lundeen, the coauthor of the Colorado Privacy Act, as a podcast guest as well. So, Steve, again, this was really interesting for me. Hopefully, you had a good time, and I really want to thank you for taking the time.
Steve Elkins:
I enjoyed the conversation as well. And thank you for the opportunity.
Bill Tolson:
Thank you, sir.