Description:
In this episode we interview Ryan Harkins, Senior Director of Public Policy at Microsoft. Ryan discusses the state of data privacy legislation in the US and Microsoft's long-standing advocacy for strong federal data privacy legislation. With states no longer waiting for the federal government to lead, businesses are left struggling to comply with a patchwork of privacy laws with differing scopes, definitions, and penalties for non-compliance.
Speakers
Ryan Harkins
Senior Director, Public Policy
Microsoft
Ryan Harkins is the Senior Director, Public Policy at Microsoft, where he leads efforts to advance a range of issues, including state and federal privacy legislation, the regulation of AI, and democracy. He frequently testifies before legislatures and speaks at conferences and other events. Ryan also serves as an Adjunct Professor at Seattle University School of Law, where he teaches courses on privacy law, cybercrime and the Fourth Amendment, the impact of AI on civil liberties, and voting rights and election law.
Bill Tolson
VP of Global Compliance & eDiscovery
Archive360
Bill is the Vice President of Global Compliance for Archive360. Bill brings more than 29 years of experience with multinational corporations and technology start-ups, including 19-plus years in the archiving, information governance, and eDiscovery markets. Bill is a frequent speaker at legal and information governance industry events and has authored numerous eBooks, articles and blogs.
Transcript:
Bill Tolson:
Welcome everyone to the Archive360 Information Management 360 Podcast. This week's episode is titled A Discussion on US Privacy Legislation With Microsoft’s Ryan Harkins. My name is Bill Tolson and I'm the Vice President of Compliance and E-discovery at Archive360. Joining me today is Ryan Harkins, Senior Director of Public Policy at Microsoft. Ryan, welcome, and thanks again for joining me today to discuss what's going on with data privacy legislation in the US in what will probably be a really interesting discussion.
Ryan Harkins:
Thank you, Bill. Thank you for having me and I'm happy to be here.
Bill Tolson:
Excellent. Well, Ryan, it seems as though we are witnessing an explosion of privacy legislation, not only in Europe and other countries around the globe, but here in the US as well. I understand that Microsoft has been calling for privacy legislation for quite some time. Why is that?
Ryan Harkins:
As you know, Bill, we have long taken privacy seriously at Microsoft, and it really dates back to a memo that Bill Gates wrote at the company over two decades ago, the Trustworthy Computing memo. And it was a memo in which Bill was predicting the issues that would become increasingly important as computing and software and services moved online, namely as the world moved towards what we now call cloud computing, and he predicted that several issues among others would be crucial to earn the public's trust in online services.
The first was availability. Online services should always be available when they're needed. The second was security. As he put it at the time, quote, "The data our software and services store on behalf of our customers should be protected from harm." And the third was privacy. In short, users should be in control of their data. And that commitment ultimately led us to start calling for comprehensive privacy legislation in the United States in 2005. And it's that core commitment, which is why we continue to call for new privacy laws today. In our view, robust new laws are needed to address real and serious privacy concerns that the public has raised. And without new laws in place, our view is that it will be exceedingly hard for the industry to earn back the trust of the public. It seems clear to us that the industry has lost the trust of the public, and in order to fix and remedy that, we need, among other things, new laws on the books.
Bill Tolson:
Yeah, no, that's a great way to look at it. I didn't realize that Microsoft had actually been at this since 2005. That's great. We're witnessing this explosion of privacy legislation around the world. I think there are approximately 145 countries now with some form of privacy legislation, but also, I think in the last two years, probably starting with the California CCPA, several other states have put privacy legislation up. Many states have actually put bills out, but what are we at now? Five states, Ryan, that have actually passed privacy laws?
Ryan Harkins:
Yeah, that's right, five states have passed laws. California, of course, was the first, passing the California Consumer Privacy Act in 2018, a law that they of course updated with a ballot initiative, the California Privacy Rights Act in 2020. Virginia was second, passing a law that in many respects took the framework that had been introduced in a bill in our home state of Washington, the Washington Privacy Act, a bill that unfortunately has not gotten over the finish line in Olympia, but Virginia became the second. Colorado then passed what I think is a stronger version of that kind of a law, a law based upon the framework that was originally embodied in the Washington act. And then Utah, and now Connecticut, became the fourth and fifth states to pass comprehensive privacy laws.
Bill Tolson:
I've actually had several of the state senators, the co-authors of their laws, on, and I haven't had Senator Lundeen from Colorado, my home state, on yet, but they have said that they will when they get some time. But I've talked to Senator Marsden in Virginia, Senator Cullimore in Utah, and several others. And they've all had really interesting takes on their bills. I guess one of my questions is, I think the obvious outcome is that by next year there are going to be many more states that have passed privacy bills. With this wave of state privacy bill activity, do you think that's a good thing?
Is that a healthy thing? It would be interesting to track that over time, but what kind of issues is that going to cause for businesses in general, trying to track these different laws? They are relatively similar in many cases, many of the rights are the same and so on, but I've noticed from bill to bill, some of the definitions are slightly different. The fines are different. The exemptions are different, things like that.
In talking to companies that I deal with on a daily basis, they're all starting to get just a tiny bit leery. Tiny is probably the wrong word. They're getting somewhat leery about how they're going to track all of these various bills as they pile up on each other. And I personally think it is a good thing that the states, with industry's help and with industry probably pushing, like you said, Microsoft's been working at it for so long, are starting to take these things seriously. And they're probably getting some very good feedback from their citizens saying we need this. So I would suspect that Microsoft sees this new push of state privacy legislation as being a good thing?
Ryan Harkins:
Absolutely. I think it's a great thing. As I mentioned, we started calling for comprehensive privacy legislation in the United States 17 years ago. We were not able to get a lot of traction at the federal level for a number of years. And it became clear really four or five years ago, in 2017, that the states were not going to wait for Congress to act, when Alastair Mactaggart, a successful real estate investor in California, pulled a team together and created a ballot initiative in California to pass a comprehensive privacy law. And that initiative really was the impetus for the California legislature ultimately passing the California Consumer Privacy Act in 2018, the first comprehensive privacy law in United States history.
And we think that that is super important for the reasons I mentioned before, namely, that concerns about privacy, and about the amount of data that is being created and stored and analyzed and shared about every one of us by the tech industry and by other companies, have become increasingly concerning to people. And if we don't pass new laws to establish rules of the road, to provide consumers with strong and credible privacy protections, while also providing businesses with the certainty they need to know what they must do to comply with the law and what they must do to continue to use data in beneficial and innovative ways, then we'll see the trust of the public, of our customers, of governments, et cetera, in online services undermined. And so it's really important, in our view, to pass new privacy laws in order to earn back some of that trust.
And to some extent, the United States really has been behind in this regard. If you step back and look at the historical arc of privacy and data protection regulation, the United States traditionally has taken a sectoral approach, or a laissez-faire approach, to regulating data. In the United States, we have not had an omnibus comprehensive privacy law to address all privacy issues across all industries in one place. Instead, we've had narrow, issue-specific or sector-specific privacy laws.
So think HIPAA and the HITECH Act, which address privacy in healthcare. Think the Gramm–Leach–Bliley Act, which provides some protections for privacy in the financial industry. Think the Children's Online Privacy Protection Act, or COPPA, which addresses the collection of personal information from kids under 13 online. And undergirding all of that, we've had consumer protection laws like Section 5 of the FTC Act, which prohibits unfair or deceptive trade practices and has been applied to the collection and processing of personal data to provide some measure of broad-based protection. But that system still, at its heart, has been issue-specific or sector-specific. And that means that there are gaps.
And that system has been in direct contrast to the system that our friends in the European Union have long employed, where beginning in 1995, when they passed the Data Protection Directive, they have long had an omnibus, comprehensive data protection law to address privacy issues across industries. And that's really important. It's become increasingly apparent that the United States needs to have a similar, broad based comprehensive approach. As I mentioned, much of the rest of the world has already started racing forward. The Europeans updated their law, the Data Protection Directive with the General Data Protection Regulation or GDPR. A growing list of countries around the world have begun adopting their own data protection laws, which more or less are based upon the GDPR.
And so we think it's been important to support the advancement of new privacy laws wherever that can happen here in the United States. I mentioned it became apparent five years ago that the states were no longer going to wait for Congress to act. And California became the first, followed by Virginia, Colorado, Utah, and Connecticut. We made a decision that after 17 years of waiting for a federal law, it was important to lean in and be supportive of efforts in state capitols to try and advance the ball and pass comprehensive privacy laws. And so while we and others would, of course, love to see a uniform standard for the entire country, we also think it's important to support the efforts of lawmakers in state capitols to make progress on this issue and pass comprehensive privacy laws.
Bill Tolson:
So Ryan, do you and your group or Microsoft have a particular state privacy bill or law that you think is the most comprehensive and closest to achieving the privacy rights you think citizens are wanting? Do you have kind of an example, a bill or law that you say, "This is what we really thought was closest to what we were hoping"?
Ryan Harkins:
I don't know that I would pick a favorite of all the laws that have passed. I do think it's important in this space to avoid letting the perfect be the enemy of the good.
Bill Tolson:
Yeah.
Ryan Harkins:
And with a topic as complex as privacy, no matter what law passes, you could always perhaps pick at a provision here or a comma there and say, "Well, maybe we could do things differently," but I do think most of the laws that have passed contain a lot to like. And in some respects, I think it's important for policy makers to continue to innovate in this space, to continue to look for better ways to regulate privacy, and to look for new ideas and new concepts that may help address concerns that their constituents or other stakeholders have with respect to the regulation of data.
In our view, there are several key components that we think ought to be part and really have to be at minimum part of any comprehensive privacy law for it to be credible, for it to provide strong and credible protections to consumers, for it to provide clear rules for businesses so they can innovate responsibly, and frankly, for the law to be interoperable with global privacy laws.
The first is that the definitions have to be strong and have to ensure that the law will apply to the kinds of modern online data sets that are used to track consumers on the internet today. And that means, among other things, that the definition, whether of personal data or personal information or personally identifiable information, whatever term is used, has to cover things like targeted advertising profiles or other commercial data sets that are oftentimes stored not directly with a consumer's name, but with what laws often call identifiable information. So information stored with a cookie ID, an IP address, a device identifier, or other persistent unique identifiers that are used to track consumers. And failure to cover those kinds of modern online data sets would effectively render the protections in a privacy bill, if not meaningless, much less effective, because it's those types of data sets that are used to track consumers online today.
And this is a space where the words on the page always matter when it comes to laws, but they really matter here, because small changes in language, changes that might appear to be relatively innocuous, can have an outsized impact. I think we've seen, unfortunately, efforts by some in industry to try and chip away at those definitions, to either narrow the definition of personal data, or perhaps expand the definition of de-identified data, or introduce other concepts, whether it's pseudonymous data or other things, in a purported effort to lay the groundwork to argue in the future that modern online data sets, the kinds that are used to track consumers, or at least some portion of them, would somehow not be subject to a privacy bill's provisions.
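To make the definitional point concrete, here is a minimal, purely illustrative sketch (not drawn from any particular statute): a targeted-advertising profile that contains no name at all, but is keyed to the kinds of persistent identifiers Ryan describes. The record fields and the two candidate definitions below are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only: shows why a "personal data" definition must reach
# records keyed to persistent identifiers, not just records containing a name.
# All field names and both candidate definitions are hypothetical.

ad_profile = {
    "cookie_id": "c9f3a1e7-2d4b-4f7e-9a10-abc123",  # persistent browser identifier
    "device_id": "A1B2-C3D4",                        # mobile advertising identifier
    "ip_address": "203.0.113.42",
    "interests": ["mortgages", "fitness", "travel"],
    # Note: no "name" field anywhere in the record.
}

def covered_by_narrow_definition(record: dict) -> bool:
    """Narrow definition: data is 'personal' only if it directly names a person."""
    return "name" in record or "email" in record

def covered_by_broad_definition(record: dict) -> bool:
    """Broad definition: data linked or reasonably linkable to a person or device,
    including persistent unique identifiers."""
    identifier_fields = {"name", "email", "cookie_id", "device_id", "ip_address"}
    return any(field in record for field in identifier_fields)

print(covered_by_narrow_definition(ad_profile))  # False - the profile escapes the law
print(covered_by_broad_definition(ad_profile))   # True  - the profile is regulated
```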
Bill Tolson:
That kind of brings up a related question, now that you mention it. Tell me if I'm wrong, but I think among the five state privacy laws out there, there's a difference, and I don't know the numbers, but there's a difference potentially with, say, the right to deletion. And if I'm going down a path that doesn't make sense, tell me, but I remember, I think it was one or two of the senators I talked to, that if I put in a data subject access request, or whatever the various laws are calling it, asking, "What data do you have on me?", then I might say, "Okay, I want my data deleted." I thought that some states were kind of parsing that to say, "Well, okay, we will delete the data that we got from you filling out forms or giving us consent for something, but other PII that we got from publicly available information, we're going to keep."
And then I think other states that I read it was if you put in a deletion request, it doesn't matter where you got the data, if you have any PII on that data subject, that all has to be removed. Am I making any sense there?
Ryan Harkins:
No, you are, and you're putting your finger on another great example of how the words on the page really matter. And you're also highlighting the second component that we think is incredibly important for any privacy law. And that is to have robust rights that will empower consumers to protect their data. And we have seen a difference in the right to delete across the different privacy laws.
So just as an example, in California's law, the right to delete applies to personal information obtained, quote, "From a consumer." And so what that means is that arguably the right to delete would not apply to personal information that a company obtains from a source other than a consumer.
Bill Tolson:
Right.
Ryan Harkins:
So, for example, personal information that you purchase from a data broker. The same language also could lead to arguments that the right to delete would not apply to inferences derived from data. And so those are two important categories that would be left out. If you fast forward to the law that passed in Virginia, Virginia would apply the deletion right to personal data that is obtained from or about a consumer.
Bill Tolson:
Got it.
Ryan Harkins:
So under Virginia, the right to delete would apply to two of those three categories I mentioned. It would apply to personal data that a business gets directly from the consumer. It would apply to personal data that a business obtains from another source, like purchases from a data broker. But arguably it would not apply to the third category, and that is inferences that are derived about a consumer from other data.
Bill Tolson:
That's interesting. I hadn't thought of it that way. That is really interesting.
Ryan Harkins:
Yeah. And then if you fast forward to the law that passed in Colorado, Colorado's deletion right applies to personal data concerning a consumer, which means that it is the broadest and it would apply to all three categories. It would apply to personal data that a business gets directly from a consumer. It would apply to personal data that a business gets from a third party source, like a data broker. It would apply to inferences that a business derives about a consumer from other sources of information. And that's a right that we support. That form of a deletion right is in GDPR. It's in other privacy laws around the world. And personal data is personal data. And if we're going to provide consumers with rights to control their data, those rights ought to apply to personal data no matter whether the company obtained it directly from the consumer, from a third party source like a data broker, or whether they have inferred or derived the information from other data sources.
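To illustrate the practical consequence of those three formulations, here is a minimal, hypothetical sketch of a provenance-tagged data store and a deletion handler. The scope labels loosely paraphrase the California, Virginia, and Colorado formulations discussed above; the data model and helper names are assumptions about how a business might model this, not a statement of what any statute requires.

```python
# Illustrative sketch: how the scope of a deletion right interacts with data provenance.
# The three scopes loosely track the California / Virginia / Colorado formulations
# discussed above; everything else (field names, helper names) is hypothetical.

from dataclasses import dataclass

@dataclass
class PersonalDataRecord:
    subject_id: str
    value: str
    provenance: str  # "collected_from_consumer", "third_party", or "inferred"

# Which provenance categories each formulation arguably reaches.
DELETION_SCOPES = {
    "from_the_consumer":          {"collected_from_consumer"},
    "from_or_about_the_consumer": {"collected_from_consumer", "third_party"},
    "concerning_the_consumer":    {"collected_from_consumer", "third_party", "inferred"},
}

def handle_deletion_request(records, subject_id, scope):
    """Return the records that survive a deletion request under the given scope."""
    deletable = DELETION_SCOPES[scope]
    return [r for r in records
            if not (r.subject_id == subject_id and r.provenance in deletable)]

records = [
    PersonalDataRecord("user-42", "email from signup form", "collected_from_consumer"),
    PersonalDataRecord("user-42", "address bought from a data broker", "third_party"),
    PersonalDataRecord("user-42", "inferred interest in home loans", "inferred"),
]

for scope in DELETION_SCOPES:
    remaining = handle_deletion_request(records, "user-42", scope)
    print(scope, "->", len(remaining), "record(s) remain")
```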
Bill Tolson:
That is wild. I absolutely agree with everything you just said and think it's great. I'm sitting here thinking about what organizations, what companies are going to have to do, and we all do it, we collect data for newsletters and all kinds of neat stuff. But now a company is going to have to track what PII they're collecting, where they got it from, and where that data subject is resident, and then ask, "Well, gee, when they ask for a deletion, do we just delete the stuff that we got via a form, or all of this other stuff you talked about?" Or are companies going to be faced with maybe a high-watermark type of reaction that says, "Okay, when we get a deletion request, we don't try to sort out which data to delete. We just delete it all. We don't try to track all that stuff because over time that's going to be relatively impossible anyway."
Ryan Harkins:
The system could become unnecessarily complex. You're highlighting, Bill, one of the reasons why having a uniform federal standard across the entire country would be helpful.
Bill Tolson:
Yeah.
Ryan Harkins:
I think the companies also could take another path. And that is to say, "Look, we're going to take the protections that, say, California provides, or perhaps the protections that Colorado has provided and simply apply those across the entire country to all consumers." That's an approach that we at Microsoft have taken. We committed several years ago to applying the rights that are at the heart of the GDPR to consumers worldwide. We also committed a couple of years ago to apply the rights that are at the heart of California's new privacy law to consumers across the United States, because we think providing those protections to all of our consumers and all of our customers is especially important.
Bill Tolson:
And you mentioned a federal law, and we'll talk about that in a little bit, but a federal law that includes preemption of the state laws means you're dealing with one law versus 51 or whatever it happens to be. I know the companies that I talk to are all hoping for a single bill that they can follow versus trying to parse all of these other ones as well.
But one thing I wanted to ask, as I've said, we've done podcasts with various state legislators. One of the podcast guests I had on several months ago was New York State Senator Kevin Thomas, who co-authored several privacy bills in New York over the years. They still, I don't believe, have gotten one through yet, but in various earlier versions of his privacy bills, and including I think this year's as well, he's always included a requirement to obtain opt-in consent for all collection, sharing and transferring of personal information. That's one thing that I talked to him about during the podcast. He felt very strongly about that versus opting out, for example.
But one of the things that he has included in all those bills as well is he wants to put a section in there that imposes affirmative duties on companies that collect and use personal information. Earlier on, he's called it several things, but I think in his latest bill, he referred to it as a duty of loyalty and care. Do you have an opinion on that? Because I don't think I've seen that in any other of the state bills. Can you explain what that is?
Ryan Harkins:
Sure. I would say first to back up at the outset, I'd like to applaud and thank Senator Thomas for his leadership in trying to advance comprehensive privacy legislation in New York. He has been working on this issue for several years. I certainly appreciate the thought and the care that he has put into his bill. And it's why we issued a memo this past May expressing support for the framework that he has developed and the manner in which he has continued to hone and tweak it as the years have passed. I think that he does a lot of things well in the New York Privacy Act and has a lot of the components that we think are incredibly important to any comprehensive privacy bill.
The first, of course, is that it has strong definitions to ensure that the bill will in fact apply to modern online data sets. He also does in fact provide rights to consumers that are important to empower people to control their data. And particularly in the most recent draft, he takes a nuanced approach to consent, which is in our view designed to help ensure that consent will be effective, that it will actually empower consumers to make specific, informed choices in a way that makes a real difference.
As you mentioned, Bill, we've seen legislation over the past couple years, in some instances, that would purport to require opt-in consent for all data collection and use. And that kind of approach in my view is both over broad and ineffective and will lead to unintended consequences that will fail to advance privacy for consumers. And the challenge is that in order to ensure that a privacy law applies to the kinds of modern online data sets that pose real privacy concerns, it has to define personal data broadly.
And so personal data has to be defined to cover identifiable data, so data associated with persistent unique identifiers like IP addresses and device IDs and cookie IDs. And that means that personal data will in fact, as it should, cover basic web browsing data that your computer sends to each website simply by virtue of navigating to the site. And so if a website is required to obtain opt-in consent from consumers just to collect that type of data, just to collect your IP address and associated information that's sent to the website in conjunction with your IP address, it effectively means that every website or online service that consumers access on the internet will have to be gated.
Bill Tolson:
Yeah.
Ryan Harkins:
And so anytime you want to go to a website, you as a consumer will be forced to stop and navigate some type of consent experience to determine whether you provide consent to send to the website basic information that your computer sends to every website just by going to the website.
And the end result is likely to be something that academics call notice fatigue, where users get so used to having to deal with these consent experiences and can become irritated by them so that they will simply click through them without reading them and without really being empowered to make an actual choice.
Even if users wanted to read through all of those kinds of consent experiences and privacy policies and the like, there are simply too many of them for consumers to be in a position to do that. All the way back in 2007, two researchers at Carnegie Mellon University, Lorrie Cranor and Aleecia McDonald, conducted a study and determined at the time that it would take the average American something on the order of 76 working days just to read all of the privacy policies for all the websites that they encounter on the internet over the course of the year. It's just not workable. And frankly, that kind of approach is asking consent to do too much.
Now, some bills, including previous iterations of Senator Thomas's bill, have attempted to resolve some of those concerns, for example by limiting the need to obtain consent for basic web browsing data, stating that a website doesn't need to obtain consent for that data initially, but that if the consumer doesn't provide consent through that gated experience, the site has to promptly delete it.
The fundamental problem in my view is that websites will inevitably come up with clever ways to try and skirt those kinds of prescriptive rules and convince consumers to consent. It's why I say bills that would try to require opt-in consent for all data collection and use are not just over broad, but they're counterproductive. And in some ways, they actually let businesses off the hook because if you can convince a consumer to provide, quote, unquote, consent, the business can then just move forward and do what it wants.
And so, in a way, it's a way of shifting the burden of regulating online privacy off of businesses and onto the shoulders of each consumer, which is really not fair to consumers. And I think in some ways, allows companies to abdicate their responsibilities. And it's also not necessary to do that in order to really provide rigorous, robust protections for consumer privacy.
And so I think a smarter way to think about consent is for policy makers to identify those specific processing activities that they think pose greater privacy risks to consumers, and that therefore ought to require some form of consent. And we've seen various laws in lots of different bills take different cracks at doing that. So they'll define or identify sensitive data. So data relating to race, ethnicity, religious views, biometrics, specific geolocation information, et cetera, and say, "In order to process that kind of data, you need to get consent."
And then we've seen them identify other activities that they think pose higher privacy risks. So the sale of data, processing personal data for targeted advertising, profiling consumers for important decisions, like whether to provide someone with access to credit, housing, insurance, employment, education, et cetera, and say, "For those activities as well, we think there should be a consent requirement." And then once you've done that, you can ask the next question, and that is, "What standard do we think ought to be met in order to say consent has been obtained?" Is it opt-in? Is it opt-out? Or is there an option in the middle of those? Say opt-out with a requirement that there be some kind of easy button, or what some people call an opt-out preference signal. In other words, a place, perhaps in your web browser, perhaps in your operating system on your computer, where consumers can press a button and, at one place and one time, exercise their opt-out right with respect to certain activities for all websites.
Bill Tolson:
Right.
Ryan Harkins:
That's effectively the approach that Senator Thomas has taken with his bill. He would require opt-in consent to process sensitive data. He would require opt-out consent for things like the processing of personal data for targeted advertising, data sales, et cetera. And then would also require compliance with an opt-out preference signal to allow consumers to exercise their opt-out right easily in a way that's effective, in a way that won't force consumers to have to go to hundreds or thousands of different websites to exercise that opt-out right individually with each one of those sites.
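One concrete example of an opt-out preference signal is Global Privacy Control (GPC), which participating browsers express as an HTTP request header. The sketch below assumes GPC as the signal and is illustrative only; whether a particular law treats any given signal as a valid universal opt-out is a legal question, and the handler and preference names here are hypothetical.

```python
# Illustrative sketch: honoring a browser-level opt-out preference signal.
# Assumes the Global Privacy Control convention of a "Sec-GPC: 1" request header;
# the function and preference names are hypothetical.

def gpc_opt_out_requested(headers: dict) -> bool:
    """True if the request carries a Global Privacy Control opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def apply_opt_out_preferences(headers: dict, user_prefs: dict) -> dict:
    """Treat a GPC signal as an opt-out of data sales and targeted advertising,
    without requiring the user to visit each site's own opt-out page."""
    if gpc_opt_out_requested(headers):
        user_prefs = {**user_prefs, "sale_of_data": False, "targeted_advertising": False}
    return user_prefs

# Example request headers, as a web framework might expose them.
incoming_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
prefs = apply_opt_out_preferences(
    incoming_headers, {"sale_of_data": True, "targeted_advertising": True}
)
print(prefs)  # {'sale_of_data': False, 'targeted_advertising': False}
```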
Bill Tolson:
Yeah, no, that's very important. The whole idea of opt-out versus opt-in and the various levels of it does confuse people. And just a shout out to Senator Thomas, his podcast is by far the most downloaded of all of them so far so he's very popular. Very good speaker. He has a great voice, by the way. But if listeners have a chance, go back and listen to the podcast with Senator Thomas. It was very, very good.
So Ryan, the other state privacy bill topic that has come up a lot in the podcast with legislators is the concept of the private right of action. So far only the CCPA-CPRA in California includes a private right of action. So can you explain what a private right of action is? And do you think other bills that include, and I say bills, not laws because we'll get into that, but other bills that include a private right of action, is that an issue for the bill or is this a right that eventually is going to be included, do you believe, in more and more of the privacy laws?
Ryan Harkins:
Well, all a private right of action means is that a consumer would have the ability to file a lawsuit in court. And unfortunately, this issue more than any other has helped derail efforts to pass privacy laws across the country. Four of the five laws that have passed in the states, Connecticut, Colorado, Utah, and Virginia, do not provide private rights of action. They vest enforcement authority solely in the hands of the attorney general. California does have a private right of action, although it's a fairly narrow one. They provide a private right of action for data breaches that result from negligent security practices.
And we've seen this issue in our home state of Washington, for example, where Senator Reuven Carlyle, who was the author of the framework in the Washington Privacy Act, a bill that I mentioned, unfortunately, has not been able to get over the finish line, but a framework that has ultimately found its way more or less into four of the other laws that have passed, the laws in Colorado, Connecticut, Virginia, and Utah. But this issue, whether to include a private right of action or not or the failure to get agreement on it, helped prevent a privacy law from passing in Washington state.
And it's really too bad. I think that when the concept of private right of action comes up, oftentimes there are some stakeholders that view it as all or nothing, as black and white. You either have it or you don't. And in reality, there are a lot of different iterations of things that you could provide consumers to empower them. For example, to ensure that their rights are carried through without perhaps subjecting businesses to broad class action liability for technical violations of what would be a brand new and incredibly complex regulatory regime where there aren't any actual harms at issue.
And those are the sorts of concerns we hear from some businesses, that anytime the words private right of action come up, people become very concerned and I think their head goes to the worst case scenario. "We're going to be subject to expansive class action liability for technical violations of a really complicated new regime in cases where there might not even be real harms." And there are strong views on all sides of the enforcement issue, and especially on this topic of whether there should be a private right of action.
We have really tried to encourage people on both sides of the issue over the last few years to come together and find workable solutions, find solutions, for example, that could empower consumers to enforce their rights. And that could look like a number of things. It could look like permitting consumers to file lawsuits to seek injunctive relief in court, or to seek damages for actual harms that are caused by violations of their privacy rights. And there are things that I think on the flip side that could be done to allay the concerns that you may hear from businesses that they will be unfairly targeted with lawsuits for technical violations of the laws.
Bill Tolson:
Yeah.
Ryan Harkins:
And this is a topic that we've seen Senator Thomas take an innovative approach to. We've seen other state bills attempt to include innovative approaches on this topic. And it's an issue that I think will come up and it is coming up with respect to federal privacy legislation. And this issue is too important to fail to get privacy laws over the finish line, because I think there are workable solutions in this space that reasonable people from the advocacy community and civil society, policymakers, lawmakers, and reasonable people from the business community could agree upon.
Bill Tolson:
Yeah. In my recording of podcasts with the senators from Virginia, Colorado, and Utah, I brought up this private right of action. And they all said yes, in their original bills, they had included it or things like it, but they had to basically let them be negotiated out to get the bill to pass. They said it was such a sticking point that they'll revisit it with amendments in the coming years, but to get the bill passed into the law, they had to let it go.
Ryan Harkins:
Yeah, and I think that's the political calculus that other policy makers have had to make. So it's an issue that we just have to keep working on in my view.
Bill Tolson:
Sure. Yeah. On another subject, I've been noticing this in articles, but also in talking to privacy advocates, they've suggested that there's this group, this actually little known group, I didn't know about them, called the SPSC organization. I think SPSC stands for security and privacy in speech communication. They actually work with, I believe, tell me if I'm wrong, Ryan, but they attempt to work with various state co-authors and authors of privacy bills to help them create privacy laws.
But the pushback from many has been that the organization, and it's made up of businesses, tech firms and stuff like that, they really advocate for less prescriptive and generally weaker privacy regulations. Specifically, they indicate that the organization's interest focuses on making the potential state privacy laws more tech industry friendly, which I sort of understand, but by doing this, they're potentially making the state privacy laws less likely to actually protect user personally identifiable information as effectively as they should. Does Microsoft have a stance on this organization? The SPSC?
Ryan Harkins:
So the organization, yes, it's the State Privacy and Security Coalition.
Bill Tolson:
Oh, okay. I was wildly wrong on that.
Ryan Harkins:
No, you were close, but it is an organization that has a broad membership. And there are companies who are members of it from a variety of industries. So it includes big tech, it includes internet service providers or telcos, it includes retailers and data brokers among others. And frankly, we share the concerns that others have raised about SPSC's tactics and the role that they have been playing in legislative discussions on privacy. And they appear to be working in tandem with another trade association, TechNet, to influence privacy laws in ways that in our view are counterproductive.
We used to be members of both SPSC and TechNet, but we dropped our membership in both organizations two years ago, because in our view, they weren't on the same page as our company on privacy. We are interested in trying to pass strong, credible privacy legislation, legislation that will not only provide industry with clear rules of the road, but will actually provide consumers with robust privacy protections. Because passing credible legislation is essential to achieve what in our view should be the overarching goal of privacy laws. And that is to provide consumers with real protections and earn back the trust of the public in technology and online services.
And from my vantage point, a lot of SPSC's efforts have led either to privacy bills failing or to amendments that, as you put it, weaken or water legislation down so that it would have very little impact. And that includes efforts to narrow the definitions of personal data, or expand the definition of de-identified data, or use other terms or techniques, largely in order to try to argue that laws will not apply to modern online data sets, or at least that there are certain modern online data sets that they can argue will for some reason or somehow not be subject to a bill's protections.
And it includes efforts to try to narrow rights that are provided consumers, either narrowing the deletion right in some of the ways we discussed earlier, not including a correction right, or trying to narrow the definition of sale so that the right to opt-out of the sale of your personal data would somehow be narrower. And I think to some extent, it also has included an effort on their part to oppose or fail to consider reasonable enforcement regimes, reasonable compromises on that issue that could empower consumers to enforce their privacy rights.
So because of all of those things, we dropped our membership in those organizations a couple of years ago, and I would really urge them to reevaluate their approach and to try and be more supportive of efforts to pass strong and credible privacy bills in states across the country.
Bill Tolson:
Yeah, that's absolutely fantastic. I fully agree. That's one of the issues that I've talked to lots of people about, we won't get into it here, but eventually will the state privacy laws where they mention security requirements, will they be more prescriptive in the future? Right now, they're not very prescriptive and a lot of people kind of bring that up, but we can talk about that at another time.
But we have one final topic here, Ryan, that I think we want to touch on. And it's around what we've already mentioned several times in this podcast, and that's federal privacy law. You mentioned this early on, but it seems as though there's a growing consensus among a range of organizations, including Microsoft for a long time, 17 years, like you said, for the passage of a federal privacy law.
In fact, I picked up on an open letter that the US Chamber of Commerce had published, I think back in January, directed at the US Congress, urging them to pass a national privacy law. And on May 23rd, The Philadelphia Inquirer published an opinion piece on the topic, titled Why We Need a Federal Data Privacy Law. The article really highlighted the need for an overriding federal privacy law to simplify data privacy compliance requirements for smaller businesses and so forth.
And I've mentioned, and I think you might have too, the idea of a business going from needing to track and manage 51 privacy laws with slightly differing definitions down to one preemptive law, which I think is great. And I think you may have mentioned this, but I'll ask it again, Ryan, what's Microsoft's stance on a preemptive federal privacy bill that would settle the state bills, take them off the table, and make it easier for businesses to be compliant?
Ryan Harkins:
Well, I'd say a couple of things. First is that there has been a lot of activity recently at the federal level to get privacy legislation moving. Senator Cantwell has been working on a privacy bill. There's another bill that people have referred to as the Three Corners Bill, sponsored by Representative Pallone, Representative Cathy McMorris Rodgers, and Senator Wicker, and we're thrilled by all of that activity.
As you mentioned, we've been calling for a comprehensive federal privacy law since 2005. And we would really encourage all the members of Congress to keep leaning in on this issue to find a path forward. And we stand ready and willing to help them in any way we can to try and pass a federal comprehensive privacy law.
The other issue that you were mentioning, Bill, is on preemption and should a federal privacy law preempt state privacy laws or laws that purport to regulate the collection and use of personal data. And of course we would love to have a uniform standard across the entire country that would provide more certainty and make it easier for businesses to comply with it. It would also potentially be better for consumers because you could ensure that everyone would enjoy the protections that at present only residents in California, Connecticut, Colorado, Utah, and Virginia will have.
But in our view, preemption ought to be the last issue dealt with in federal privacy legislation, because any federal privacy bill has to be worthy of preemption. Meaning that if it is going to preempt similar protections in the states, it has to be sufficiently strong and has to provide consumers with protections that are robust and protections that, at minimum, meet the floor that has been established by the five states who have passed comprehensive privacy laws.
So yes, absolutely. We would love to see a single uniform standard across the entire country. We think any federal privacy law has to be strong enough to be worthy of preempting state laws. And in the meantime, until we get a uniform federal comprehensive privacy law, we will continue to support efforts by state lawmakers in state capitols to make progress and pass comprehensive privacy laws to protect their residents.
Bill Tolson:
I really like the way that you discussed the whole idea of preemption and how it should be used. Like you said, having a very weak federal privacy bill that comes out and preempts all the great work that states have done would not go over well and would be actually very bad policy. So like you say, you've got to establish a floor before the preemption.
I actually, on the Three Corners Bill that you mentioned, I sat in on the, I think it was the House Energy and Commerce Committee hearing this week. I think it was about four hours, and Cathy McMorris Rodgers and Representative Pallone and others were in it. And they got really deep into a lot of stuff. It was very good. Obviously they had experts on the panel too, talking about various things.
And I've heard, and I don't know, you tell me, Ryan, obviously it's going to be edited and changed and amended and all kinds of stuff, but I've been told by lots of people, whether they know or not, that this Three Corners Bill may have the biggest chance of actually making it into law eventually.
Ryan Harkins:
I don't know. I know that Representative Pallone and Representative Cathy McMorris Rodgers and Senator Wicker have put in and are continuing to put in a lot of work on this issue. I know that Senator Cantwell and her staff are also putting in a lot of work on this issue. And we're thrilled and excited to see all of those congressional leaders committed to advancing comprehensive privacy legislation. And as I mentioned, we're ready to help them try and get that done in any way we can.
So there's one other component of a privacy law that we think is especially important. I mentioned privacy laws need to have strong definitions to ensure that they will, among other things, apply to modern online data sets. Second, they need to empower consumers with rights that are necessary to control their data. Third, there has to be strong enforcement in order to ensure that there's accountability, that companies will be held accountable for complying with the law. And the other is that in our view, there need to be affirmative obligations placed on the shoulders of companies to be responsible stewards of the personal data that they collect. Simply providing consumers with rights is necessary, but by itself, insufficient in order to provide real and credible privacy protections in the 21st century.
And there are a number of bills out there that take innovative approaches to doing just that, to imposing affirmative obligations on companies to process data responsibly, irrespective of whether a consumer decides to exercise his or her rights. And that can mean things like requiring companies to conduct risk assessments of their data processing activities, particularly activities that may prove higher risk or may cause a higher degree of potential privacy harms. It can include obligations of transparency, data minimization, limits on secondary use of data, duties to secure data. And there are other ideas that we've seen proposed, for example, prohibitions on using dark patterns. In other words, think prohibitions on being unfair or deceptive in the way you obtain consent from consumers or in enabling consumers to exercise their rights, perhaps duties to avoid abusive trade practices. And we've seen, for instance, Senator Thomas in his most recent privacy bill has included what he calls a duty of loyalty and I think of care.
Bill Tolson:
Yeah.
Ryan Harkins:
Which is also interesting, should companies proactively have duties to avoid processing the data they collect from consumers in ways that are deceptive, unfair, or otherwise abusive. So we think that there's more work perhaps to be done on the category of duties or affirmative obligations that companies ought to have, but it's clearly a key component that has to be part of any credible privacy law.
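As a purely illustrative aside on what affirmative obligations like data minimization and limits on secondary use might look like in practice, here is a minimal, hypothetical sketch of a purpose-limitation check. The purpose registry, field lists, and function names are assumptions for illustration; no statute prescribes this particular mechanism.

```python
# Illustrative sketch: enforcing purpose limitation / data minimization in code.
# The declared purposes and field lists below are hypothetical examples.

DECLARED_PURPOSES = {
    "newsletter": {"email"},
    "order_fulfillment": {"email", "shipping_address"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Keep only the fields needed for the declared purpose; drop everything else."""
    allowed = DECLARED_PURPOSES.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

def secondary_use_permitted(original_purpose: str, new_purpose: str) -> bool:
    """A crude secondary-use gate: reusing data for a new purpose needs a fresh basis."""
    return new_purpose == original_purpose

signup = {"email": "user@example.com", "shipping_address": "123 Main St", "birthday": "1990-01-01"}
print(minimize(signup, "newsletter"))                              # {'email': 'user@example.com'}
print(secondary_use_permitted("newsletter", "targeted_advertising"))  # False
```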
Bill Tolson:
Yeah. That's great. I'm glad you brought that up. And the whole idea, like you said, you mentioned dark patterns, and I've been following that for a while now and I've seen all kinds of what amounts to horror stories around dark patterns. I won't get into them now. I know we've gone a long way here.
Ryan Harkins:
Just on that topic and on efforts to advance privacy legislation more broadly, I'm really thankful and grateful for a lot of the work that lawmakers and other stakeholders have done over the past five years to advance this topic. This stuff is hard. Legislating is hard in general. I think you need a thick skin, not only to run for office and to be in positions of leadership, but to otherwise just engage in the process.
I certainly have seen over the past four or five years some of our efforts to try to be supportive and to engage and advance privacy laws mischaracterized at times in news stories and the like. I think it's very easy sometimes to simply portray any efforts from companies as, "Well, they must obviously be seeking just to pass weak privacy laws or privacy laws that won't provide real protections." And I do think there are some companies and some in industry who unfortunately have fallen into the habit or the pattern of doing that, but it's not what we've been trying to do.
And in California, I think to start where it all began, Senator Hertzberg and Assemblyman Chau and Alastair Mactaggart and his team did a lot of great work in getting the first privacy law over the finish line. I have a lot of respect and admiration for all of those guys.
What Virginia accomplished was remarkable. And I think that we should all be, or at least I am, thankful for all the work that Senator Reuven Carlyle did in Washington state to try to advance privacy legislation. In Colorado in particular, Senator Rodriguez and Senator Lundeen and Attorney General Phil Weiser's office were remarkable and did a lot of great work to get their privacy law passed and over the finish line.
And Senator Maroney in Connecticut, he embarked on what essentially was a two year process. He pulled together a working group that had stakeholders from civil society and from the business community among others to provide feedback to ultimately get Connecticut's privacy law over the finish line.
So I just would thank all of those people for all the work they've done. And I think this is incredibly important work, and I certainly stand ready, and I know Microsoft as a company stands ready, to work with any other state lawmakers, as well as members of Congress, to try and reform the law and pass new comprehensive privacy laws.
Bill Tolson:
Yeah, that's great. And I think the work that Microsoft and you and your group have done has been great and well received. And obviously I think you'll continue to be involved. And I do have an agreement to have Colorado Senator Lundeen, from my home state, on the podcast some time in the future, so I'm looking forward to that. I think I sent him an email not long ago saying, "Senator, we need to schedule this because I've already had Senator Cullimore from Utah on." And for those of you who don't know, there's a big competition on everything between Colorado and Utah. So I think that'll spur him.
With that, Ryan, I think we'll wrap up this edition of the Information Management 360 Podcast. I really want to thank you for this really insightful, fun, and enjoyable discussion today on this really important and timely subject. And the subject is not going to go away; as you said, there's still lots going on and it's going to continue. If anybody has questions on this topic or would like to talk to a subject matter expert, please send an email mentioning this podcast to info, I-N-F-O, @archive360.com, or my business email is bill.tolson, T-O-L-S-O-N, @archive360.com.
Also, check back on our Archive360 resources page, where we post new podcasts with leading industry experts like Ryan on a regular basis. I also have several podcasts that I have recorded with state representatives and senators, especially the one with Senator Thomas, that I think you would find really interesting. And in the next several weeks, we have several podcast recordings lined up, so there are another four, five, or six that you'll be seeing come up on the resources page. These podcasts are also published to iTunes and Spotify and many other platforms. But with that, Ryan, I want to thank you again. I had a great time and hopefully you did too.
Ryan Harkins:
Thank you for having me, Bill, I enjoyed it.