Presenters: Howard Fienberg, SVP, Advocacy, Insights Association; Stuart Pardau, General Counsel, Insights Association; moderated by: Melanie Courtright, CEO, Insights Association
Transcript Courtesy of Focus Forward & FF Transcription
Melanie Courtright: Hello, everyone that's joined us so far. Welcome. Very, very glad that you are here. We'll get started in just about two minutes. All right, so today's town hall is on state privacy law, compliance and developments. We're really excited to be with you today, just a couple of quick announcements. Today's session will be recorded and will be made available to you, along with a transcript and the slides. It'll come to you as part of a wrap-up of the town hall via email, but you can also find everything on our website in our town hall library at insightsassociation.org. The transcriptions will also be provided to you, thanks to our wonderful partnership with Focus Forward. Also, just a note: today we're going to be covering state privacy law, compliance and developments. I would like to remind you that the information we're going to share today is not meant to substitute for any legal, financial or other advice that might need to be provided by your own attorney, accountant or financial advisor, and that is specific to your business. It cannot substitute for formal legal advice. If you need formal legal advice and you don't have a relationship with a lawyer, accountant or financial advisor, then you can reach out to us and we'll do our best to connect you with someone who may be able to help you. And then the last housekeeping thing is that we will leave plenty of time for questions and answers at the end. If you have an actual question, we encourage you to put it in the Q&A window, but we also would love for you to bring up your chat and be very active in it: offer opinions, share resources with each other. We save the chat, and we can also make it available if there are resources in there that are important to people.
With that, we are blessed and pleased to be joined today by Howard Fienberg. As you all know, Howard is our senior vice president of advocacy at the Insights Association, our lobbyist focusing primarily on consumer privacy and security. He's joined by Stuart Pardau, general counsel of the Insights Association and founder and principal of Stuart Pardau and Associates, based in Los Angeles. Both of these gentlemen are your very best friends when it comes to resources and happenings in this particular area of expertise. With that, I'm going to hand it off to Howard, and Howard will kick us off.
Howard Fienberg: All right. I only really have a handful of things to cover with everybody today, but I appreciate you all coming onboard. The legislative side is perhaps slightly less interesting for most of you than what you actually have to comply with. Thankfully, not all of these states actually passed laws this year, but all of them certainly gave consideration to it. We were particularly instrumental in helping to defeat privacy law attempts in Florida, Indiana, Iowa, Maryland, Washington and Wisconsin. We were trying at the same time to improve them as much as we could, but frankly, killing bills is much easier than improving them in a lot of cases, and bills died all over the country. Some of these were better than others: some attempted to replicate CCPA, others copied some of the other state bills, and some were just kind of freelancing it. Those bills are still kicking around in a few other states. Massachusetts, Michigan and New Jersey in particular go all year long, as do Ohio, Pennsylvania and DC. North Carolina's going to close up its session in a few days, so hopefully that will be off the table. And New York, although they technically go out of session, anything can still be done in New York if they so choose. Next slide, please. Sadly, we didn't have a perfect record here, so we have a bunch of new laws coming into effect in 2023. Stuart's going to go into specifics on all of them, but the ones you probably know best are the California Privacy Rights Act (CPRA), which replaces CCPA; the Virginia Consumer Data Protection Act (CDPA); and the Colorado Privacy Act. The latter two were passed into law last year, and we were part of the back and forth on both of them, to varying degrees of success. This year, we were involved in Utah. We basically killed that bill a year ago, but once they decided to get rid of the private right of action, it was rammed through extremely quickly this year.
And Connecticut similarly moved through relatively quickly once all the parties decided that it didn't need a private right of action for private lawsuits. A deal was worked out relatively swiftly in Connecticut, and again, those are all coming into effect in 2023. I've been much more focused over the last few weeks at the federal level, because that's where our ultimate goal is. We've been working with a coalition we helped found, called Privacy for America, trying to pass a federal privacy law with the overarching goal of replacing this cacophony of conflicting state privacy laws with a federal standard that is the most conducive for insights that we can get, certainly without private lawsuits; that's certainly one of the goals. But preempting that mess of state laws is also very high on the agenda, and at the moment, three of the four leaders on privacy in the House and Senate are pushing together on a big compromise privacy bill, and it's starting to clean up a bit. A bunch of amendments have been made to it since they first introduced a draft a few weeks ago. The version that they passed out of the House subcommittee just yesterday morning contains a bunch of improvements that we specifically requested, including carving a few things out of their definition of sensitive covered data, which requires an opt-in to share. The online activities piece of that would have pretty much destroyed the audience measurement industry, so the fact that they took our lead and got rid of that inclusion in the definition is a big win for us, as is the clarification on audience measurement in the carve-out from the definition of targeted advertising. They did in fact adopt our definition of market research, which we worked on with Privacy for America over a long time, and they've specifically addressed market research as it relates to the use of incentives for participation.
That last one is not a high-level issue for anybody else involved in the bill, but it's a really big deal for us to make sure that we could still offer those incentives, because they're really taking a hack at loyalty programs in a variety of ways. At the same time as all that's going forward, the bill is most likely going to be marked up in the full committee sometime after the July Fourth recess. So sometime before Congress leaves town in August, they will most likely have passed a bill at least out of committee in the House. The lead Democrat in the Senate, Cantwell, has turned her nose up at this bill, but I think that there's still room to negotiate on a lot of the important points, even as it moves through the House, because it does have a limited right to private lawsuits and its preemption of state laws is kind of mediocre. There's a lot more work to be done. It is entirely possible that we could have a federal privacy law compromise passed in November or December, after the elections. So we've got a lot of work to do until then. At the same time, the Federal Trade Commission has a full complement of commissioners, and the chair, Lina Khan, is very eager to dig in on a variety of different privacy issues. She's very aggressive, not a friend to any kind of private business, so the FTC will most likely be engaged at the same time. And with that, I want to turn it over to Stuart, and I'll be piping in as needed during his presentation.
Melanie Courtright: Stuart, before you move on, one quick question from the group.
Stuart L. Pardau, Esq.: Please.
Melanie Courtright: And because it's sort of foundational – Howard fielded a question about this, too – in B2B research, are some of the B2B datapoints considered sensitive or PII, like respondent title, age, years in the field, or name of employer? Are those considered to be sensitive?
Stuart L. Pardau, Esq.: Yeah. Most of these laws are designed to protect individuals and consumers, not businesses. I'd say you're generally free in those areas, but that doesn't remove your liability risk if there is a breach, because people can still bring claims. And I think that when we focus – and I put myself very much in this category, too – on the personal information side of it, we sometimes lose sight of the fact that you can still have a breach that doesn't involve personal information. It could still be very damaging if you're harming a relationship with a client or you're inadvertently leaking confidential or other business proprietary information. The short answer is that a lot of these laws don't cover B2B, but it's still something to stay focused on, and it still carries liability risk.
Melanie Courtright: The Insights Association, as part of our codes and standards, does consider titles and company names to be PII, and we consider that you have an obligation to protect identity and let research remain anonymous. While it may not apply under some of the actual law around privacy, it still pertains to how we execute research: PII and no direct action.
Stuart L. Pardau, Esq.: Agreed. Agreed. This is the map we looked at before, the red states being the ones with laws currently in place, and the shaded ones being the ones that have things in motion or under consideration. Next one, please. CPRA: looking at the big fish, California still remains the big fish for all the reasons we discussed. CCPA took effect in January of 2020. California voters, in November of that same year, passed the CPRA; under California law, you can pass laws by direct ballot methods as well, and that's what happened with the CPRA. It takes effect January 2023, and I would say it in large measure adds further restrictions on business, making the law even stricter. Now, it is important to pause and note that once a law is passed, as the CPRA has been, there's usually a process by which regulations follow. The regulations are part of the rulemaking exercise. It's a legal process, but it's not done by the legislature or by the voters; it's done by the regulators. It happens at the federal level, and the same thing is occurring now at the California state level. The new agency, which was created by the CPRA, is promulgating these regulations, and there's a comment period, an opportunity for individuals to comment. Insights has commented on these regulations over the years, and we'll certainly be commenting on these. The agency has put them out in draft form, so they're currently under consideration. Obviously, there's some fluidity to this, but there are still some things that have been set in stone in the actual CPRA law that the voters passed. One of the key ones, which we'll really focus on, is that second item in red: "or sharing." There was always a question – and some of you who know me have talked to me about this when I presented at other Insights presentations – "Well, gee, we don't collect 50,000." Now it's 100,000.
But back in the day, it was 50. "We don't collect 50,000 records of California residents." Or there's another requirement, not on this list: "We don't have revenue of 25 million or above. And we don't sell any personal information." And my answer always was, "Yeah, you may not have 25 million in revenue. And yeah, you may not have 50,000 records of consumers or households. But you're deriving 50% or more of your annual revenue from selling." And people would say, "No, we're not. We're not selling." And I would say, "Look, the way selling is defined under the law, under the CCPA, it is clearly more than just the colloquial sense of selling. It really means sharing." And that was the advice we gave. And it turns out that the CPRA clarified that point: if it wasn't 85% clear before, it's 100% clear now. It now expressly includes sharing, and sharing of personal information is pretty much what a lot of research companies are going to be doing. There have been a couple of instances these past few years where I've been persuaded – or just educated is probably a better way to say it – that if you're truly handling only aggregated data, as some of you are, then you're not covered by this. This only covers personally identifiable information. If you're doing qualitative research where you're only looking at aggregate information, then there's a very good chance you're not covered by any of this stuff. Then it comes down to: am I handling anonymized information? Is it at a respondent level, but somehow deidentified so that it is truly anonymized, or what's sometimes called pseudo-anonymized? Possibly. We'll have to examine what personal information means. Personal information has a very broad meaning under California law, and under most of these other states' laws as well.
California expressly includes examples such as not just name, address, email, Social Security number, passport, or driver's license, but things like geolocation and IP addresses. But the basic principle that undergirds all of it is: if you can, as this last bullet point says, reasonably link or reasonably associate that information with a consumer or a household, bingo, you're in personal-information land and therefore covered by this law, as with most of these laws. Again, it's going to be a very case-by-case assessment, but we've got to keep our eyes on the ball. If you're only doing aggregate, and/or it's truly anonymized, meaning you cannot reverse-engineer identities under any plausible scenario, then yes, you are outside these laws. Under any other scenario, there's a very, very good chance that you are within them. Next slide, please. Some key changes to the CCPA: they've expanded the definitions of service providers, contractors, and third parties. We could spend a lot of time on this, but you'll have the benefit of the slide, and if there are other questions, we can try to answer them in the Q&A, maybe. There are also some new consumer rights, like the right to correct, and there is further limiting of the processing of what we call sensitive personal information, which is things like ethnicity, religion, health, and sexual orientation, but financial information is also deemed to be sensitive. And there are new notice requirements: there used to be only the "Do not sell my info" button, but now there is also "Limit the use of my sensitive personal information." So you've got to have at least two methods by which consumers can put in these requests to limit the use of their sensitive information. Just know that there's a higher standard if you're collecting sensitive information.
Even though the CCPA is based on opt-out, there is, under the Virginia law, more of an opt-in feature when collecting sensitive information. We'll get to that momentarily. Next slide, please.
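The applicability thresholds Stuart walks through – $25 million in annual revenue, 100,000 California consumers or households, or 50% of annual revenue from selling or sharing personal information – can be sketched as a simple check. This is only an illustration of the thresholds discussed above; the function name and structure are invented, and real applicability analysis requires counsel.

```python
# Rough sketch of the CPRA "does this apply to my business?" thresholds
# discussed above. Illustrative only -- not legal advice.

def cpra_may_apply(annual_revenue_usd: float,
                   ca_consumers_or_households: int,
                   pct_revenue_from_selling_or_sharing: float) -> bool:
    """Return True if any one of the three CPRA applicability prongs is met."""
    return (
        annual_revenue_usd >= 25_000_000
        or ca_consumers_or_households >= 100_000
        or pct_revenue_from_selling_or_sharing >= 50.0
    )

# A small research firm well under the revenue and record-count prongs can
# still be covered via the "selling or sharing" prong:
print(cpra_may_apply(2_000_000, 8_000, 60.0))  # True
```

Note that the prongs are disjunctive: meeting any one of them is enough, which is why the broadened "or sharing" language matters so much for research companies that pass respondent-level data to clients.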
Melanie Courtright: Great. Thank you, appreciate it. I'm going to leave the slide up for a minute just in case people want to gather your contact details. We've answered quite a few questions already in the Q&A, and some of them live. There's another theme around how long you should keep data, looking for a bit more clarity. There was one question specifically saying, "We have data from around 2011; could we keep it if we de-identify it or pseudo-anonymize it?" Howard gave a response, too, but thoughts from either of you on how long people should keep data? And one person specifically said, "Please don't just say, contact your DPO." Is there anything more than that that we can give?
Howard Fienberg: We've been mulling internally, within the association, the idea of a standard for the industry and still debating exactly what that would be. Because certainly anything longer than five years is kind of fishy. And as I mentioned in my answer to that question, the difficulty is in how you define de-identification, because how you think of it may not be how the law considers it. Stuart, you have your own opinion on this?
Stuart L. Pardau, Esq.: Yeah. There are, as I said, some specific regulatory requirements. If you're dealing with, say, the IRS, I think it's going to be seven years, is what I've consistently heard. But when you get away from that, as far as what we do as an industry, you've got two competing things – well, three, actually. One is, what are your clients demanding? Are your clients specifically requiring you to keep this for certain periods of time? I've seen that as a major feature of people's decision making. So one question I would pose to everybody is: do your specific client needs demand or require you to keep it for x period of time? That will be a major factor in how I would advise somebody, because if you've got a contractual obligation and there's a legitimate need, you could maybe keep it for three years or five years, or you may need to keep it for only three months or three weeks. It'll really vary pretty significantly. The second thing is, don't forget this idea of data minimization. It's very real. It's not just some theoretical, academic sort of theory. I think it kind of started off that way, but it's become a standard now in these laws. And no one's defined what exactly it means. It's kind of like the idea of reasonableness in the law: we've seen that word millions of times, and no one can necessarily, in every case, define exactly, precisely what it means. It'll depend on the circumstances. Same sort of thing for data minimization, but you have to understand that it's out there, and the regulators are going to scrutinize how long you're going to keep [INAUDIBLE] for. Which, to me, says err on the side of deleting faster than not; be inclined to purge your systems faster. And then thirdly, what really makes sense for your own business? You may have some very legitimate needs for that. I think it's important to point out the following – there was a lot to cover and I obviously can't cover all of it.
But under the CCPA or CPRA, as a service provider, you're actually allowed to keep certain pieces of information for longer periods of time, if it's designed for what we'll call product improvement, enhancement of consumer experience, benchmarking, things like that. So, it'll really depend on what it's being used for, as well. I know I'm not giving you a yes or no answer, but hopefully it's a little bit better answer than go talk to the DPO.
Melanie Courtright: Well, and speaking as a researcher for a moment, we have a habit of just wanting to keep data forever. So we're going to have to start thinking more about the shelf life of this data: how long is the data actually useful? What's its utility lifetime? You could keep data for five years, but the way the world is changing, data that's five years old is often not that useful. So we can also think about utility.
Stuart L. Pardau, Esq.: And to add to that calculation, Melanie, I would say the longer you keep something, the chances of something bad happening are naturally going to increase. I'm not saying it's going to be a material risk, necessarily, but you will not have a data breach of certain data if it doesn't exist, right? But if it's there on your network somewhere, it could happen. So, you don't want to deal with that. It's just a pain in every single possible imaginable way. So, I would urge people to think about that.
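The retention factors discussed above – contractual and regulatory obligations setting a floor, data minimization arguing for deleting as soon as possible beyond that floor, and genuine business need – can be combined into a rough decision sketch. All names and numbers here are hypothetical illustrations, not a compliance standard.

```python
# Hypothetical sketch of the retention decision factors discussed above.
# Field names and the combining rule are invented for illustration only.

from dataclasses import dataclass

@dataclass
class RetentionInputs:
    contract_months: int       # longest client/contractual retention requirement
    regulatory_months: int     # e.g., tax records reportedly ~84 months (7 years)
    business_need_months: int  # how long the data is genuinely useful

def retention_months(r: RetentionInputs) -> int:
    # Obligations set the floor; data minimization says keep nothing past
    # the longest genuine requirement or need, so take the max and stop there.
    floor = max(r.contract_months, r.regulatory_months)
    return max(floor, r.business_need_months)

# Client contract requires 36 months; no regulatory hold; data useful ~12:
print(retention_months(RetentionInputs(36, 0, 12)))  # 36
```

The point of the max-then-stop shape is Stuart's "err on the side of deletion": once no contract, regulation, or real business use justifies the data, keeping it only adds breach exposure.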
Melanie Courtright: Well, we are out of time. There was one more question that I don't know if Howard wants to– Just to make sure everybody sees it, about if ADPPA passes, will it override the states or will there end up being conflicts? And that's a good question, so why don't you take a second to answer that, and then we need to wrap up.
Howard Fienberg: Yeah, there will always be some conflicts, especially early out of the gate. One of the big-ticket items in fighting over that bill is going to be the extent to which it preempts state laws, and our goal is to preempt as many of them, to as great an extent, as we can. It has a fair number of loopholes for certain state laws that make the trial bar very happy, like the biometric privacy law in Illinois and the data security private-lawsuit provisions in California. So, a work in progress.
Melanie Courtright: Great, thank you. Thank you, Stuart. Thank you, Howard. As always, thank you for everything that you do for the profession and for our industry; we really appreciate it. A couple of quick notes for you guys: our next town hall is in July, and it's on turbocharging your career, with some really great speakers and participants. We have our IDEA Forum coming up August 9th and 10th; it's virtual, and it's all about inclusion, diversity, equity, and access. We also have a new report coming out next week – a really important report on the census work that we did on our profession and on the culture of inclusion. And then finally, really important: CRC22 in Manhattan, October 26th to 28th, is going to be a very large hallmark event with some really cool experiences. So make sure to put that on your calendar; you can register now for any of these at insightsassociation.org. We will, again, wrap this up, put a package together, and be in your inboxes, and if you need anything, feel free to reach out to me, email@example.com, or Howard, or Stuart. Thank you all very much for being here. Have a great weekend.