Presenters: Howard Fienberg, SVP, Advocacy, Insights Association; Stuart Pardau, General Counsel, Insights Association; moderated by: Melanie Courtright, CEO, Insights Association

Transcript Courtesy of Focus Forward & FF Transcription

Melanie Courtright: Hello, everyone who's joined us so far. Welcome. Very, very glad that you are here. We'll get started in just about two minutes. All right, so today's town hall is on state privacy law, compliance and developments. We're really excited to be with you today. Just a couple of quick announcements. Today's session is being recorded and will be made available to you, along with a transcript and the slides. It'll come to you as part of a wrap-up of the town hall via email, but you can also find everything on our website in our town hall library at insightsassociation.org. The transcriptions are provided thanks to our wonderful partnership with Focus Forward. I would also like to remind you that the information we're going to share today is not meant to substitute for any legal or financial advice that might need to be provided by your own attorney, accountant or financial advisor specific to your business. It cannot substitute for formal legal advice. If you need formal legal advice and you don't have a partnership with a lawyer, accountant or financial advisor, then you can reach out to us and we'll do our best to connect you with someone who may be able to help you. And the last housekeeping item is that we will leave plenty of time for questions and answers at the end. If you have a question, bring up your Q&A pod and put it in the Q&A window. But we also would love for you to bring up your chat and be very active in it: offer opinions, share resources with each other. We save the chat, and we can make it available if there are resources in there that are important to people. With that, we are blessed and pleased to be joined today by Howard Fienberg. As you all know, Howard is our senior vice president of advocacy at the Insights Association and our lobbyist, focusing primarily on consumer privacy and security. He's joined by Stuart Pardau, general counsel of the Insights Association and founder and principal of Stuart Pardau and Associates, based in Los Angeles. Both of these gentlemen are your very best friends when it comes to resources and happenings in this particular area of expertise. With that, I'm going to hand it off to Howard, and Howard will kick us off.

Howard Fienberg: All right. I only really have a handful of things to cover with everybody today, but I appreciate you all coming on board. The legislative side is perhaps slightly less interesting for most of you than what you actually have to comply with. Thankfully, not all of these states actually passed laws this year, but all of them certainly gave consideration to it. We were particularly instrumental in helping to defeat privacy law attempts in Florida, Indiana, Iowa, Maryland, Washington and Wisconsin. We were trying at the same time to improve them as much as we could, but frankly, killing bills is much easier than improving them in a lot of cases, and bills died all over the country. Some of these were better than others: some of them attempting to replicate CCPA, others copying some of the other state bills, and some just kind of freelancing it. Those bills are still kicking around in a few other states, Massachusetts, Michigan and New Jersey in particular; those three go all year long, as do Ohio, Pennsylvania and DC. North Carolina is going to close up its session in a few days, so hopefully that will be off the table. And New York, although they technically go out of session, anything can still be done in New York if they so choose. Next slide please. Sadly, we didn't have a perfect record here, so we have a bunch of new laws coming into effect in 2023. Stuart's going to go into specifics on all of them, but the ones you probably know of best are the California Privacy Rights Act, which replaces CCPA. That's coming into effect. The Virginia privacy law, the CDPA, and the Colorado Privacy Act were both passed into law last year. We were part of the back and forth on both of them, to varying degrees of success. This year, we were involved in Utah. We basically killed that bill a year ago, but once they decided to get rid of the private right of action, it was rammed through extremely quickly this year. And Connecticut similarly moved through relatively quickly once all the parties decided that it didn't need a private right of action for private lawsuits; a deal was worked out relatively swiftly in Connecticut. Again, those are all coming into effect in 2023. I've been much more focused over the last few weeks at the federal level, because that's where our general goal is. We've been working with a coalition we helped found called Privacy for America, trying to pass a federal privacy law with the overarching goal of replacing this cacophony of conflicting state privacy laws with a federal standard that is the most conducive for insights that we can get, certainly without private lawsuits; that's one of the goals. But preempting that mess of state laws is also very high on the agenda, and at the moment, three of the four leaders on privacy in the House and Senate are pushing together on a big compromise privacy bill, and it's starting to clean up a bit. A bunch of amendments have been made to it since they first introduced a draft a few weeks ago. So the version that they passed out of the House subcommittee just yesterday morning contains a bunch of improvements that we specifically requested, including carving a few things out of their definition of sensitive covered data, which requires an opt-in to share; the online activities piece of that would have pretty much destroyed the audience measurement industry.
The fact that they took our lead on that and got rid of that inclusion in the definition is a big win for us, along with a clarification on audience measurement in the carve-out from the definition of targeted advertising. They did in fact adopt our definition of market research, which we worked on with Privacy for America over a long time, and specifically, they've addressed market research as it relates to the use of incentives for participation. That is not a high-level issue for anybody else involved in the bill, but it's a really big deal for us to make sure that we could still offer those incentives, because they're really taking a hack at loyalty programs in a variety of ways. As all of that goes forward, the bill is most likely going to be marked up in the full committee sometime after the July Fourth recess. So some time before Congress leaves town in August, they will most likely have passed a bill at least out of committee in the House. The lead Democrat in the Senate, Cantwell, has turned her nose up at this bill, but I think that there's still room to negotiate, and there's still room to negotiate on a lot of the important points in the bill even as it moves through the House, because it does have a limited right to private lawsuits and its preemption of state laws is kind of mediocre. There's a lot more work to be done. It is entirely possible that we could have a federal privacy law compromise passed in November or December, after the elections. So we've got a lot of work to do until then. At the same time, the Federal Trade Commission has a full complement of commissioners, and the chair, Lina Khan, is very eager to dig in on a variety of different privacy issues. She's very aggressive, not a friend to any kind of private business, so the FTC will most likely be engaged at the same time. And with that, I want to turn it over to Stuart, and I'll be piping in as needed during his presentation.

Stuart L. Pardau, Esq.: Well, thank you very much, Howard. Great to be here, and I hope everybody's doing well. We've got roughly 40 minutes or so to present, and then we'll keep the balance of the time for questions. There are quite a number of slides here and we will go through them, but some of them we'll go through extremely quickly. The key focus of this discussion is on the state privacy laws, and Howard mentioned some of them. The ones highlighted in red here are the most relevant: California, Utah, Colorado, Virginia and Connecticut, the ones that have come most recently. Next slide please. What are we going to cover? Give you a little bit of background, do some refreshing on some key high-level points, and try to adopt as best we can a big-picture approach to identify some common steps, because this is a very fractured, balkanized regulatory environment, and it has been in the United States for a very long time. That is why what Howard has been working on at the federal level is so important, because if you had some sort of overarching federal legislation, much in the way you have in the EU or Canada, it would I think help clarify things and make compliance with this disparate set of requirements a little bit easier. At any rate, the system we have now is what we've got, and these are the states, California, Virginia, Colorado, Connecticut, Utah, which we'll spend a bit of time on. We'll also talk about some new state and federal laws very quickly, but most importantly, spend our time on what your compliance approach should be, with some key takeaways. Next slide please. Background, right? The basic issue is the US is kind of different, in many respects. The US does not have an overarching federal privacy law. What we have instead is a fractured system at the federal level; that is to say, we've got laws that are based on the types of data that you're collecting. People deal with HIPAA, that's health information; financial information, that would be through Gramm-Leach-Bliley. So it tends to be content specific, what laws develop around that. And then also separately, what I refer to as modality specific, that is to say, if we're doing telephone research, people in the research industry have unfortunately been the victims of some of these lawsuits under the TCPA. And that has led to several challenges over the years, of people getting sued and actually facing quite existential threats for some of the smaller folks in the industry. That's telephone; email could be covered by CAN-SPAM. So it's a whole set of different types of laws that may be contingent on the method by which you're collecting the information. Then we've got the state laws, right? And the big one has been California. California is of course the largest state in the union and has been really at the forefront of privacy law developments. These other states we've mentioned have to a large degree tried to pattern their laws after the California law, and that was the CCPA. What has come after that is a proposition on the ballot before voters in the 2020 election called the CPRA, the California Privacy Rights Act, which is sort of a bolt-on to the CCPA, adding further elements to it. We're in the process of that being adopted; it'll come online in 2023. But nevertheless, we look at the big picture here, and the big picture tells us the big areas of privacy law regulation have been in the European Union and in California.
The focus of this of course is on US state laws, but contextually, it's important to bear that in mind. If you're dealing internationally, like in China for example, the new privacy law in China is very much patterned after the GDPR. So GDPR has become very much a global standard in that regard, and when California came online, there were certain commonalities with GDPR, but certain very important differences as well. Next slide please. As I said, effective January 2023, the CPRA comes online. It amends and substantially expands the CCPA, but there's other stuff happening. Virginia has passed the VCDPA, which also takes effect in January of next year. In July of next year come Colorado and Connecticut, which just passed a new law, and at the end of next year, the Utah one comes online. Next slide please. Given this patchwork, right? At the federal level, if you're dealing with healthcare, you've got HIPAA; if you're dealing with financial services, Gramm-Leach-Bliley; credit, you're dealing with the FCRA. You've got all these different laws on that basis. If you're doing telephone work, if you're doing online work, you've got different laws which will cover you. So how do you really know? How can you comply? You layer on top of that the various state laws. So here are ten items that I would like to spend a little bit of time on, disproportionate I would say to the time for the rest of the slides, but it's I think time well spent, because it doesn't matter which side of the fence you're on, whether you're on the client's side or the research provider's side, or you're a vendor supplier to a research company. All of these will touch and concern you in some way. I would also like to point out that as much as we would like to be 100% compliant with all these laws, as a practical matter, it's really impossible for any company, no matter how much you may have in terms of size and resources. A very important assessment to make at the front end is: what can we realistically expect to do, given our situation and given the resources that we have? And that's an assessment every company makes regardless of size, and frankly, not just in the privacy and data security area, but in many other areas of law and compliance as well. With that, one key feature, of course, is focus on the big ones. Let's set aside GDPR for a moment because the focus here is on US domestic law. CCPA. California is a 3-trillion-plus-dollar economy. If it were its own economy, it would be, I believe, the fifth largest economy in the world, about 15 or so percent of the US economy. Kind of hard to ignore it. And where does it come in? I've had folks come to me over the years and say, "Well, gee, I'm in Indiana or I'm in Alabama, what does California have to do with me?" It has to do with you in this way: if you're collecting personal information from individuals who reside in California, you fall within the ambit of its jurisdictional reach. That's why California becomes important. And I've met some companies that have just made a decision: we don't want to deal with that, so we'll exit California. And that's a decision that people need to make. But typically, that's not what companies are going to do. If you follow that lead of going for the big fish, California had the first major privacy law. It's the most comprehensive and the most far-reaching.
I think it's fair to say that if you can develop a good compliance regime around California, you've gone a long way. I'm not saying 100%. I mean, there are differences to be sure, as we'll see in a moment, particularly when you get to issues around collection of sensitive information. That is more of an opt-in in Virginia, versus California, where it really isn't; it would be if it's for children under 16, and in certain other narrow situations, but generally speaking, California is more of an opt-out environment. You have these differences on the margins, but 85, 90% of the time you'll find some commonality. If you start with the big one in California and you make a reasonable effort to comply, that will go a long way, in my opinion. It's only one of the 10 bullet points, but it's a key one. Next one. The other is you'll want to look at your privacy policies and your other notices. This is, unfortunately, something that is a sunk cost. That is to say, you can't really avoid it. And, unfortunately, because of the changes in the laws and the regulations, it does require updates, but it's not going to be something that's overly extensive in most cases. The point there, though, is the privacy policy is the promise that you make to the world, basically, in terms of how you handle your data collection, your storage, your sharing, your handling, and so forth. It's a promise to the world. And some companies, sometimes smaller companies on the supplier side, will say, "Well gosh, we'll just pick a privacy policy from one of our competitors or from company X that we think is a good company and we admire. We'll just leverage their template." The problem with that is that circumstances and usages are going to vary from case to case. And if you put things in the policy that are not accurate or are untrue, that opens the door, really, to a boatload of trouble. You want to avoid that. You want to customize it for your particular usages, which then requires, as a predicate, the ability to go through and identify your practices and how you actually do things, and have your policy properly reflect that. There are a lot of notifications that are required, particularly under the California law, but under these others as well: to notify individuals when you're collecting sensitive information, how you're collecting it, and so forth. Review of your consent processes. Again, there is some variation between some of the states as the laws come online, at least as it relates to sensitive data. But you'll want to look at how you gather your consents and what they actually say. The good news is, it's a fairly straightforward exercise. It's just a question of getting the right language and positioning it in the right way so it's clear and visible to consumers, or to the respondents, I should say. Then there are these documents called DPAs, or data processing agreements. The DPAs are a relatively new feature in the arsenal of agreements that we see for companies, and they are actually designed expressly and exclusively for these descriptions of data processing. The idea being that just having a provision or two in the MSA would not be adequate to cover how data is processed. And you're going to have this with anyone that you're sharing the information with. If you're a client and you're sharing a respondent list with the research company, you're sharing personal information that your company holds.
You're going to want to entrust your research provider with that information, but subject to a tight agreement, and that would typically be covered by a DPA. It will have what we call the representations and warranties, which are the promises, and the indemnification, which gives you the backing, if you will, when things go sideways, so that the other party will step up and defend you in the case of a claim or some such situation. You also want to review your consumer request response processes, because this is a huge area. People share their information and then they want to be taken off your database. They want to basically have their information erased. You have to handle those erasure requests. You have to handle opt-out requests. You have to have a mechanism by which you do this, and you have to have decent record keeping. That is an important feature, and most of these laws require it. Next slide, please. Next slide, please. Those are five of the areas. Here are five more big-picture, common-sense steps that I would recommend taking. Again, if you do all these things, it's not going to be 100%. Let's be very clear about that. But it will go a considerable distance in mitigating your risk and putting you in a zone of compliance. Because again, no company, no matter how many resources they may throw at this, is going to be 100% compliant with any of this stuff. Sixth one: review of your security practices. Sometimes we conflate privacy with security; the terms are colloquially used and conflated in our conversations, even among attorneys. But they're distinct areas. Privacy is what folks like me do. We try to advise on what should be in your privacy policy, how you should write it, what the consent should say, how your indemnification or reps and warranties should be crafted. How do you navigate this compliance maze? But when it comes to security, that is to say: how good are your networks really? How good is the security behind your network? Are you doing the penetration testing? Are you doing encryption? Do you have multifactor authentication for logins and for other people when they have access? These are the sorts of features that are essential, but they are a distinct function performed by those qualified in that field: IT professionals, security professionals, and so forth. And in an ideal world, you'll have your security teams working very closely with your privacy teams, because the two are really inextricably linked. The review of these practices is an essential feature of a good compliance program, and indeed is a requirement of some of these laws, like the CCPA, which we'll get to. Now, as a practical assessment: if you're on the client side, and you're a Fortune 50 company or a Fortune 500 company, you have a large, well-resourced legal department. You have a very strong IT department. You've got a security group. You've got all these resources, which you should be liaising with, drawing upon, and making sure that you're properly protected. If you're a research company, again, you may also be large. Maybe not quite as large as Fortune 50 or 500, but still large, with all those sorts of groups behind you. But it may also be the case that you're a relatively small company. You don't have a legal department. You don't have an IT department.
You're going to have to make some hard choices in terms of how you're going to allocate those resources to do these functions, and to assess: where am I going to get the most for what I've got? And again, my view on that is, whatever you do is likely to be incrementally helpful. Every little bit will help. And this is not just me speaking; this is what some of the regulators in California and other places have been saying. And having been on the other side of some of these investigations at the state level and at the federal level, and not just in the privacy area, regulators will look more favorably on an institution that's under investigation if it has at least taken some reasonable steps prior to the incident, as opposed to ignoring it, and maybe willfully and consciously ignoring it. From that perspective, I think you can really benefit, even if you don't do most of what you hope or believe needs to be done. That incremental concept I think is an important one. Data protection assessments, very closely related to the prior bullet point. There are wonderful resources we have through ISO, and Insights can definitely help you there with the ISO certification. That is an important thing and I think something that only helps you. But I would also add, to be clear, it's not necessarily a full substitute for doing the security work. And, in fact, sometimes the security is a necessary precondition to getting the ISO certification to begin with. You'll want to delve further into that, but there's no question that these certifications are beneficial. And by the way, not just from a compliance perspective: sometimes contractually, your client, or the research company if you're a service provider to one, is going to mandate or require that you be ISO certified, or at least meet those standards. For those reasons as well, it's very useful. Employee training, that kind of speaks for itself. You can't hold people accountable to standards that they're unaware of. I think it is important to clearly state what it is you are trying to achieve, develop your program in that way, and have some basic training; that can be useful. Privacy by Design and FIPPs. FIPPs is the fair information practice principles. There are a lot of acronyms, no shortage of acronyms, and FIPPs is one of them. Privacy by Design is another. The idea there basically is you want to have an organization that designs its processes, its procedures, and its data collection approaches around a more privacy-centric view. And I think it is important to point out, as these sub-bullets do, this idea of data minimization. This is not, again, my term. This is a general term that's being more widely adopted by regulators. And the idea there is: don't collect more personal information than you need, and don't keep it for any longer than you need to. If you keep it around for long periods of time, you're more likely to be attacked or subject to data breaches. And in terms of what you collect on the front end and how you use it, you really only collect what you feel you need for that particular project. That notion is gaining greater and greater traction. It used to be, and still I think largely is, a world of notice and consent. The idea is, "Well, you give somebody proper notice, you get the consents, you should be good to go." That still is a very useful principle.
But as Howard can tell you, the FTC has been looking at that very issue. And I think they're taking a broader view. That is to say, it's more than just notice and consent. It's about don't collect more than you need. And there's this idea of people's personal information almost being kind of a property right of those people. All these developments, all this subtext, you just need to understand in terms of how you navigate. And last but not least, insurance. I'm a big proponent of insurance, because again, it mitigates your financial risk. And there is plenty of insurance out there that may be relevant, in terms of cyber, in terms of errors and omissions, and maybe even others. Insurance is part of that program. Those 10 items, not 100%, but I think if you can keep your eye on those 10 points, you'll go a long way toward reducing your risk and being more in a zone of compliance. Next slide, please.

Melanie Courtright: Stuart, before you move on, one quick question from the group.

Stuart L. Pardau, Esq.: Please.

Melanie Courtright: And because it's sort of foundational: in B2B research, and Howard fielded a question about this too, are some of the B2B datapoints considered sensitive or PII, like respondent title, age, years in the field, name of employer? Are those considered to be sensitive?

Stuart L. Pardau, Esq.: Yeah. Most of these laws are designed to protect individuals and consumers, not businesses. I'd say you're generally free in those areas, but that doesn't remove your liability risk if there is a breach, because people can still bring claims. And I think that when we focus, and I put myself very much in this category too, on the personal information side of it, we sometimes forget that you can still have a breach that doesn't involve personal information. It could still be very damaging if you're harming a relationship with a client or you're inadvertently leaking confidential or other business proprietary information. The short answer is a lot of these laws don't cover B2B, but it's still something to stay focused on, and it still carries liability risk.

Melanie Courtright: The Insights Association and part of our codes and standards does consider titles and company names to be PII. And we consider that you have an obligation to protect identity, let research remain anonymous. While it may not apply to some of the actual law around privacy, it still pertains to how we execute research, PII and no direct action. 

Stuart L. Pardau, Esq.: Agreed. Agreed. This is the map we looked at before, the red states being the more current, and the shaded ones being those that have things in motion or under consideration. Next one, please. CPRA. So again, looking at the big fish, California still remains the big fish for all the reasons we discussed. The CCPA took effect in January of 2020. California voters, in November of that same year, passed the CPRA; in California, you can pass laws directly at the ballot as well, and that's what was done with the CPRA. It takes effect January 2023, and it is, I would say, in large measure adding further restrictions on business, making the law even stricter. Now, it is important to pause and note that once a law is passed, as the CPRA has been, there's usually a process by which regulations follow. The regulations are part of the rulemaking exercise. It's a legal process, but it's not done by the legislature. It's not done by the voters. It's done by the regulators. It happens at the federal level, and the same thing is occurring now at the California state level. The new agency, which was created by the CPRA, is promulgating these regulations, putting them out, and there's a comment period, an opportunity for individuals to comment. Insights over the years has commented on these regulations, and we'll certainly be commenting on these. The agency has put them out in draft form, so they're currently under consideration. Obviously, there's some fluidity to this. But there are still some things that have been set in stone in the actual CPRA law that the voters passed. And one of the key ones, which we'll really focus on, is that second item in red, "or sharing." There was always a question, and some of you who know me have talked to me about this when I presented at other Insights presentations: "Well, gee, we don't collect 50,000." Now it's 100,000, but in the day, it was 50. "We don't collect 50,000 records of California residents." Or there's another requirement here that's not on the list: "We don't have revenue of 25 million or above. And we don't sell any personal information." And my answer always was, "Yeah, you may not have 25 million in revenue. And yeah, you may not have 50,000 records of consumers or households. But you're deriving 50% or more of your annual revenue from selling." And people would say, "No, we're not. We're not selling." And I would say, "Look, the way selling is defined under the law, under the CCPA, it is clearly more than just the colloquial sense of selling. It really means sharing." And that was the advice we gave. And it turns out that the CPRA clarified that point: if it wasn't 85% clear before, it's 100% clear now. It now expressly includes sharing, and sharing of personal information is pretty much what a lot of research companies are going to be doing. There have been a couple of instances these past few years where I've been persuaded in conversations with folks, or just educated is probably a better way to say it, that if you're truly handling aggregated data, as some of you are, then you're not covered by this. This only covers personally identifiable information. If you're doing qualitative research where you're only looking at aggregate information, then there's a very good chance you're not covered by any of this stuff. Then it comes down to: am I handling maybe anonymized information?
Is it at a respondent level, but it's somehow been de-identified so that it is truly anonymized, or what's sometimes called pseudonymized? Possibly. We'll have to examine: what does personal information mean? Personal information has a very broad meaning under California law, and under most of these other states' laws as well. California expressly includes examples such as not just name, address, email, Social Security number, passport, or driver's license, but stuff like geolocation and IP addresses. But the basic principle which undergirds all of it is, as this last bullet point says: if you can reasonably link or reasonably associate that information with a consumer or a household, bingo, you're in personal information land and therefore covered by this law, and by most all of these laws. Again, it's going to be a very case-by-case assessment, but we've got to keep our eyes on the ball. If you're only doing aggregate, and/or it's truly anonymized, meaning you cannot reverse engineer identities under any plausible scenario, then yes, you are outside these laws. But in any other scenario, there's a very, very good chance that you are within them. Next slide, please. Some key changes to the CCPA: they've expanded the definitions of service providers, contractors, and third parties. Again, we could spend a lot of time on this, but you'll have the benefit of the slide, and if there are other questions, we can try to answer them in the Q&A, maybe. There are some new consumer rights, like the right to correct, and there is further limiting of the processing of what we call sensitive personal information, which is stuff like ethnicity, religion, health, and sexual orientation, but financial information is also deemed to be sensitive. There are new requirements you've got to put in place: there used to be the "Do not sell my info" button, but now there is also "Limit the use of my sensitive personal information." So you've got to have at least two methods by which people can put in these requests to limit the use of their sensitive information. Just know that there's a higher standard if you're collecting sensitive information. And even though the CCPA is based on opt-out, there is, under the Virginia law, more of an opt-in feature when collecting sensitive information. We'll get to that momentarily. Next slide, please. Next slide.
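To make the applicability thresholds Stuart walked through a moment ago concrete, here is a minimal sketch in Python of the three-part test as described in the talk: 25 million dollars or more in annual gross revenue, 100,000 or more consumers or households (50,000 under the original CCPA), or 50% or more of annual revenue from selling, now selling or sharing, personal information. The function and parameter names are hypothetical, and the statutory definitions behind each test are something your counsel should confirm.

```python
# A minimal sketch, not legal advice: the three CPRA applicability tests
# as described in the talk. Function and parameter names are hypothetical.

def cpra_likely_applies(annual_gross_revenue_usd: float,
                        ca_consumers_or_households: int,
                        pct_revenue_from_selling_or_sharing: float) -> bool:
    """True if any one of the three business thresholds is met."""
    return (
        annual_gross_revenue_usd > 25_000_000             # revenue test
        or ca_consumers_or_households >= 100_000          # volume test (was 50,000 under CCPA)
        or pct_revenue_from_selling_or_sharing >= 50.0    # "selling or sharing" revenue test
    )

# A small firm under $25M that shares 120,000 California respondents'
# records with clients is still captured by the volume test:
print(cpra_likely_applies(8_000_000, 120_000, 10.0))  # True
```

The point of the "or" logic is the one Stuart made: failing the revenue test alone does not take you out of scope.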

Yes, we talked about the right to opt out; it now more expressly includes this idea of sharing. You should know that there is also a right to know, meaning: what have you been doing with my information, and who have you been sharing it with? And it's no longer limited to a 12-month look-back, meaning, in theory, you've got to provide it going back indefinitely; people want to understand where their information has gone and who it has been shared with. That's the right to know feature. This ties into a question that came up yesterday in advance of the presentation, which I saw, an interesting one around erasure and deletion and document retention policies. And the short answer is that for document retention, there's really no clear direction, under the CCPA at least. There are other laws that may be implicated: if you're talking about financial records and the IRS, I think there will be a seven-year period. But it creates the necessity to really determine: what is your practice? What does your business really require? What are your clients requiring? Sometimes clients will require you to keep this stuff for periods of time, and there may be a very good reason you want to keep stuff for a period. You want to think in terms of, "Should we even have a document retention policy?", because if you have one, you better be darn prepared to adhere to it. I would say it's something to think about very carefully. But let's just say that if you don't have one, or you have one and it's to regularly delete information, you can only go back as far as you have that information. So it has some, let's say, incentives to maybe have a more aggressive deletion policy, at least as it relates to this right to know. Next one, please. We talked, again, about data minimization, talked about storage limitation, we can keep going. All right, key changes to the existing regs. These regulations, as I said, are still in flux. They just submitted them recently, they being the State of California, and the public is getting an opportunity to comment on them. There are roughly eight or nine areas; I'll quickly go through some of them. The first one is dark patterns. This is not a phrase unique to the California regulators, but dark patterns is really about being transparent in your communications. If you have an "I accept" or "I reject" button, you can't characterize it so as to push people into the "I accept," right? I don't think we see much of this in research, but there may be some of it, of trying to maybe shame the person for rejecting by saying, "Oh, you really don't want to save? If you don't want to save money, press no," type thing. That's what they're focused on; it's really about symmetry, it's about having transparency. There are these enhanced rules for service providers and contractors, which I talked about, and for the sensitive information, which I also talked about. Just real quick on the opt-out preference signals: this is one that I've only recently focused on, but it is potentially quite significant. Some of you may be aware that under California law, there's something called the Do Not Track signal, or DNT. Under California law, there's no specific requirement to adhere to or honor those requests, meaning if you have a browser setting that says do not track me from site to site, a site is not even obligated under California law to adhere to it.
What you are obligated to do is to tell people in your privacy policy whether you adhere to it or not; that's the requirement. What we're dealing with here is opt-out, which is not the same thing as Do Not Track, but it's the same type of idea. These requests, per these regulations, have to be adhered to, meaning if you get some sort of opt-out request through the browser, these preference signals will have to be monitored, recorded and adhered to. So this is going to be something that may be quite relevant depending on your data collection methods, and I think it's one Howard and I will definitely have to delve into as we think about what sort of comments we'll be making on behalf of the industry; and of course, we welcome your feedback on it as well, in terms of how it may affect you. Next slide, please. Next slide, please. This is Virginia. As I said, it takes effect in January 2023. I'd say the big difference, well, it's not a big difference, it's a difference: the terminology differs, right? They use more of the GDPR language, about controller and processor. They've got slightly different thresholds for these requirements. But it's still very similar, as you can see, with this 50% reference to gross revenue from the sale, and I think their definition of sale is arguably pretty expansive too. Next one, please. So yeah, I talked about sensitive data. Here, it is more of an opt-in: before you process that sensitive information, you've got to get the consent. And again, in traditional survey research, you'd argue that's not necessarily a huge deal, to embed a notification or a consent like that at the front end; it gets a little more challenging when we're talking about more passive data collection methods. Next slide, please. Unlike under the CCPA and the CPRA, consumers have a right to appeal any denial of rights. This is interesting, and some of the other states, I think Connecticut and a few others, have something similar. It basically means that if you don't like the answer you're getting initially, you have the right to appeal it within the company. And the company then usually has an obligation to say something like, "If you don't like your answer from us on this appeal, here's the way you can contact the Virginia Attorney General, so they can begin an investigation." Next slide, please. So, again, a broad definition of sell, though somewhat narrower than California's, but the same idea: you're likely to be captured. Now I will say this: if you're not collecting any information in Virginia, then obviously you're not covered by this law, and the same would be said for any of the other states. Next slide, please. All right, we can go through the next one too. Yeah, this is an important point, and it's also a feature of California. There is this idea of assessments. Under the CCPA and CPRA, there's now going to be this requirement to do an assessment of your practices. Remember those ten big-picture items I had at the very beginning of the presentation, where I identified security reviews and doing these assessments? This is going to be a requirement for most of you, to show that you've done it. Here, it's qualified by processing that presents a significant risk. I think it's very possible you could present your use case as not presenting a significant risk. But if you're processing sensitive data, or you've got other sorts of what we'll call higher-risk activities, you're likely to be put in this bucket.
I think the conservative approach is going to be to do some form of assessment, because these are now increasingly being required in the California context, and you'll see it here in Virginia. Next one, please. And we can do the next one. There's no private right of action; Howard touched upon that in his comments and presentation. This is an important element, and what it means in plain English is you're not going to get sued by another private party, but you will potentially get sued by the Attorney General. It's the same thing in California, with the one exception that there is a private right of action in California when there's a data breach. So you can say it takes the litigation risk out in terms of private lawsuits, but government actions are still very much in play. Next slide, please. Next slide, please. The Colorado Privacy Act. This is the one that's going to come into effect in July of 2023. Next slide, please. And you can see very similar definitions of personal data; they are mimicking the California concept, any information that's linked or reasonably linkable. Important to note: I already talked about de-identified data, meaning anonymized data, purely anonymized or aggregate data. But the other one is actually an important exception that's pretty common to these various laws: publicly available data. And by that we mean stuff that may come from a public data source, like a registry for real property, usually with the county in the state that you're in. That would be publicly available data, not covered by this. Next one, please. Here, it is narrower than the CCPA and CPRA, but like the Virginia law, it does have a sufficiently broad definition that I think is likely to also capture the activities that we're talking about in the context of research, using the terms controller and processor. Next slide, please. And the next one as well. Yeah, we can keep going. You can see a lot of the similar sorts of requirements, these data protection assessments. Interestingly, nonprofits are not exempt in Colorado. So that's not great news for the Insights Association. Sorry, Melanie. Next one, please. Connecticut is one of the more recent ones to come down. This is also going to take effect in July. Similar sorts of definitions, predicated on number of consumers, datasets, how much you're processing, your revenue, etc. Next one, please. And we can just fly through this, because we want to get to those slides at the end; again, you'll have access to this, so you can drill down with a little bit more specificity. I've got some loose ends to tie up. Next one, please. Utah is another one; it's going to take effect at the end of December, and you can see a lot of the same similarities. It has an opt-out for sensitive data, unlike Colorado and Connecticut. Next one, please. There is no right to appeal. Now, here's one important difference: under the Utah law, the sale definition does contain an exemption that allows the controller, meaning the party that has the data, to disclose it to a third party if the purpose is consistent with the consumer's reasonable expectations. Well, you'd argue that certainly in traditional survey research, the clear expectation is that that information is going to be shared in that way and would be consistent with their expectations. So you could argue that Utah may not even apply to traditional survey research. Next one, please. And the next one after that.
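On the opt-out preference signals Stuart flagged a couple of slides back: in practice this usually means something like the Global Privacy Control proposal, which participating browsers send as a "Sec-GPC: 1" request header, while the older Do Not Track setting sends "DNT: 1". Here is a minimal sketch, assuming a Flask-style web app; the endpoint and helper names are hypothetical and the handling shown is illustrative, not a statement of what the final regulations require.

```python
# A minimal sketch, assuming a Flask app; endpoint and helper names are
# hypothetical. It honors the browser opt-out preference signal (Sec-GPC)
# and merely notes the older DNT header, which under California law only
# has to be disclosed in the privacy policy, not honored.
from flask import Flask, request

app = Flask(__name__)

def browser_opts_out(req) -> bool:
    """True if the request carries an opt-out preference signal."""
    return req.headers.get("Sec-GPC") == "1"

@app.route("/survey")
def survey():
    if browser_opts_out(request):
        # Per the draft regulations: monitor, record, and adhere to the signal,
        # i.e., treat this respondent as opted out of any sale or sharing.
        return "Opt-out preference signal received; data will not be sold or shared."
    dnt = request.headers.get("DNT") == "1"  # disclose your handling; honoring is optional
    return f"Proceeding under normal notice and consent (DNT={dnt})."
```

The design point is that the signal arrives with every request, so respecting it has to be wired into the data collection path itself rather than handled as a one-off form submission.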
We looked at those other states colored, I think it was, in gray; these are other ones that are considering laws. Yeah, that's right. And Howard mentioned the bill, so we're talking about the federal law, and that really would solve a lot of this patchwork craziness. Next slide, please. Your compliance approach: as we said, CCPA and GDPR are a good starting point. Look at your privacy policies, review your consents, and, importantly, determine whether you are collecting sensitive data. There are specific requirements for children; California, as I said, has a requirement under the CCPA for under 16. Note the dark pattern issue that regulators are now focusing on, which in plain English just means: be clear, be transparent in your approach, don't be snarky, don't try to push people to certain types of outcomes or answers, which presumably none of you are doing anyway. But nevertheless, be aware that it's out there. Next one, please. Update your DPAs, have your vendor privacy controls in place, have security questionnaires (this is where you have security and privacy working closely together), review consumer requests, and do the training we talked about, which is a recap of those earlier slides. Next one, please. So we talked about this concept of Privacy by Design, and FIPPs, the fair information practice principles; this slide tells you what those principles actually stand for. And again, these are not my terms. These are terms that were developed, at least in the case of Privacy by Design, by privacy advocates and scholars and so forth, and they have now been kind of ingested and adopted by the regulators and by the laws. Privacy by Design has some great summary bullets here, but it's basically: integrate privacy into how you think of your products and your services, and take an integrative approach; and likewise with the fair information practice principles, which mimic those concepts. Next one, please. So, key takeaways: review and update your privacy policy; get those consents and notices in place; create and update your DPAs and other key agreement terms. Review your security practices and work with your IT security people. Again, security is one area, privacy is another; they're not the same thing, though they're clearly linked and highly related. Further risk mitigation measures should absolutely be considered. I'm a big proponent of insurance. It's good to maybe take a step back and look at your operations and your procedures and say, "How can we better integrate thoughtful privacy into our decision making?" And sometimes that's hard to do, because people are going to say, "Where's the ROI there?" Well, the ROI, I would argue, is mitigating your risk and not having to pay later, but also, your clients are going to be demanding it, and you're in an industry where privacy is valued, I would say, certainly more than most. So for those reasons, I think it's important. We talked about the concept of data minimization, and I'm a major proponent of training, because I don't think people can be held accountable to things that they're not aware of.

Melanie Courtright: Great. Thank you, appreciate it. I'm going to leave the slide up for a minute just in case people want to gather your contact details. We've answered quite a few questions already in the Q&A, and some of them live. There's another theme around how long you should keep data, looking for a bit more clarity. There was one question specifically saying, "We have data from around 2011; could we keep it if we de-identify it, pseudonymize it?" Howard gave a response too, but thoughts from either of you on how long people should keep data? And one person specifically said, "Please don't just say, contact your DPO." Is there anything more than that that we can give?

Howard Fienberg: We've been mulling internally, within the association, the idea of a standard for the industry and still debating exactly what that would be. Because certainly anything longer than five years is kind of fishy. And as I mentioned in my answer to that question, the difficulty is in how you define de-identification, because how you think of it may not be how the law considers it. Stuart, you have your own opinion on this?

Stuart L. Pardau, Esq.: Yeah. There are, as I said, some specific regulatory requirements. If you're dealing with, say, the IRS, I think it's going to be seven years, is what I've consistently heard. But when you get away from that, as far as what we do as an industry, you've got two competing things. Well, three, actually. One is: what are your clients demanding? Are your clients specifically requiring you to keep this for periods of time? I've seen that as a major feature of people's decision making. So one question I would pose to everybody is: do your specific client needs demand or require you to keep it for X period of time? That will be a major factor in how I would advise somebody, because I think that if you've got a contractual obligation and there's a legitimate need, you could maybe keep it for three years or five years, or you may need to keep it for three months or three weeks. It'll really vary pretty significantly. The second thing is, don't forget this idea of data minimization. It's very real. It's not just some theoretical, academic notion. I think it kind of started off that way, but it's become a standard now in these laws. And no one's defined exactly what it means. It's kind of like the idea of reasonableness in the law: we've seen that word millions of times, and no one can necessarily, in every case, define exactly, precisely what it means. It'll depend on the circumstances. It's the same sort of thing for data minimization, but you have to understand that it's out there, and the regulators are going to scrutinize how long you're going to keep [INAUDIBLE] for, which to me says: err on the side of deleting faster than not; be inclined to purge your systems faster. And then thirdly, what really makes sense for your own business. Again, you may have some very legitimate needs for that. And I think it's important to point out the following, because there was a lot to cover and I obviously can't cover all of it: under the CCPA and CPRA, as a service provider, you're actually allowed to keep certain pieces of information for longer periods of time if it's for what we'll call product improvement, enhancement of consumer experience, benchmarking, things like that. So it'll really depend on what it's being used for, as well. I know I'm not giving you a yes or no answer, but hopefully it's a little bit better than "go talk to the DPO."
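On the de-identification piece of that question, a short conceptual sketch may help; Howard's caution is precisely that the legal test is whether the data remains reasonably linkable to a person. The Python below is hypothetical (the key handling and field names are illustrative, not a compliance recipe): keyed hashing produces pseudonymized data, which these laws generally still treat as personal information, because the link back to an individual can be re-established.

```python
# A conceptual sketch only, not a legal de-identification standard. Keyed
# hashing (HMAC) replaces a direct identifier with a stable token: that is
# pseudonymization, and because the data stays "reasonably linkable" to a
# person while the key or the raw data exists, it generally stays in scope.
import hmac
import hashlib

SECRET_KEY = b"store-and-rotate-separately"  # hypothetical key management

def pseudonymize(identifier: str) -> str:
    """Replace an email, name, or panel ID with a stable keyed token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"email": "respondent@example.com", "zip": "90210", "age": 42}
record["respondent_token"] = pseudonymize(record.pop("email"))
print(record)
# Note: zip and age are quasi-identifiers. Even with the email tokenized,
# they may still make the record re-identifiable, which is why the legal
# definition of de-identification is stricter than "we hashed the email."
```

That gap between engineering intuition and the statutory definition is exactly why "how you think of it may not be how the law considers it."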

Melanie Courtright: Well, and speaking as a researcher for a moment, we have a habit of just wanting to keep data forever. So we're going to have to start thinking more about: what's the shelf life of this data? What's its useful lifetime? You could keep data for five years, but the way the world is changing, data that's five years old is often not that useful. So we can also think about utility.

Stuart L. Pardau, Esq.: And to add to that calculation, Melanie, I would say the longer you keep something, the chances of something bad happening are naturally going to increase. I'm not saying it's necessarily going to be a material risk, but you will not have a data breach of certain data if that data doesn't exist, right? But if it's there on your network somewhere, it could happen, and you don't want to deal with that. It's a pain in every single imaginable way. So I would urge people to think about that.
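To make the "purge faster" advice concrete, here is a minimal sketch of a scheduled retention purge, assuming a SQLite table named responses with an ISO-timestamp collected_at column; the table, column, and three-year window are all hypothetical, and the right retention period depends on the client, regulatory, and business factors discussed above.

```python
# A minimal, hypothetical sketch of enforcing a retention policy.
# Assumes a SQLite table "responses" with an ISO-8601 "collected_at" column.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 3 * 365  # illustrative policy only; set per contracts and law

def purge_expired(db_path: str) -> int:
    """Delete responses older than the retention window; return count removed."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute("DELETE FROM responses WHERE collected_at < ?", (cutoff,))
        conn.commit()
        return cur.rowcount  # log this for the record keeping these laws expect

# Run daily from a scheduler (cron or similar), since a written retention
# policy you do not actually adhere to can be worse than having none.
```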

Melanie Courtright: Well, we are out of time. There was one more question for Howard that I want to make sure everybody sees: if the ADPPA passes, will it override the states, or will there end up being conflicts? That's a good question, so why don't you take a second to answer it, and then we need to wrap up.

Howard Fienberg: Yeah, there will always be some conflicts, especially early out of the gate. One of the big-ticket items in fighting over that bill is going to be the extent to which it preempts state laws, and our goal is to preempt as many of them, to as great an extent, as we can. It has a fair number of loopholes for certain state laws that make the trial bar very happy, like the biometric privacy law in Illinois and the data security private lawsuit provisions in California. So, a work in progress.

Melanie Courtright: Great, thank you. Thank you, Stuart. Thank you, Howard. As always, thank you for everything that you do for the profession and for our industry; we really appreciate it. A couple of quick notes for you: our next town hall is in July, and it's on turbocharging your career, with some really great speakers and participants. We have our IDEA forum coming up, August 9th and 10th. It's virtual, and it's all about inclusion, diversity, equity, and access. We have a new report coming out next week, a really important report covering the census work that we did on our profession and the culture of inclusion. And then finally, really important: CRC22 in Manhattan, October 26th to 28th, is going to be a very large hallmark event with some really cool experiences. So make sure to put that on your calendar. You can register now for any of these at insightsassociation.org. We will, again, wrap this up, put a package together, and be in your inboxes, and if you need anything, feel free to reach out to me, melanie@insightsassociation.org, or Howard, or Stuart. Thank you all very much for being here. Have a great weekend.