IA Comments on Risk and Cybersecurity in New California Rulemaking Process

Even as regulations implementing the new California Privacy Rights Act (CPRA) were being finalized, the insights industry had to raise concerns about a new round of rules regarding cybersecurity audits, risk assessments and automated decision-making.

These substantive areas were skipped by the California Privacy Protection Agency (CPPA) in the initial rulemaking (the results of which were made immediately effective on March 30, 2023).

In comments filed with the CPPA on March 27, the Insights Association called upon the agency to:

  • “In determining what processing presents a ‘significant risk’ to consumers’ privacy or security, use a clearer, more concise approach than the European Data Protection Board’s Guidelines on Data Protection Impact Assessment”;
  • “Limit the cybersecurity audit and risk assessment requirements to firms that meet one of the first two prongs of the CCPA’s ‘business’ definition”; and
  • “Limit processing which presents a ‘significant risk’ to processing which occurs on a regular basis or a minimum number of times per year” or, alternatively “to processing of at least 100,000 records.”

The California Chamber of Commerce also filed comments with the CPPA:

  • “For cybersecurity audits, the regulations should (A) promote interoperability and align with the processes and goals set forth in existing legal frameworks or recognized standards and (B) afford a business the flexibility to use a risk-based approach, including by tailoring cybersecurity audits to the size and complexity of the business and the nature of the data and processing activity, and to conduct thorough audits internally”;
  • “Regarding privacy risk assessments… regulations should (A) prioritize compatibility with existing privacy statutes and (B) align with requirements in the statutory text and related requirements in the CCPA”; and
  • “With respect to automated decisionmaking rights, regulations should: (A) define automated decisionmaking to promote coherence across legal frameworks, (B) clarify that certain automated decisionmaking is not subject to opt-out and access rights, and (C) permit a business to provide meaningful information about automated decisionmaking through its privacy policy or similar disclosures, without revealing trade secrets.”

Read IA’s comments in PDF or in full below.

The Insights Association (“Insights”) submits the following comments on proposed rulemaking related to cybersecurity audits, risk assessments, and automated decision making, per the invitation of the California Privacy Protection Agency (the “Agency”).

Representing more than 900 individuals and companies in California and more than 7,200 across the United States, Insights is the leading nonprofit trade association for the market research[1] and data analytics industry. We are the world’s leading producers of intelligence, analytics and insights defining the needs, attitudes and behaviors of consumers, organizations, employees, students and citizens. With that essential understanding, leaders can make intelligent decisions and deploy strategies and tactics to build trust, inspire innovation, realize the full potential of individuals and teams, and successfully create and promote products, services and ideas.

The California Privacy Rights Act (“CPRA”) is going to have a profound impact on the business community, including the market research and data analytics industry. Small and medium-sized research firms in particular will face tremendous costs in updating and expanding on their already-extensive compliance efforts in connection with the California Consumer Privacy Act of 2018 (“CCPA”). Accordingly, and on behalf of our members, we commend your decision to seek input and are grateful for the opportunity to comment.

1. In determining what processing presents a “significant risk” to consumers’ privacy or security, use a clearer, more concise approach than the European Data Protection Board’s Guidelines on Data Protection Impact Assessment (the “Guidelines”).

On page 5 of the Agency’s invitation for comments, the Agency asks about the benefits and drawbacks of following the Guidelines.

The Guidelines include nine different criteria for determining what processing operations are “likely to result in a high risk”; namely, (1) evaluation or scoring, (2) automated decision-making, (3) systematic monitoring, (4) sensitive data, (5) data processed on a large scale, (6) matching or combining datasets, (7) data concerning vulnerable data subjects, (8) innovative use or new technological or organizational solutions, and (9) when the processing itself prevents data subjects from exercising a right or using a service or contract.

We respectfully suggest that the Agency’s adoption of a similar approach, entailing the application of so many different factors, would produce a nebulous and ultimately unhelpful analysis that creates more problems than it solves.

The Guidelines stipulate that “[i]n most cases, a data controller can consider that a processing meeting two criteria would require a data protection impact assessment (DPIA) to be carried out,” and that “[i]n some cases,” a single criterion will be sufficient. It is not clear, however, how much weight should be given to each criterion, or whether there are any meaningful thresholds for individual criteria. Is the processing of a hundred records of sensitive data enough to qualify under criterion #4? A thousand? Ten thousand? How many datasets have to be matched or combined to trigger criterion #6? How much data concerning vulnerable data subjects is sufficient under criterion #7? These are the types of questions the Guidelines do not answer.
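By way of illustration only, the Guidelines’ two-criteria rule of thumb can be sketched as a simple checklist. The sketch below is a hypothetical Python fragment of our own construction, not anything drawn from the Guidelines; the criterion names track the nine factors listed above, but the Guidelines supply no thresholds to fill in, which is precisely the ambiguity we describe:

# Hypothetical sketch: one way a business might try to operationalize the
# Guidelines' "two or more criteria" heuristic. The criterion names track the
# nine factors above; nothing here reflects official guidance.

EDPB_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_use_or_new_technology",
    "prevents_exercise_of_right_or_service",
}

def dpia_likely_required(criteria_met: set[str]) -> bool:
    """Apply the rule of thumb: meeting two or more criteria usually requires
    a DPIA; a single criterion suffices only 'in some cases' left undefined."""
    unknown = criteria_met - EDPB_CRITERIA
    if unknown:
        raise ValueError(f"Unrecognized criteria: {unknown}")
    return len(criteria_met) >= 2

# Does processing a hundred records of sensitive data count as both
# "sensitive data" and "large scale"? The Guidelines never say, so the
# outcome turns on a judgment call the business must make on its own.
print(dpia_likely_required({"sensitive_data"}))                                    # False
print(dpia_likely_required({"sensitive_data", "matching_or_combining_datasets"}))  # True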

While the Guidelines do include some “examples of processing” purporting to illustrate the application of possible relevant criteria, these examples do not make the analysis any clearer. Accordingly, we strongly urge the Agency to implement clearer, more concise standards for what constitutes “significant risk” so that businesses have more meaningful guidance about whether they are subject to the cybersecurity audit and risk assessment requirements.

2. Limit the cybersecurity audit and risk assessment requirements to firms that meet one of the first two prongs of the CCPA’s “business” definition.

On pages 4 and 8 of the Agency’s invitation for comments, the Agency asks “What else should the Agency consider to define the scope of cybersecurity audits?” and “What else should the Agency consider in drafting its regulations for risk assessments?”

As the Agency is aware, there are three different ways for an organization to be defined as a “business” under CCPA: (1) annual gross revenues in excess of $25 million; (2) buying, selling, or sharing the personal information of at least 100,000 consumers or households; or (3) deriving 50 percent or more of its annual revenues from selling or sharing personal information.
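To illustrate, and purely hypothetically (the field names and structure below are of our own invention, and the statutory text controls), the three prongs amount to a straightforward check:

# Hypothetical sketch of the three "business" prongs described above;
# illustrative only, not a restatement of the statute.
from dataclasses import dataclass

@dataclass
class Firm:
    annual_gross_revenue: float                    # U.S. dollars
    consumers_bought_sold_or_shared: int           # consumers or households per year
    revenue_share_from_selling_or_sharing: float   # 0.0 to 1.0

def is_ccpa_business(firm: Firm) -> bool:
    prong_1 = firm.annual_gross_revenue > 25_000_000
    prong_2 = firm.consumers_bought_sold_or_shared >= 100_000
    prong_3 = firm.revenue_share_from_selling_or_sharing >= 0.50
    return prong_1 or prong_2 or prong_3

# A small research firm can qualify solely under prong 3 despite modest
# revenue and low processing volume.
small_firm = Firm(annual_gross_revenue=2_000_000,
                  consumers_bought_sold_or_shared=8_000,
                  revenue_share_from_selling_or_sharing=0.60)
print(is_ccpa_business(small_firm))  # True, via the third prong only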

Because the third prong is not tied in any way to business size or processing volume, it sweeps in a substantial number of small and medium-sized firms in the market research and data analytics industry. Firms that are subject to the CCPA solely on the basis of this third prong should be exempt from costly cybersecurity audits and risk assessments.

To comply with these requirements, small businesses will likely have to hire outside expertise and incur considerable expense relative to the size of their enterprises. Because the cybersecurity audit and risk assessment requirements are already premised on processing that presents a “significant risk” to consumers’ privacy or security, we believe limiting them as we propose would allow the Agency to accommodate the interests of small businesses without hampering the opt-out rights of California consumers.

Alternatively, the Agency could limit the cybersecurity audit and risk assessment requirements using thresholds lower than those in the CCPA’s “business” definition (e.g., applying them only to firms with at least $15 million in annual revenue or at least 50,000 records), to protect the smallest businesses from overly onerous regulatory requirements.

3. Limit processing which presents a “significant risk” to processing which occurs on a regular basis or a minimum number of times per year.

In addition to limiting “significant risk” scenarios as described above, the Agency could also clarify that such processing must occur on a regular basis, or at least with some minimal frequency, to trigger the audit and risk assessment requirements. Requiring a cybersecurity audit and risk assessment solely on the basis of one, two, or a handful of isolated instances of processing deemed to present a “significant risk” in a given year does not meaningfully further the spirit of the CCPA, and it imposes unnecessary burdens on small businesses in particular.

4. Limit processing which presents a “significant risk” to processing of at least 100,000 records.

Alternatively, we suggest the Agency could incorporate some numerical trigger into what constitutes “significant risk” processing. For example, this number could track the figure in the CCPA’s “business” definition of 100,000 records, or the Agency could select some lower number. In any case, the underlying statutory language of the CCPA counsels in favor of some such numerical limit. The statute contemplates “significant risk to consumers’ privacy or security,” language which connotes larger concerns of aggregate risk, not every isolated presentation of risk to any individual consumer or small group of consumers.
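Either limit could be expressed as a simple numerical trigger. The sketch below is hypothetical Python, and the particular figures (twelve instances per year, 100,000 records) are merely the sorts of thresholds discussed above, not proposed rules; whether the Agency adopts one limit, the other, or both is of course a policy choice:

# Hypothetical sketch of the frequency and volume limits discussed in
# sections 3 and 4 above. The default thresholds are illustrative only.

def meets_frequency_limit(instances_per_year: int, min_instances: int = 12) -> bool:
    """Treat processing as presenting a 'significant risk' only if the risky
    processing recurs with some minimal regularity."""
    return instances_per_year >= min_instances

def meets_volume_limit(records_processed: int, min_records: int = 100_000) -> bool:
    """Treat processing as presenting a 'significant risk' only if it reaches
    a minimum aggregate volume of records."""
    return records_processed >= min_records

# A handful of isolated instances touching a few thousand records would not
# trigger the audit and risk assessment requirements under either limit.
print(meets_frequency_limit(3), meets_volume_limit(5_000))       # False False
print(meets_frequency_limit(52), meets_volume_limit(250_000))    # True True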

Conclusion

We hope the above comments will be useful to you and your team, and we are happy to entertain any questions or concerns you may have about the market research and data analytics industry.

Again, we appreciate the opportunity to comment.

 

[1] Market research, as defined in model federal privacy legislation from Privacy for America, is “the collection, use, maintenance, or transfer of personal information as reasonably necessary to investigate the market for or marketing of products, services, or ideas, where the information is not: (i) integrated into any product or service; (ii) otherwise used to contact any particular individual or device; or (iii) used to advertise or market to any particular individual or device.” See Part I, Section 1, R: https://www.privacyforamerica.com/overview/principles-for-privacy-legislation-dec-2019/

About the Author

Howard Fienberg

Based in Washington, DC, Howard is the Insights Association's lobbyist for the marketing research and data analytics industry, focusing primarily on consumer privacy and data security, the Telephone Consumer Protection Act (TCPA), tort reform, and the funding and integrity of the decennial Census and the American Community Survey (ACS). Howard has more than two decades of public policy experience. Before the Insights Association, he worked in Congress as senior legislative staffer for then-Representatives Christopher Cox (CA-48) and Cliff Stearns (FL-06). He also served more than four years with a science policy think tank, working to improve the understanding of scientific and social research and methodology among journalists and policymakers. Howard is also co-director of The Census Project, a 900+ member coalition in support of a fair and accurate Census and ACS. He has also served previously on the Board of Directors for the National Institute for Lobbying and Ethics and the Association of Government Relations Professionals. Howard has an MA in International Relations from the University of Essex in England and a BA (Honors) in Political Studies from Trent University in Canada, and has earned the Certified Association Executive (CAE), Professional Lobbying Certificate (PLC) and Public Policy Certificate (PPC) credentials. When not running advocacy for the Insights Association, Howard enjoys hockey, NFL football, sci-fi and horror movies, playing with his dog, and spending time with family and friends.
