
Dechert Cyber Bits

 

Issue 53 - April 18, 2024


CISA Issues Proposed Cyber Incident Reporting Rules for Critical Infrastructure Sectors for Public Comment

On April 4, 2024, the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (“CISA”) published a 447-page Notice of Proposed Rulemaking (“Proposed Rules”) in accordance with the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (“CIRCIA”). The Proposed Rules mark the first set of comprehensive cybersecurity regulations for critical-infrastructure sectors by the federal government. The public comment period closes June 3, 2024. Under CIRCIA, the final rules must be published by September 2025.

CISA says it will use the reported information to streamline assistance and responses to cyberattacks by identifying real-time patterns, filling information gaps, quickly deploying resources in response, and informing others who may be targeted and/or affected. While individual reports will remain confidential, the Proposed Rules would require CISA to publish “aggregated, anonymized observations, findings, and recommendations” in public quarterly reports.

The Proposed Rules would require “covered entities” in the 16 critical infrastructure sectors to report: (a) “substantial” cyberattacks within 72 hours; (b) ransom payments made in response to ransomware attacks within 24 hours; and (c) any substantially new or different information discovered that relates to a previously submitted report. Covered entities would submit the required information to CISA through a website to be released concurrently with the final rule; in the interim, CISA intends to develop detailed, streamlined reporting guidelines. The information required is expected to be more detailed and technical than the broad disclosures currently required in public filings under the SEC’s reporting requirements. All told, the Proposed Rules would cover an estimated 300,000+ entities.

Takeaway: The short reporting windows CISA is proposing would increase pressure on entities operating in critical infrastructure sectors, which would need to commit technical resources to determining whether reporting obligations have been triggered while simultaneously responding to cyberattacks. In addition, this is one of the first laws that would require the reporting of a ransom payment, which will be a factor for companies weighing whether to pay. Companies should assess whether they are subject to the Proposed Rules and consider taking this opportunity to submit written comments to inform the direction and substance of any final rule.


California Privacy Protection Agency Issues its First Enforcement Advisory

On April 2, 2024, the California Privacy Protection Agency (the “Agency”) issued its first Enforcement Advisory (the “Advisory”). The Agency stated that, by periodically highlighting certain provisions of the California Consumer Privacy Act, as amended (“CCPA”), it hopes such advisories will encourage voluntary compliance. While the advisories are not legally binding, the Agency noted that they signal how its interpretation of the CCPA will inform its enforcement priorities.

This Advisory focuses on the CCPA’s data minimization requirement. It reminds covered entities that the data minimization principle applies to all personal information collected, used, retained, or shared by a business, including any personal information collected pursuant to consumer requests received under the CCPA. Per the Advisory, businesses should collect consumers’ personal information “only to the extent that it is relevant and limited to what is necessary in relation to the purposes for which it is being collected, used, and shared.”

Using hypothetical examples, the Advisory suggests a four-part inquiry for businesses to apply when determining whether requests for additional personal information are consistent with the data minimization principle: (1) what is the minimum amount of personal information necessary for the business to achieve its purpose; (2) is additional information truly necessary in light of any personal information already in the business’s possession; (3) what are the possible negative impacts of collecting such personal information; and (4) should additional safeguards be in place to address those potential negative impacts.

The Advisory also states that covered entities appear to be collecting more data than necessary when responding to consumer requests. Along these lines, the Agency reminded businesses that they are not required to conduct identity verification to process opt-out requests, and that while identity verification is required to respond to deletion requests, any additional personal information collected for verification purposes should be limited.

Takeaway: This Advisory makes clear that data minimization is an immediate area of focus for the Agency. Although the Advisory is not legally binding, covered entities should consider whether revisions to any of their applicable compliance policies and procedures are necessary. Covered entities should consider monitoring for the issuance of future Advisories to benchmark their compliance efforts against the Agency’s stated enforcement priorities.


UK Data Regulator to Emphasize Children’s Online Privacy in 2024

The UK Information Commissioner’s Office (“ICO”) recently outlined its strategy for the upcoming year under the Children’s Code of Practice, urging social media and video-sharing platforms to enhance children’s online privacy protections. Building upon the progress since the Code’s implementation in 2021, the strategy sets out that the ICO will focus on the following for the coming year:

(i) ensuring that the default settings for children’s profiles are “private” and that geolocation settings are “off”;

(ii) ensuring that children are not profiled for targeted advertisements absent a compelling reason;

(iii) how content recommendation algorithms (otherwise known as “recommender systems”) may create pathways to harmful content and encourage children to spend longer on a platform than they otherwise would, thereby providing platforms with further personal data; and

(iv) how online services verify the age of users and obtain parental consent to use the information of children under 13 years old.

Finally, the ICO also stated that it intends to undertake audits of the development, provision and use of Ed Tech solutions in schools.

Takeaway: The ICO’s stated focus on social media and video-sharing platforms means that those platforms will want to review the compliance steps they have taken to date, assess areas for improvement in light of the ICO’s pronouncements, and prepare for intensified scrutiny.


White House Issues Policy Regarding Federal Agencies’ Use of AI

On March 28, 2024, the White House Office of Management and Budget (“OMB”) issued a policy (the “Policy”) intended to simultaneously: (i) support federal agencies in optimizing their use of artificial intelligence (“AI”) to benefit the public; and (ii) establish measures to mitigate AI-related risks.

According to the Policy, by December 1, 2024, federal agencies using AI must establish procedures to assess, test, and monitor the impact on the public of particular AI uses (essentially, uses that can affect the rights or safety of citizens). The procedures will need to mitigate any risk of discrimination resulting from the use of algorithms and provide transparency regarding the agency’s use of AI (including by maintaining an inventory of cases in which AI was employed). If an agency identifies risks from its AI uses, it must take steps to mitigate them; if those measures fail to sufficiently mitigate the identified risks, the agency must stop using the AI in question “as soon as it is practicable.”

The Policy also directs agencies to expand or “upskill” their staff with respect to “AI talent,” including by designating “Chief AI Officers” to coordinate the use of AI across agencies and establishing AI Governance Boards. The Policy makes clear that agencies should address procurement-specific risks in the AI context.

After the December 1 deadline, agencies that do not comply with the Policy will not be able to continue using AI absent proper justification. Justifications might include a conclusion that discontinuing use would increase risks to citizens’ safety or rights or would create an “unacceptable impediment to critical agency operations.”

Takeaway: The Policy represents a major development for federal agencies in their use of AI and is likely to be a relevant benchmark for companies providing AI services and technology to government agencies. The Policy may also influence federal agencies’ assessment of future AI-related regulations and may be a harbinger of things to come for the private sector.


U.S. FTC Releases 2023 Privacy and Data Security Update

On March 28, the United States Federal Trade Commission (“FTC”) released its Privacy and Data Security Update for 2023 (the “Report”). The Report highlights the FTC’s work over the past year in policy, rulemaking, and enforcement actions related to protecting consumer privacy and promoting data security.

In the enforcement space, the FTC noted that in the last year it brought 97 privacy cases and 169 Telemarketing and CAN-SPAM cases. The former included key cases in the areas of artificial intelligence, health privacy and security, geolocation tracking, children’s privacy, data security, and credit reporting and financial privacy.

As it relates to rulemaking, the FTC highlighted its breach notification amendment to the GLBA Safeguards Rule, which established breach notification requirements for financial institutions. The Report also noted the FTC’s proposed modifications to strengthen and modernize the Health Breach Notification Rule and to increase protections for children’s data under COPPA; the proposed rulemakings in both areas have moved past the notice-and-comment period.

Lastly, the Report identified additional steps the FTC is taking to promote consumer privacy and data security, including issuing policy statements; conducting studies and filing reports; hosting workshops, town halls, and roundtables; and providing guidance to consumers and businesses through new educational materials (in both English and Spanish).

Takeaway: The FTC continues to maintain its focus on the collection and processing of consumer data, particularly sensitive information. The Report highlights what the FTC views to be its key achievements this past year, and companies should take note of the focus areas of the cases described above.


Dechert Tidbits

National Data Privacy Bill—the American Privacy Rights Act

On April 7, 2024, Democratic Senator Maria Cantwell (WA) and Republican Representative Cathy McMorris Rodgers (WA) announced that they had reached an agreement on a bipartisan national data privacy bill. The draft bill (the “American Privacy Rights Act”) would restrict the consumer data that technology companies can collect and would give Americans the right to: (1) prevent the sale of their personal information; and (2) compel its deletion. The bill is a “discussion draft”; it has not yet been formally introduced in either chamber, and some key lawmakers have already weighed in on changes they would like to see made before passage.

Florida Passes Bill Providing Data Breach Immunity

On March 5, 2024, the Florida legislature passed the Cybersecurity Incident Liability Act (HB 473), which would immunize from lawsuits companies that have suffered a data breach, provided that the company (1) “substantially compl[ies]” with Florida’s data breach notification law; and (2) maintains a cybersecurity program that “substantially aligns” with certain industry standards or legal requirements (and is kept up to date with changes to the applicable industry standard, framework, or law). The proposed law is intended to incentivize the adoption of cybersecurity measures that protect personal information by mitigating the costs of data breach class action lawsuits. The bill, one of the first of its kind (if not the first), is expected to become law upon approval by the governor, and companies would receive immunity for any claim filed on or after that date.

The U.S. and UK Ink First Partnership on AI Safety

On April 1, 2024, the United States and United Kingdom signed a Memorandum of Understanding (“MOU”), effective immediately, which will see the countries work together to develop tests for the most advanced AI models, following through on commitments made at the AI Safety Summit last November. Under their MOU, the UK’s AI Safety Institute (AISI) and its U.S. counterpart will develop a common approach to AI safety testing using the same methods and underlying infrastructure, explore employee exchanges, share information, and perform at least one joint testing exercise on a publicly accessible model. The MOU also commits the signatories to developing similar partnerships with other countries.

New Maryland Privacy Bill Imposes Stricter Protections, Including Data Minimization

On April 8, 2024, the Maryland General Assembly passed the Maryland Online Data Privacy Act of 2024 (“MODPA”) to limit data collection by companies and provide consumers with meaningful data privacy and security protections. If enacted, the bill would take effect on October 1, 2025. MODPA includes specific data minimization provisions for businesses that control or process the personal data of at least 35,000 Maryland consumers, or that derive more than 20% of their gross revenue from selling the personal data of at least 10,000 Maryland consumers. It also prohibits the sale of sensitive data, bans targeted advertising to minors and the sale of minors’ data, and includes universal opt-out and anti-discrimination provisions. MODPA provides a 60-day right to cure at the Maryland Attorney General’s discretion, but that option will sunset in 2027.


We are honored to have been recognized in The Legal 500 2023 and Chambers USA 2023, to have been nominated by The American Lawyer, together with our client Flo Health, Inc., for the Best Client-Law Firm Team award, and to have been named Law360 Cybersecurity & Privacy Practice Group of the Year! Thank you to our clients for entrusting us with the types of matters that led to these recognitions.


Recent News and Publications



Dechert Cyber Bits Partner Committee

Brenda R. Sharton
Partner, Chair, Privacy & Cybersecurity
Boston
brenda.sharton@dechert.com

Vernon L. Francis
Partner, Senior Editor
Philadelphia
vernon.francis@dechert.com


"Dechert has assembled a truly global team of privacy and data security lawyers. The cross-practice specialization ensures that clients have access to lawyers dedicated to solving a range of client’s legal issues both proactively and reactively during a data security related crisis or a litigation."

"The privacy and security team collaborates seamlessly across the globe when advising clients."
- Quotes from The Legal 500, 2023

Dechert’s global Privacy & Cybersecurity practice provides a multidisciplinary, integrated approach to clients’ privacy and cybersecurity needs. Our practice is top ranked by The Legal 500, and our partners are well-known thought leaders and sought-after advisors in the space with unparalleled expertise and experience. Our litigation team provides pre-breach counseling and handles all aspects of data breach investigations, as well as the defense of government regulatory enforcement actions and class action litigation, for clients across a broad spectrum of industries. We have handled over a thousand data breach investigations of all types, including ransom/cyber extortion, vendor/supply chain, and DDoS attacks, brought by threat actors of all kinds, from nation-states to organized crime to insiders. We also represent clients holistically through the entire life cycle of issues, providing sophisticated, solution-oriented advice and counseling on cutting-edge data-driven products and services, including trend forecasting, personalized content, and targeted advertising across sectors, under such key laws as the CCPA, CPRA and other state consumer privacy laws; Section 5 of the FTC Act; the EU/UK GDPR and e-Privacy Directive; and cross-border data transfer requirements. We also conduct privacy and cybersecurity diligence for mergers and acquisitions, financings, corporate transactions, and securities offerings.

View Previous Issues