Dechert Cyber Bits

Issue 95 - May 7, 2026


Risk Analysis or Risk It All: OCR’s Four Ransomware Settlements Reinforce Centrality of HIPAA Security Rule Compliance

On April 23, 2026, the U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) announced settlements with four entities (collectively, the “Settlements”) to resolve separate investigations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) Security Rule, each arising from a ransomware attack. In each case, OCR alleged that the entity failed to conduct an accurate and thorough risk analysis as required by the HIPAA Security Rule. OCR also alleged that two of the entities had impermissibly disclosed protected health information and that another had failed to notify affected individuals within 60 days. The four entities involved in the Settlements are:

Under the Settlements, the entities have each agreed to implement corrective action plans subject to OCR monitoring for two years and have made payments to OCR ranging from $225,000 to $375,000. None of the entities admitted wrongdoing in connection with its settlement.

In announcing the Settlements, OCR emphasized the importance of the HIPAA Security Rule, noting that compliance with it is a regulated entity’s best opportunity to prevent or mitigate the harmful effects of a successful cyberattack. OCR further recommended that covered entities and business associates take affirmative steps to identify where electronic Protected Health Information (“ePHI”) resides within their organizations, conduct and regularly update risk analyses, implement audit controls and authentication mechanisms, encrypt ePHI in transit and at rest, and provide workforce members with regular, role-specific HIPAA training.

Takeaway: The Settlements underline a structural paradox at the heart of HIPAA enforcement: the very act of complying with HIPAA’s breach notification requirements can trigger regulatory consequences that prove more enduring than the underlying attack itself. Each of the four settling entities was the victim of a criminal act. Each did what the statute demanded, disclosing the breach to OCR, and each was rewarded with a costly, multi-year investigation culminating in a fine and a corrective action program. The Settlements warrant close attention not merely for reaffirming that risk analysis is a non-negotiable compliance obligation, but for what they signal about the maturation and expanding reach of OCR’s enforcement apparatus. In addition, the diversity of the settling entities makes clear that no category of HIPAA-regulated entity occupies a safe harbor, and that business associates face the same OCR enforcement exposure as covered entities. Despite the overall deregulatory environment at the federal level, OCR remains active, and companies subject to HIPAA need to maintain appropriate cybersecurity controls and conduct timely and comprehensive risk analyses under the HIPAA Security Rule.


FTC Outlines a Quieter, Lighter-Touch Approach to AI Regulation at FTC Oversight Hearing; Addresses Key Consumer Protection Matters

On April 15, 2026, Federal Trade Commission (“FTC” or the “Agency”) Chairman Andrew Ferguson and Commissioner Mark Meador testified before the Senate Commerce Committee (“Committee”) in an oversight hearing (the “Oversight Hearing”) covering the Agency’s enforcement priorities, its approach to artificial intelligence (“AI”), and various consumer protection matters. Chairman Ferguson stressed that the FTC should not serve as a “general, all-purpose AI regulator,” and that he views the Agency’s role as limited to core matters such as targeting the use of AI to facilitate fraud and addressing misleading or deceptive practices regarding AI-powered tools. Further, Chairman Ferguson criticized comprehensive AI regulatory frameworks, such as those adopted in the European Union, citing a confidential FTC analysis of compliance costs that he characterized as “a recipe for killing innovation.”

The Oversight Hearing took place with only two of the FTC’s five seats occupied, following President Trump’s removal of the Agency’s two Democratic commissioners—a matter currently pending before the U.S. Supreme Court that we discussed in Cyber Bits Issue 75, Issue 74, and Issue 73. Throughout the Oversight Hearing, Democratic members of the Committee raised concerns regarding the FTC’s independence from the White House, while Chairman Ferguson and Commissioner Meador maintained that the Agency has received no direct instructions from the White House on any FTC enforcement matter.

Beyond AI, the Commissioners also addressed consumer protection enforcement matters, including efforts to combat deceptive fees, enforce the Better Online Ticket Sales (“BOTS”) Act, and prepare for enforcement of the TAKE IT DOWN Act, which takes effect on May 19, 2026, and requires online platforms to remove nonconsensual intimate images. The Agency has also signaled heightened scrutiny in targeted sectors (most notably health care and life sciences), where the recent formation of a cross-functional Healthcare Task Force and a series of high-profile enforcement actions reflect a strategic reallocation of resources toward coordinated, sector-specific oversight.

Takeaway: The Oversight Hearing reinforces the current FTC leadership’s stated preference for a restrained regulatory posture with respect to AI, where the Agency appears intent on confining its role to fraud and deception enforcement rather than pursuing broader rulemaking. However, companies should not mistake this lighter-touch approach to AI for a general pullback in FTC enforcement activity. With respect to AI, the FTC appears to be moving away from prescriptive, horizontal rulemaking in favor of aggressive, vertically focused enforcement actions grounded in existing statutory authority. The near-term compliance landscape may therefore involve fewer new rules but no less regulatory risk, particularly where AI intersects with consumer-facing claims, pricing transparency, or health care delivery.


CalPrivacy Regulator Indicates Agency Hopes to Begin Compliance Reviews in 2026; Highlights Key Areas of Focus

The California Privacy Protection Agency (“CalPrivacy” or the “Agency”) hopes to begin conducting audits under the California Consumer Privacy Act (“CCPA”) this year, according to a Law360 interview with CalPrivacy’s Executive Director Tom Kemp.

In February 2026, CalPrivacy announced the creation of its Audits Division, which has been tasked with building a team capable of reviewing written responses and testing systems and applications for CCPA compliance. According to Mr. Kemp, such audits may be announced (or not) and may encompass businesses, service providers, contractors, and/or specific high-risk areas. Findings from audits may be referred to the Agency’s Enforcement Division, which has been very active recently, including settlements with: (i) American Honda Motor Co., as discussed in Cyber Bits Issue 73; (ii) Ford Motor Co., as discussed in Cyber Bits Issue 93; and (iii) Tractor Supply Co., as discussed in Cyber Bits Issue 84.

With respect to other matters in the Agency’s purview, Mr. Kemp indicated that CalPrivacy would adopt a “bite-sized” approach to future rulemaking, soliciting preliminary public comment before drafting regulations, and noted that the current comment period for data broker audit requirements ends on May 7, 2026. Mr. Kemp also highlighted the Agency’s continued enforcement of California’s Delete Act, noting that more than 170,000 consumers had registered for the Delete Request and Opt-Out Platform (“DROP”) within its first month of operation, ahead of data brokers’ August 2026 compliance deadline. Mr. Kemp also noted that CalPrivacy is sponsoring two bills in the California Legislature: (i) S.B. 923, which would expand consumer deletion rights; and (ii) A.B. 2021, which would establish whistleblower protections for privacy law violations.

Takeaway: CalPrivacy’s “audit” authority has always been a bit unusual—privacy regulators do not typically exercise an auditing power distinct from their enforcement powers, and the CCPA neither expressly requires companies to comply with these audits nor details how the Agency may exercise its audit authority. As a result, it remains to be seen how the activities of the new Audits Division will diverge from those of the Agency’s Enforcement Division. Still, Mr. Kemp’s latest comments reflect the Agency’s commitment to building out an audit function, presenting one more way that companies may come under scrutiny for their CCPA compliance activities.



Seventh Circuit Holds BIPA Amendment Limiting Damages Applies Retroactively

A new decision interpreting Illinois’s Biometric Information Privacy Act (“BIPA”) has potentially significant implications for companies’ litigation exposure. BIPA prohibits private entities from collecting, capturing, disclosing, or otherwise disseminating a person’s biometric identifiers without informed consent. Under BIPA, plaintiffs can recover $1,000 in statutory damages for negligent violations and $5,000 for intentional or reckless violations. A 2024 legislative amendment clarified that repeated collection of the same biometric information, from the same person, in the same manner, constitutes a single violation. The Seventh Circuit’s recent ruling in Clay v. Union Pacific Railroad Company, No. 25-2185 (7th Cir. Apr. 1, 2026), held that the 2024 amendment applies retroactively, rejecting the majority view among the lower courts and significantly mitigating potential exposure for pre-2024 conduct.

By way of background, in 2023 the Illinois Supreme Court held that under BIPA a claim accrues with “every scan or transmission” of biometric data. Cothron v. White Castle System, Inc., 216 N.E.3d at 926 (Ill. 2023). The court acknowledged, however, that BIPA Section 20, the statutory damages provision, could yield “punitive and astronomical damage awards” under a per-scan accrual theory, and invited the legislature to “make clear its intent regarding the assessment of damages under the Act.” Id. at 929. In response, the Illinois General Assembly amended Section 20 by adding two clauses clarifying that any entity that collects biometric information “in more than one instance” from “the same person…using the same method” has committed “a single violation,” entitling the aggrieved person to “at most, one recovery under this Section.” Pub. Act 103-0769, 2024 Ill. Laws 6788-89 (2024); 740 ILCS 14/20(b), (c). However, the amendment did not specify whether the change applied retroactively to violations that occurred before August 2, 2024, or only prospectively, leading to a split among federal district courts on the question, which we covered in Cyber Bits Issue 67 and Issue 74.

In Clay, the Seventh Circuit consolidated three interlocutory appeals presenting that question. The financial stakes were significant: absent retroactive application, one defendant alone faced potential exposure of $7.5 million in statutory damages, and another case carried a risk of billions in class-wide damages. Clay, No. 25-2185, at *6. The Seventh Circuit’s primary reasoning was that, under Illinois law, changes to remedial schemes are deemed procedural and, thus, presumptively retroactive. The court also noted that the 2024 BIPA amendment was passed in direct response to Cothron, in which the state supreme court had invited the legislature to clarify its intent regarding damages, suggesting that the legislature understood itself to be clarifying, rather than changing, the law.

Takeaway: The Seventh Circuit’s retroactivity holding significantly caps per-person compensation for alleged BIPA violations that occurred before August 2, 2024 (BIPA carries a five-year statute of limitations). BIPA defendants should be mindful, however, that this ruling is a prediction of Illinois law and is not binding on state courts. While this ruling will materially reduce exposure in federal court, where most BIPA cases are litigated, this issue may not be decisively closed until Illinois state courts weigh in.


Dechert Tidbits

Dead on Arrival? Comprehensive House GOP Proposed Data Privacy Bill Introduced

House Republicans on the House Energy and Commerce Committee have introduced the SECURE Data Act, which would establish a comprehensive federal data privacy framework granting consumers rights to access, delete, and control the use of their personal data, and would impose other requirements. The bill is unlikely to pass in its current form due to the lack of bipartisan support.

CalPrivacy Seeks Comments on Rules Regarding Employee Data

CalPrivacy is seeking preliminary comments through May 20, 2026, regarding potential rule changes related to the California Consumer Privacy Act, particularly regarding whether new regulations addressing employee data are necessary and whether current rules regarding disclosures that must be made in privacy policies make those policies confusing or unclear. The Agency is soliciting feedback on a range of topics, including privacy policy disclosures for consumers, challenges businesses face in describing data collection practices and providing opt-out links, differences in notice mechanisms across platforms, and workers’ expectations regarding the collection of their personal data.

Cybersecurity Agencies Publish Agentic AI Secure Adoption Guidance

On May 1, 2026, cybersecurity agencies from the United States, Australia, Canada, and New Zealand released jointly authored guidance, “Careful adoption of agentic AI services,” which discusses cybersecurity risks and challenges associated with adopting agentic AI, as well as best practices for securing agentic AI systems and defending against emerging agentic AI threats. The paper encourages organizations to incorporate AI security into their existing cybersecurity risk management programs and procedures.


In 2025, Dechert’s Cyber, Privacy & AI team achieved top individual and group rankings in The Legal 500 and Chambers USA. Global Chair and Partner Brenda Sharton, a Law360 MVP, and Partner Ben Sadun, a Law360 Rising Star, were recognized for their leadership and contributions to the team’s achievements. The team was also recognized in Law.com’s “Litigators of the Week” column for its recent victory for Flo Health, a matter that showcased the team’s strategic excellence. Thank you to our clients for entrusting us with the types of matters that led to these recognitions.



Content Editors

Sonia Brunstad, Eric Green, and James Smith

Production Editors

J.J. Jones and Austin Mooney

Partner Committee Editor

Timothy C. Blank


Dechert Cyber Bits Partner Committee


Dechert’s global Cyber, Privacy and AI practice provides a multidisciplinary, integrated approach to clients’ privacy and cybersecurity needs. Our practice is top-ranked by The Legal 500, and our partners are well-known thought leaders and sought-after advisors in the space with unparalleled expertise and experience. Our litigation team provides pre-breach counseling and handles all aspects of data breach investigations, as well as the defense of government regulatory enforcement actions and class action litigation, for clients across a broad spectrum of industries. We have handled over a thousand data breach investigations of all types, including ransom/cyber extortion, vendor/supply chain compromise, and DDoS attacks, brought by threat actors ranging from nation states to organized crime to insiders. We also represent clients holistically through the entire life cycle of issues, providing sophisticated, solution-oriented advice and counseling on cutting-edge data-driven products and services, including trend forecasting, personalized content, and targeted advertising, under such key laws as the CCPA, CPRA, and other state consumer privacy laws; Section 5 of the FTC Act; the EU/UK GDPR and e-Privacy Directive; and cross-border data transfer rules. We also conduct privacy and cybersecurity diligence for mergers and acquisitions, financings, corporate transactions, and securities offerings.

View Previous Issues