Key Takeaways
New York S7263’s cross-professional scope and direct private right of action could create greater legal exposure for some companies than analogous laws enacted or considered by other states. Companies developing, deploying, or supporting AI-powered services should consider evaluating chatbot operations and outputs in the context of this developing area of law.
New York Senate Bill S7263 would prohibit a chatbot from “impersonating certain licensed professionals,” such as lawyers, physicians, and engineers. Specifically, the proposed amendment would bar a chatbot from providing “any substantive response, information, or advice or tak[ing] any action” that professional licensing laws already restrict natural persons from undertaking. The restrictions would apply to anyone who “owns, operates or deploys” such a chatbot. Significantly, S7263 would create a private right of action for actual damages and, for willful violations, costs and attorneys’ fees. AI proprietors would be required to provide “conspicuous” notice that the interactions involve AI, but such notice would not operate to waive or disclaim potential liability. The bill advanced to a third reading on the Senate floor on March 4, 2026, and must still pass the Senate, the Assembly, and gubernatorial review before it can be enacted.
While New York is not the first state to target so-called AI “impersonat[ion]” of licensed professions, S7263’s cross-professional scope distinguishes it from analogous laws in several other states. For example, Nevada (AB406), Illinois (HB1806), and Tennessee (SB1580) restrict AI from providing mental and behavioral health services. Likewise, pending bills in Vermont (H.816), New Jersey (A799), Washington (HB2599), and Colorado (HB26-1195) target AI representations of mental health credentials and attach private remedies through state consumer-protection statutes. Like S7263, New Hampshire’s pending SB640 would prohibit using AI to provide services requiring any professional licensure, but unlike S7263, it would not confer a private right of action.
Despite S7263’s progress in the New York State Senate, its future remains uncertain. Virginia’s HB 669 may be instructive. When introduced, that bill similarly proposed to prohibit chatbots from providing responses that would constitute unlicensed practice across many professions, including law, and to create a private cause of action. However, the bill was tabled and did not advance beyond committee.
Even if S7263 is not enacted, AI companies may still be at risk of liability. In January 2025, the FTC entered a consent order against DoNotPay, Inc. requiring the company to stop representing that its AI service “operates like a human lawyer” to “generate[] demand letters and initiate[] cases in small claims court.” The order also required the company to pay $193,000 in monetary relief. See Decision and Order at 3, In re DoNotPay, Inc., FTC Docket No. 232-3042 (Jan. 14, 2025). As this order confirms, representations about AI professional practice already attract federal enforcement under existing consumer-protection authority, and legislation like S7263 could add private litigation to that exposure.