State Privacy Laws: 2026 Changes & Compliance
January 2026
By Axiom Law
As 2026 begins, the privacy compliance landscape continues to evolve as more states enact comprehensive data protection legislation. Businesses face a complex web of requirements across multiple jurisdictions, each with unique provisions governing how personal data is collected and processed and how consumer rights are protected. For in-house legal teams and compliance professionals, staying ahead of these changes is critical to avoiding regulatory penalties and maintaining consumer trust.
In a recent Continuing Legal Education (CLE) course, Axiom Lawyer Ava walked legal professionals through the shifting requirements taking effect in 2026 and beyond. Her insights provide a roadmap for organizations navigating this increasingly complex regulatory environment.
Understanding Personal Data Under State Privacy Laws
Before diving into specific state requirements, it's important to establish what constitutes personal data under these laws. Personal data is broadly defined as any information that can reasonably identify, directly or indirectly, an individual or household. This expansive definition has created some operational challenges for businesses, particularly around the “household” component.
Only 35% of organizations surveyed have added “household” to their definition of personal data across all jurisdictions, even in regions like the European Union where household data is not explicitly included. This raises important questions about operational consistency and whether companies are taking a jurisdiction-by-jurisdiction approach, which can be difficult to implement from a business perspective.
Personal data encompasses a wide range of information, from basic contact details like names and email addresses to more sensitive categories. Many businesses mistakenly believe they don't collect personal data if they only gather business contact information. However, a business email address paired with a name constitutes personal data and falls under state privacy law requirements.
Sensitive personal data requires even more stringent protections. This category includes Social Security numbers, driver's licenses, passport information, biometric data, health information, precise geolocation data, and characteristics like race, ethnic origin, religious affiliation, sexual orientation, and political beliefs. Under most state privacy laws, processing sensitive personal data requires explicit opt-in consent from consumers.
Effective privacy programs start with clear definitions, explicit training, and experienced talent.
New State Privacy Laws Taking Effect in 2026
Comprehensive privacy laws in three states take effect on January 1, 2026: Indiana, Kentucky, and Rhode Island. Each brings unique requirements.
Indiana Consumer Data Protection Act
The Indiana Consumer Data Protection Act applies to for-profit businesses that control or process personal data of at least 100,000 Indiana residents or derive more than 50% of gross revenue from selling the data of 25,000 or more consumers. The law excludes employee and job applicant data, as well as information used in a business-to-business context.
Indiana's business obligations include providing clear and accessible privacy policies detailing the categories of personal data collected and processing purposes. Organizations must obtain opt-in consent before processing sensitive personal data and implement reasonable technical and organizational measures to protect information. The law also requires businesses to establish a consumer appeals process and avoid retaliating against consumers who exercise their rights.
Consumers in Indiana can confirm whether a business is processing their personal data, request access to that data once annually, correct inaccuracies, request deletion of data obtained by the business, and opt out of targeted advertising, data sales, or profiling activities.
Data protection impact assessments (DPIAs) are required for high-risk processing activities, including targeted advertising, data sales, profiling, processing sensitive data, and any activities presenting heightened consumer harm risks.
The Indiana Attorney General has sole enforcement authority, with fines up to $7,500 per violation. The law includes a 30-day cure period that does not sunset, giving businesses an opportunity to remedy violations before penalties are imposed.
Kentucky Consumer Data Protection Act
Kentucky's threshold mirrors Indiana's, applying to organizations that control or process personal data of at least 100,000 consumers or handle data from 25,000 or more consumers while deriving over 50% of gross revenue from data sales. The Kentucky Consumer Data Protection Act includes exemptions for data subject to HIPAA and the Gramm-Leach-Bliley Act, as well as for nonprofit organizations, higher education institutions, and entities assisting law enforcement with insurance-related investigations.
Business obligations under the Kentucky law emphasize data minimization and purpose limitation. Organizations must limit their collection and processing of personal data to what is adequate, relevant, and reasonably necessary for the stated purposes. Like Indiana, Kentucky requires opt-in consent before processing sensitive data and mandates reasonable safeguards to protect personal information.
Consumer rights under Kentucky's law parallel those in Indiana, including rights to access, correct, delete, and port personal data, plus the ability to opt out of targeted advertising, data sales, and certain profiling activities.
DPIAs are required for targeted advertising, data sales, profiling, processing sensitive data, and activities that present heightened risk of consumer harm. The Kentucky Attorney General enforces the law without a private right of action, and penalties can reach $7,500 per violation after a 30-day cure period.
Rhode Island Data Transparency and Privacy Protection Act
Rhode Island takes a slightly different approach with lower applicability thresholds. The law applies to businesses that control or process data of 35,000 or more Rhode Island consumers or handle data from 10,000 or more consumers while deriving over 20% of gross revenue from data sales.
What sets Rhode Island's law apart from many other state privacy laws is its requirement that businesses disclose the specific third parties with whom personal data is shared. This level of transparency goes beyond the general disclosures required in many other jurisdictions and requires careful documentation of data-sharing relationships.
Rhode Island requires opt-in consent before processing sensitive personal data, which includes racial and ethnic data, religious beliefs, health conditions, sexual information, and citizenship status. Processing data from known children must comply with the Children's Online Privacy Protection Act (COPPA), and often requires the support of experienced COPPA lawyers.
Consumer rights in Rhode Island include the standard access, correction, deletion, and portability rights, plus the ability to opt out of targeted advertising, data sales, and profiling. Rhode Island also allows consumers to designate an authorized agent to exercise rights on their behalf.
Businesses must conduct data protection assessments where processing presents heightened risk of harm and implement reasonable safeguards to protect personal data. The law prohibits discrimination against consumers exercising their rights and requires mechanisms for consumers to grant and revoke consent.
A notable feature of Rhode Island's law is the absence of a cure period. Unlike Indiana and Kentucky, businesses face immediate penalties for violations without an opportunity to remedy issues first. The Rhode Island Attorney General handles enforcement without a private right of action.
Many organizations have already updated their privacy policies to disclose third parties to whom they have sold or may sell personal data, demonstrating proactive compliance with Rhode Island's unique requirement.
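For teams assessing where these new laws apply, the sketch below encodes the applicability thresholds of all three 2026 laws as described above. It is a simplified illustration under stated assumptions: the interface and function names are invented for this example, and a real applicability analysis also turns on statutory definitions, residency rules, and exemptions (HIPAA, GLBA, nonprofits, and the like), which are not modeled here.

```typescript
// Simplified applicability screen for the Indiana, Kentucky, and Rhode Island
// laws, using the thresholds described above. Illustrative only; statutory
// exemptions and definitions are not modeled.

interface ProcessingProfile {
  consumersProcessed: number;    // state residents whose data is controlled or processed
  consumersSold: number;         // state residents whose data is sold
  revenueShareFromSales: number; // fraction of gross revenue from data sales (0.0 to 1.0)
}

function lawApplies(state: "IN" | "KY" | "RI", p: ProcessingProfile): boolean {
  switch (state) {
    case "IN": // 100k consumers, or 25k sold plus more than 50% of revenue from sales
    case "KY": // Kentucky's thresholds mirror Indiana's
      return (
        p.consumersProcessed >= 100_000 ||
        (p.consumersSold >= 25_000 && p.revenueShareFromSales > 0.5)
      );
    case "RI": // 35k consumers, or 10k sold plus more than 20% of revenue from sales
      return (
        p.consumersProcessed >= 35_000 ||
        (p.consumersSold >= 10_000 && p.revenueShareFromSales > 0.2)
      );
  }
}
```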
States with Comprehensive Data Privacy Laws
By 2026, 15 states will have active comprehensive data privacy laws. This patchwork of state regulations creates significant compliance challenges for businesses operating across multiple jurisdictions, often requiring the advice of experienced data privacy lawyers. The foundational laws include:
- California Consumer Privacy Act (CCPA) - The nation's first comprehensive state privacy law, which has undergone several amendments and remains one of the strictest
- Virginia Consumer Data Protection Act (VCDPA) - Virginia was an early adopter, following California
- Colorado Privacy Act - Notable for recent expansions in scope
- Connecticut Data Privacy Act - Recently updated with new requirements
- Utah Consumer Privacy Act - Part of the 2023 wave of implementations
- Delaware Personal Data Privacy Act - Took effect in 2025
- Iowa Consumer Data Protection Act - Also implemented in 2025
- Texas Data Privacy and Security Act - Among the larger states to enact protections
In addition to comprehensive privacy laws, many states have enacted specific biometric data laws, including Illinois, Washington, and Texas. Even states with comprehensive privacy laws often include additional provisions specifically addressing biometric data collection and use.
2026 CCPA Amendments and Updates
California continues to refine and expand its privacy law. While the California Consumer Privacy Act established the foundation, subsequent amendments through the California Privacy Rights Act (CPRA) and ongoing regulatory updates have strengthened protections. These complex regulations often require businesses to consult with CCPA lawyers to ensure compliance.
California remains unique among U.S. states in affording employees the same privacy rights as consumers. In fact, 73% of organizations report extending employee privacy rights beyond California to all their employees, regardless of jurisdiction. This approach simplifies operational processes by creating a single standard rather than managing different rights for different employee populations.
California compliance sets the standard in the U.S., and requires sustained expertise to maintain.
Key Updates from Connecticut and Colorado
Connecticut recently updated its data privacy act with several significant changes. The state now requires disclosure if personal data is being used to train large language models, reflecting growing concerns about artificial intelligence and data processing.
A major shift involves the Gramm-Leach-Bliley Act exemption. Connecticut has moved from an entity-level exemption to a data-level exemption. Previously, financial services institutions could claim an entity-wide exemption if they were subject to GLBA, meaning all their data collection activities were exempt from state privacy law requirements. Under the new approach, only data specifically covered by GLBA receives the exemption. For example, a financial services company that operates both consumer banking and non-GLBA business lines can no longer claim blanket exemption for all data. Non-GLBA data now falls under Connecticut's privacy law requirements.
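To make the entity-level versus data-level distinction concrete, the sketch below contrasts the two approaches. The types and function names are hypothetical and serve only to illustrate the logic described above.

```typescript
// Illustration of Connecticut's shift: the exemption is now decided per data
// record rather than per entity. Names here are hypothetical.

interface DataRecord {
  coveredByGlba: boolean; // is this specific data regulated under GLBA?
}

// Old approach (entity-level): if the entity was subject to GLBA,
// all of its data collection activities were exempt.
const entityLevelExempt = (entitySubjectToGlba: boolean, _record: DataRecord): boolean =>
  entitySubjectToGlba;

// New approach (data-level): only records actually covered by GLBA are exempt;
// everything else falls under Connecticut's privacy law requirements.
const dataLevelExempt = (_entitySubjectToGlba: boolean, record: DataRecord): boolean =>
  record.coveredByGlba;
```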
Connecticut also expanded its definition of sensitive personal data to include nonbinary and transgender status, as well as neural data. Neural data refers to information gathered from measuring the activity of an individual's central or peripheral nervous system, including electrical, chemical, or other signals. This reflects growing awareness of emerging technologies that collect physiological information.
The state has placed expanded focus on minors, defining them as individuals under age 18 and requiring heightened protections for their data.
Colorado has implemented similar expansions. The state now defines minors as individuals under 18 and requires opt-in parental consent for targeted advertising, selling, or risky profiling of minors' personal data. Colorado has added precise geolocation and neural data to its definition of sensitive personal data and established stricter notice, consent, and deletion requirements for biometric data.
The Rise of Algorithmic Pricing Disclosure Requirements
New York's Algorithmic Pricing Disclosure Act, which took effect November 10, 2025, represents a new frontier in consumer protection. The law requires any entity domiciled or doing business in New York that uses algorithmic pricing to include a clear and conspicuous disclosure alerting consumers that their private data is being used to set prices.
Algorithmic pricing commonly occurs through mobile apps and loyalty programs where consumers receive customized pricing in connection with discounts. However, the practice can also result in consumers paying higher prices for hotel reservations based on their browsing history or location, or being charged more for consumer goods based on their purchase history.
Many consumers, however, remain unaware of the Algorithmic Pricing Disclosure Act. For retailers and companies with loyalty programs, this represents a significant compliance gap. The law carries civil penalties of $1,000 per violation, and New York Attorney General Letitia James has actively encouraged consumers to file complaints.
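As one illustration of how such a disclosure might be wired into a pricing flow, the sketch below attaches a notice whenever personal data influenced a quoted price. The notice text and function names are hypothetical placeholders, not the statutory disclosure language; actual compliance turns on the act's specific wording and placement requirements.

```typescript
// Hypothetical sketch: attach a consumer-facing disclosure whenever a quoted
// price was influenced by personal data. The notice below is a placeholder,
// not the statutory language required by the New York act.

interface PriceQuote {
  amountCents: number;
  personalDataInputs: string[]; // e.g., ["purchase_history", "geolocation"]
}

const ALGORITHMIC_PRICING_NOTICE =
  "This price was set by an algorithm using your personal data."; // placeholder text

function renderQuote(quote: PriceQuote): string {
  const price = `$${(quote.amountCents / 100).toFixed(2)}`;
  // Only show the notice when personal data actually influenced the price.
  return quote.personalDataInputs.length > 0
    ? `${price}\n${ALGORITHMIC_PRICING_NOTICE}`
    : price;
}
```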
This requirement reflects broader concerns about data protections and the use of consumer information to make automated decisions that affect pricing and access to goods and services.
Biometric Data: An Evolving Compliance Challenge
Biometric data has become a focal point of privacy regulation across multiple states. This sensitive personal data category includes fingerprints, facial recognition data, retinal scans, voiceprints, and other biological identifiers.
Illinois, Washington, and Texas have specific biometric privacy laws with stringent requirements, often requiring advice from biometrics lawyers. However, most states with comprehensive privacy laws also include provisions protecting biometric data and requiring specific disclosures. Organizations must check both comprehensive privacy laws and any standalone biometric laws in states where they operate.
Common uses of biometric data include facial recognition for device login, fingerprint scanning for timesheets and access control, and voice recognition for authentication. Each of these uses triggers notice and consent requirements under various state laws.
Biometric data elevates privacy risk and demands specialized legal experience.
Artificial Intelligence and State Privacy Laws
Artificial intelligence (AI) has become a major focus of privacy regulation, with several states enacting AI-specific laws alongside their comprehensive privacy statutes. California, Illinois, Maryland, Utah, Colorado, Texas, Nevada, and New York all have laws governing AI uses that intersect with privacy protections, often requiring advice from artificial intelligence lawyers.
California requires disclosure when AI likenesses are used for commercial purposes and when AI is employed in hiring decisions. Colorado requires reasonable care for high-risk AI systems to prevent algorithmic discrimination. Illinois mandates disclosure and consent when employers use AI in employment decisions. Maryland requires disclosure for AI use in employment contexts. New York City requires disclosures both for AI used in hiring and for chatbots. Texas has established a comprehensive framework for AI use, prohibiting specific harms like deepfakes and social scoring, which uses AI and data analysis to assign numerical scores to individuals based on their behaviors and characteristics.
The relationship between AI and personal data is fundamental. AI systems use personal data by scraping public information, tracking user activity, and analyzing user-provided input to train models. This raises concerns about misuse of personal data, biased outputs leading to discriminatory practices, and data breaches.
Organizations should develop clear AI policies that employees and consumers can understand. These policies should specify what tools are being used and for what purposes, identify what personal data is in scope and who the data subjects are, and outline safeguards against misuse. Privacy assessments should extend to AI systems, with particular attention to data processing agreements that include robust contract provisions around limitation of liability and indemnification.
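One lightweight way to operationalize such a policy is an internal register of approved AI tools. The record below is a hypothetical sketch reflecting the policy elements named above; every field name is an illustrative assumption rather than a statutory or standard schema.

```typescript
// Hypothetical AI tool register reflecting the policy elements described
// above. Field names are illustrative, not drawn from any statute or standard.

interface AiToolRecord {
  tool: string;                  // e.g., "support chatbot", "resume screener"
  purposes: string[];            // approved uses
  personalDataInScope: string[]; // categories of personal data processed
  dataSubjects: string[];        // e.g., "consumers", "job applicants"
  safeguards: string[];          // measures against misuse
  dpiaCompleted: boolean;        // privacy assessment extended to the AI system
  dpaHasLiabilityTerms: boolean; // limitation-of-liability and indemnification in the DPA
}

const register: AiToolRecord[] = [
  {
    tool: "support chatbot",
    purposes: ["answer product questions"],
    personalDataInScope: ["name", "email"],
    dataSubjects: ["consumers"],
    safeguards: ["human escalation path", "vendor may not train on inputs"],
    dpiaCompleted: true,
    dpaHasLiabilityTerms: true,
  },
];
```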
AI governance succeeds when legal expertise keeps pace with technology.
Consumer Rights Across State Privacy Laws
Despite variations in specific requirements, state privacy laws generally grant consumers similar core rights. These include:
- Right to Know: Consumers can request information about what personal data is being collected, how it's being used, and how it's being shared
- Right to Access: Consumers can obtain a copy of their personal data that a business has collected
- Right to Correction: Consumers can request that inaccurate personal data be corrected
- Right to Deletion: Consumers can request that their personal data be deleted, subject to certain exceptions
- Right to Portability: Consumers can request a copy of their personal data in a portable, usable format
- Right to Opt Out: Consumers can opt out of the sale or sharing of their personal data, its use for targeted advertising, and certain types of profiling
- Right to Revoke Consent: Consumers can withdraw previously given consent for processing sensitive personal data
- Right to Non-Discrimination: Businesses cannot discriminate against consumers for exercising their privacy rights
Some laws also provide a limited private right of action, allowing consumers to sue directly for certain violations, particularly those involving data breaches.
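One way to operationalize these rights is a single intake model that normalizes requests across jurisdictions. The sketch below is a hypothetical starting point: the type names are invented, and the 45-day response window shown is a common statutory default (under the CCPA, for example) but not universal, so deadlines should be confirmed per state.

```typescript
// Hypothetical intake record for consumer rights requests. Names and the
// default response window are illustrative; deadlines vary by state.

type RightRequested =
  | "know" | "access" | "correct" | "delete"
  | "port" | "opt-out" | "revoke-consent";

interface RightsRequest {
  right: RightRequested;
  consumerState: string;     // determines the governing law and deadline
  receivedAt: Date;
  verified: boolean;         // identity verification before fulfillment
  agentAuthorized?: boolean; // e.g., Rhode Island permits authorized agents
}

// Many state laws use a 45-day response window; confirm per statute.
function responseDueDate(req: RightsRequest, windowDays = 45): Date {
  const due = new Date(req.receivedAt.getTime());
  due.setDate(due.getDate() + windowDays);
  return due;
}
```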
Operational Challenges and Compliance Strategies
The proliferation of state privacy laws creates significant operational challenges. Organizations may be subject to HIPAA for certain health data while also subject to state privacy laws for other information. They may operate under GLBA for some entities while state privacy laws apply to others. International operations add another layer of complexity with GDPR and other non-U.S. privacy laws. Strategies for managing this complexity include:
Considering the Business Impact: Implement a privacy framework that demonstrates compliance without being overly burdensome. Privacy by design should be incorporated when onboarding new products and services.
Using Data Protection Impact Assessments Strategically: Many organizations conduct DPIAs on all products and services to avoid the challenge of determining when assessments are required. Creating user-friendly DPIA templates with initial screening questions can streamline the process (a minimal sketch follows this list of strategies). If a business certifies that no personal data is being processed after answering preliminary questions, the assessment can be quickly completed.
Partnering with the Business: Legal and compliance teams should work collaboratively with business units rather than serving as barriers to operations. Assist with completing DPIAs rather than simply mandating them.
Maintaining Robust Contract Frameworks: Ensure consistency across NDAs, MSAs, SaaS agreements, DPAs, and BAAs. Use consistent terminology, such as always referring to “personal data” rather than alternating between “personal data,” “personal information,” and “information.” Include robust indemnification and limitation of liability provisions in master service agreements and, when possible, in data processing agreements as well.
Focusing on Transparency: Provide clear privacy policies and data subject access rights pages. Make it easy for consumers to exercise their rights and understand how their data is being used. A well-designed data subject access rights page can also provide a more controlled, privacy-protective approach.
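Picking up the DPIA screening idea above, here is a minimal sketch of that flow, assuming a handful of yes/no preliminary questions. The question set and return values are illustrative; each state's statute defines the actual triggers for a full assessment.

```typescript
// Minimal sketch of a DPIA screening template: a few preliminary questions
// decide whether a full assessment is needed. Questions and triggers are
// illustrative; each state's statute defines the real DPIA triggers.

interface DpiaScreeningAnswers {
  processesPersonalData: boolean;
  processesSensitiveData: boolean;
  usedForTargetedAdvertising: boolean;
  dataIsSold: boolean;
  usedForProfiling: boolean;
}

function screenDpia(a: DpiaScreeningAnswers): "closed" | "full-assessment-required" {
  // If no personal data is processed, the assessment can be completed quickly.
  if (!a.processesPersonalData) return "closed";
  // Common statutory triggers for a full assessment.
  const triggered =
    a.processesSensitiveData ||
    a.usedForTargetedAdvertising ||
    a.dataIsSold ||
    a.usedForProfiling;
  return triggered ? "full-assessment-required" : "closed";
}
```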
Recent Enforcement Actions and What They Signal
Recent enforcement actions demonstrate that regulators are actively monitoring compliance and willing to impose significant penalties for violations.
The Texas Attorney General filed a lawsuit under the Texas Data Privacy and Security Act against Allstate and one of its subsidiaries, alleging the company embedded a software development kit in third-party apps that collected and sold sensitive driving behavior and location data. The lawsuit claims Allstate failed to provide required notices, obtain consent, and offer opt-outs.
The California Attorney General announced a $1.55 million settlement against a major health information platform for collecting health data, earning revenue through personalized advertising delivered by third-party trackers, and failing to honor consumer opt-outs. The company also failed to honor Global Privacy Control signals, used personal data for purposes beyond what was disclosed, maintained insufficient contracts with vendors, and engaged in deceptive practices related to its cookie consent banner.
A leading social media company has received large GDPR fines for creating risks to data subjects’ fundamental rights and freedoms, demonstrating that international enforcement can have significant financial impact, and consulting a GDPR lawyer is often a good idea.
These actions emphasize several key compliance priorities: honor consumer opt-outs and Global Privacy Control signals, provide required disclosures at collection points, maintain robust vendor contracts, ensure cookie consent mechanisms function properly, and avoid using personal data beyond disclosed purposes.
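On the technical side, honoring Global Privacy Control typically means checking the Sec-GPC: 1 request header that participating browsers send (the signal is also exposed client-side as navigator.globalPrivacyControl) and treating it as an opt-out of sale or sharing. The sketch below shows a framework-agnostic server-side check; the function names are illustrative.

```typescript
// Sketch of honoring the Global Privacy Control signal. Participating
// browsers send the `Sec-GPC: 1` request header; function names here
// are illustrative.

function gpcOptOutRequested(headers: Record<string, string | undefined>): boolean {
  // Normalize lookup, since header casing varies across servers.
  const value = headers["sec-gpc"] ?? headers["Sec-GPC"];
  return value === "1";
}

// Example: suppress third-party advertising trackers (a "sale or sharing"
// of personal data under several state laws) when the signal is present.
function shouldLoadAdTrackers(headers: Record<string, string | undefined>): boolean {
  return !gpcOptOutRequested(headers);
}
```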
Emerging Trends in Privacy Regulation
Several trends are shaping the future of privacy regulation:
Expanded Protections for Minors: More states are defining minors as individuals under 18 and implementing enhanced protections for their data. This includes stricter consent requirements for targeted advertising and data sales involving minors.
Neural Data and Emerging Technologies: As Connecticut and Colorado have demonstrated, states are beginning to regulate new categories of sensitive data related to emerging technologies. Neural data is just one example of how privacy laws are evolving to address technological advances.
AI Integration: The intersection of AI and privacy will continue to generate new regulatory requirements. Organizations using AI must understand how these systems process personal data and ensure compliance with both AI-specific laws and privacy regulations.
Algorithmic Decision-Making: New York's algorithmic pricing law may be a harbinger of increased scrutiny on automated decision-making systems that affect consumers. Transparency requirements around algorithmic processes are likely to expand.
International Alignment: The European Accessibility Act, which took effect June 28, 2025, applies to products and services sold in the EU and requires compliance with the Web Content Accessibility Guidelines (WCAG). By its nature, the act will require disclosure of health data when individuals request accommodations, creating overlap with privacy laws. Companies with international operations should watch for similar convergence between accessibility requirements and privacy protections.
Practical Steps for Compliance
To navigate the complex state privacy law landscape and ensure they’re prepared for future regulations, organizations should proactively:
Audit Data Processing Activities: Understand what personal data is collected, where it's processed, who the data subjects are, and whether sensitive personal data is involved.
Update Privacy Policies: Ensure privacy policies address all applicable state requirements, including the specific third-party disclosures required in states like Rhode Island. The policy must also be accessible, as required by the CCPA.
Implement Consumer Rights Mechanisms: Establish clear processes for consumers to exercise their rights, preferably through a dedicated data subject access rights page.
Conduct Data Protection Impact Assessments: Evaluate processing activities for risk, particularly those involving sensitive data, targeted advertising, profiling, or data sales.
Review and Update Vendor Contracts: Ensure data processing agreements include appropriate safeguards, clearly define responsibilities, and address indemnification and liability.
Train Employees: Provide regular training on privacy requirements, data handling practices, and how to respond to consumer requests.
Monitor Regulatory Developments: Stay informed about new laws, amendments to existing laws, and enforcement actions.
Extend Protections Where Practical: Consider applying the strictest applicable requirements across all operations to simplify compliance. For example, 73% of organizations already extend employee privacy rights beyond California to all employees.
Establish an AI Policy: If using AI systems, develop clear policies about what tools are used, for what purposes, and what safeguards are in place. An experienced artificial intelligence lawyer can help develop policies and frameworks.
Document Everything: Maintain records of DPIAs, consumer requests and responses, data processing activities, and compliance efforts. Documentation provides critical evidence in the event of regulatory inquiries.
State privacy laws are here to stay and will continue to evolve. While the patchwork of regulations creates challenges, organizations that take a proactive, comprehensive approach to compliance can not only avoid penalties but also build consumer trust.