12 Privacy and Data Security Considerations for Businesses Planning U.S. Market Entry in 2026

Key Takeaways

  • Entering the U.S. market can create immediate, unexpected privacy exposure. Companies relying on GDPR or other home-country frameworks often face gaps that could trigger enforcement actions or litigation.
  • State and sectoral privacy laws can materially affect go-to-market strategy. Where and how a company operates in the U.S. directly impacts compliance obligations.
  • Private litigants, not regulators, are often the biggest financial risk. Routine business practices such as sending marketing messages, deploying website tracking tools, or collecting biometric data, as well as data breach incidents, can lead to class actions with significant settlement exposure and defense costs that often exceed regulatory penalties. That said, regulatory penalties for violations of certain privacy and data security laws can be substantial as well.
  • Data strategy decisions pose regulatory risk. Use of AI, data sharing, and cross-border data transfers are generating new waves of state and federal legislation and rulemaking and drawing increasing scrutiny from the DOJ, FTC, and state regulators.

Many non‑U.S. businesses assume that compliance with the European Union’s General Data Protection Regulation (“EU GDPR”) or a similar home‑country law will largely address U.S. requirements, but the U.S. framework is fragmented, highly sector‑ and state‑specific, and generates distinct regulatory and litigation risks that often are not addressed by compliance with home-country laws.

This article highlights 12 areas that in‑house counsel and business leaders should factor into data privacy and security diligence, contract terms, and governance when planning U.S. operations. These areas are not mutually exclusive and, in fact, they often overlap. This list also is not exhaustive. There are discrete federal and state privacy laws regulating a host of other areas not addressed here ranging from motor vehicle records and educational records to video rental records, library records, and loyalty program information, among others.

1. Sectoral Federal Privacy Laws Such as HIPAA and GLBA

While the U.S. does not have an omnibus privacy law, it does have a number of sectoral and issue-specific privacy laws. The Health Insurance Portability and Accountability Act (“HIPAA”) regulations, which regulate covered health‑related entities, and the Gramm-Leach-Bliley Act (“GLBA”), which regulates covered financial institutions, are prominent examples.

HIPAA governs protected health information held by “covered entities” (such as many healthcare providers and health plans) and their “business associates” (a broad array of entities providing services to covered entities involving the processing of protected health information on their behalf). The HIPAA privacy, security, and data breach notification regulations include specific contracting and compliance requirements that covered entities and their business associates must address.

Some states, such as Washington and Nevada, have adopted robust health information privacy laws intended to fill gaps regarding the privacy of consumer health data where the HIPAA privacy rules do not apply. Notably, the Washington law includes a private right of action.

GLBA applies to a wide array of financial institutions, not just banks, and requires specific privacy notices, regulates the sharing of “non‑public personal information,” and imposes information security requirements.

Foreign businesses entering the health or financial services sectors should treat HIPAA and GLBA, if applicable, as primary regulatory regimes, not as mere supplements to home-country requirements.

2. “Comprehensive” State Privacy Laws

The U.S. still lacks a single federal EU GDPR‑style law, but more than 20 states have now enacted “comprehensive” consumer privacy statutes, starting with the California Consumer Privacy Act (“CCPA”), as amended by the California Privacy Rights Act (“CPRA”), and followed by states such as Virginia, Colorado, Connecticut, Texas, and others. California is one of the most operationally demanding states: it created a dedicated privacy regulator (the California Privacy Protection Agency) and is comparatively aggressive with respect to enforcement.

Each state’s law is distinct, but they all typically include privacy notice requirements, consumer rights obligations (for example, access, deletion, correction, opt‑out of “sale,” and opt-out of targeted advertising), purpose limitation concepts, data minimization concepts, and vendor contracting obligations. While these laws apply across sectors, they do not apply to all businesses due to a range of different applicability triggers and exceptions. As a result, the impact of this category of state laws depends on the size and scope of a business’s operations and the states where business will be conducted. A threshold assessment of which state laws actually apply to an entity should therefore be considered a necessary first step in any U.S. privacy strategy.

3. Marketing and Communications Privacy

In the marketing and communications space, the U.S. federal CAN‑SPAM Act and similar state laws set rules for commercial email, including identification requirements, opt‑out mechanisms, and header‑information accuracy. The Telephone Consumer Protection Act (“TCPA”) and parallel state mini‑TCPA statutes heavily regulate telemarketing, SMS/MMS text messaging, and certain automated calling. Do‑not‑call list rules also apply, particularly (but not exclusively) to telemarketing communications. Some of these laws have driven substantial class action litigation.

4. Website Tracking and Video and Call Recording

If a business intends to engage in video or call recording or use of website tracking technologies, additional U.S. laws regulate those activities. Federal and state wiretapping and eavesdropping statutes, as well as call‑recording laws, require one‑party or all‑party consent to record depending on the jurisdiction. Additionally, plaintiffs are increasingly challenging “session replay” and other online tracking technologies, such as cookies and pixels, under these legal frameworks.

For businesses considering U.S. physical retail stores or other locations in the U.S., notices regarding video surveillance also may be required. Businesses entering the U.S. market that plan to engage in these activities should carefully review their practices to address U.S. compliance concerns and mitigate potential risk.

5. Children’s Privacy

There is a mix of federal and state privacy laws that businesses processing personal data about children must consider. U.S. federal law is anchored by the Children’s Online Privacy Protection Act (“COPPA”), which applies to online services that are directed to children under 13 or that knowingly collect personal information from such children. COPPA requires clear notices, verifiable parental consent before most data collection, limits on use and disclosure, and reasonable security, and it is enforced primarily by the U.S. Federal Trade Commission (“FTC”) and state attorneys general.

In parallel, an expanding set of state child‑focused privacy and “online safety” laws (for example, age‑appropriate design‑style codes and teen‑specific protections), some covering minors up to age 18, is imposing additional obligations around profiling, targeted advertising, and default settings for minors, creating a multi‑layered regulatory framework.

6. Artificial Intelligence

New and proposed state artificial intelligence (“AI”) and automated decision‑making laws are proliferating, focusing on AI transparency, data minimization, bias and discrimination risks, and the need for impact assessments where models rely on sensitive personal information or materially affect individuals (for example, in employment, housing, credit, or access to essential services).

Non‑U.S. businesses may need to adapt AI governance programs built around EU GDPR, the EU AI Act, or other laws to address specific U.S. disclosure, consent, notice, and opt‑out expectations, as well as heightened scrutiny of training data, profiling, and the reuse of consumer and employee data for AI purposes.

7. Employee and Applicant Privacy

Businesses entering the U.S. market may be surprised by the patchwork of U.S. employee‑focused rules. The federal Fair Credit Reporting Act (“FCRA”) and similar laws in many states regulate the use of third‑party background screening reports regarding applicants and employees (as well as use of such reports for other purposes). State and local “ban the box,” “fair chance,” and anti-discrimination laws restrict when and how criminal history information can be requested and used during hiring, typically requiring delayed inquiries and individualized assessments. Other state laws restrict the use of credit reports and salary history information as part of the hiring process.

Employers also face state laws regarding lawful off‑duty conduct (for example, protecting certain lawful products or activities), drug‑testing constraints, and restrictions on requesting social media credentials or disciplining employees for lawful online activity, which collectively require careful coordination of global human resources and compliance policies.

8. Biometrics Privacy Laws

Several states have enacted biometric privacy statutes, with Illinois’ Biometric Information Privacy Act (“BIPA”) being the most prominent example that is frequently invoked in private class actions. These laws can apply to technologies such as fingerprint time clocks, facial recognition for physical or logical access, and voiceprints, often requiring informed consent, data retention limits, and secure disposal.

9. Cybersecurity Laws

Data security obligations are increasingly being codified not just as general “reasonable security” requirements, but as more detailed statutory standards and regulatory guidance. Many state privacy laws expressly require appropriate technical, administrative, and physical safeguards tied to the sensitivity and volume of personal data, and some state laws prescribe specific controls, risk assessments, audits, and governance structures (particularly in financial services and critical infrastructure contexts). In addition, California will soon require certain businesses covered by the CCPA to conduct cybersecurity audits and submit certifications. These state rules sit alongside, and sometimes go beyond, federal sectoral requirements (such as those under HIPAA or GLBA).

Businesses that have designed their security programs around GDPR or a single global standard should assess whether state‑specific or sector-specific mandates on security measures such as those regarding encryption, access management, multi-factor authentication, vendor oversight, incident response, board‑level reporting, and regulatory reporting require tailored enhancements for U.S. operations.

10. Data Breach Notification Laws

All U.S. states, plus territories, have data breach notification statutes that impose obligations to notify individuals (and sometimes regulators or credit bureaus) when defined personal information is accessed or acquired without authorization, often subject to specific notification timelines and notice content requirements. These laws differ on the scope of covered entities and covered data, whether they carry risk‑of‑harm exceptions, and whether delays are permitted for law enforcement needs, so multi‑state incidents require coordinated, state‑specific analysis.

In addition, some businesses are subject to federal breach notification rules under regimes such as HIPAA, GLBA, or the U.S. Securities and Exchange Commission (“SEC”) requirements for cybersecurity incident reporting by publicly traded companies. As such, businesses entering the U.S. market should consider developing U.S.-focused breach notification protocols to anticipate their response to a breach of U.S. personal data.

11. Government and Bulk U.S. Sensitive Data Transfer Regulations

Unlike EU GDPR and many national data protection laws, the U.S. has not traditionally regulated the export of personal data from the United States to other jurisdictions. In January 2025, however, the U.S. Department of Justice (“DOJ”) finalized regulations that restrict or prohibit certain “covered data transactions” involving bulk U.S. sensitive personal data or U.S. government‑related data with specified “countries of concern” and “covered persons.” The rule defines “bulk U.S. sensitive personal data” broadly to include categories such as certain personal identifiers, precise geolocation, biometric identifiers, health and financial data, and human ‘omic data when certain thresholds are met within a 12‑month period.

A separate law, enacted in 2024, also restricts the sale or transfer of personal data by third-party data brokers to “adversary countries” or entities under their control, and could apply instead of or in addition to the DOJ regulations. Compliance with these requirements necessitates an understanding of data flows, given that contractual safeguards differ depending on whether the parties involved are U.S. persons, foreign persons, or covered persons under these regulations. These measures apply in addition to any data transfer requirements imposed by home-country data protection laws, such as EU GDPR.

12. Federal and State Unfair or Deceptive Acts and Practices Laws

The FTC and state regulators have long used federal and state prohibitions on unfair or deceptive acts and practices (“UDAP”) to bring actions against businesses that fail to keep their privacy and data security promises, as well as against businesses that engage in unfair privacy and data security practices. While potentially overlooked because UDAP laws are broad prohibitions rather than detailed operational compliance regimes, federal and state regulators have brought hundreds of UDAP cases over the years. To avoid engaging in deceptive practices, a business should ensure that its public privacy and security promises are kept in practice. Unfairness claims, by contrast, do not require an unkept promise and can arise, for example, where inadequate data security practices cause substantial injury that consumers could not reasonably have avoided. As a result, businesses considering U.S. market entry should review their privacy policies, notices, and other promises, as well as their data security practices, from this broader perspective in addition to more specific requirements applicable to their U.S. operations.

Conclusion: Regulatory and Litigation Risks

Many of the privacy and security laws and regulations discussed above provide private rights of action that make the U.S. litigation environment particularly attractive for class action plaintiffs. Meanwhile, federal and state regulators (including the FTC, sectoral regulators, state attorneys general, and specialized bodies such as the California Privacy Protection Agency) actively bring regulatory enforcement actions for privacy and security violations.

Compliance with the EU GDPR or other non‑U.S. data protection frameworks likely will support U.S. compliance efforts, but it is not determinative. Even where U.S. federal and state laws share the same privacy‑protective goals as non‑U.S. privacy laws, U.S. laws can differ significantly on issues such as scope, legal bases, consent standards, notice design and content, automated‑decision rules, and, crucially, private litigation exposure. Non‑U.S. businesses planning to enter or expand in the U.S. market should therefore consider undertaking a targeted U.S. privacy and data use assessment covering consumer, employee, and business-to-business data flows to calibrate governance, contracting, technology, and insurance strategies to this distinct regulatory and litigation landscape.

For assistance with these issues, please contact AGG Privacy & Cybersecurity partners Kevin Coy or Erin Doyle.