Health Insurance Portability and Accountability Act

Last week, the California Privacy Protection Agency (CPPA) announced it will conduct a public investigative sweep of data broker registration compliance under the California Delete Act.

Pursuant to the Act, a “data broker” is “a business that knowingly collects and sells to third parties the personal information of a [California] consumer with whom the business does not have a direct relationship.”

The Federal Trade Commission (FTC) has asserted authority under the Federal Trade Commission Act (FTC Act) to take enforcement action against unauthorized data disclosures. Over the past three weeks, the FTC has used this authority to pursue healthcare companies that disclose their customers’ personal data without permission.

On April 11, the FTC sued Monument, an online alcohol addiction treatment service, alleging that it disclosed customers’ personal health information to third-party advertising platforms without their consent.

*This post was co-authored by Josh Yoo, legal intern at Robinson+Cole. Josh is not admitted to practice law.

Health care entities maintain compliance programs to keep pace with the myriad, changing laws and regulations that apply to the health care industry. Although laws and regulations specific to artificial intelligence (AI) are limited and still in the early stages of development, current law and pending legislation offer a forecast of standards that may become applicable to AI. Health care entities may want to begin monitoring this evolving guidance and integrating AI standards into their compliance programs to manage and minimize this emerging area of legal risk.

Executive Branch: Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence

Following Executive Order 13960 and the Blueprint for an AI Bill of Rights, Executive Order No. 14110 (EO) amplifies the current key principles and directives that will guide federal agency oversight of AI. While still largely aspirational, these principles have already begun to reshape regulatory obligations for health care entities. For example, the Department of Health and Human Services (HHS) has established an AI Task Force to regulate AI in accordance with the EO’s principles by 2025. Health care entities would be well-served to monitor federal priorities and begin to formally integrate AI standards into their corporate compliance plans.

  • Transparency: The principle of transparency refers to an AI user’s ability to understand the technology’s uses, processes, and risks. Health care entities will likely be expected to understand how their AI tools collect and process data and generate predictions. The EO also envisions labeling requirements that will flag AI-generated content for consumers.
  • Governance: Governance applies to an organization’s control over deployed AI tools. Internal controls, such as evaluations, policies, and oversight structures, can help maintain that control throughout the AI tool’s life cycle. The EO also emphasizes the importance of human oversight: responsibility for AI implementation, review, and maintenance should be clearly identified and assigned to appropriate employees and specialists.
  • Non-Discrimination: AI must also abide by standards that protect against unlawful discrimination. For example, the HHS AI Task Force will be responsible for ensuring that health care entities continuously monitor and mitigate algorithmic processes that could contribute to discriminatory outcomes. It will also be important to give internal and external stakeholders equitable participation in the development and use of AI.

National Institute of Standards and Technology: Risk Management Framework

The National Institute of Standards and Technology (NIST) published a Risk Management Framework for AI (RMF) in 2023. Similar to the EO, the RMF outlines broad goals (i.e., Govern, Map, Measure, and Manage) to help organizations address and manage the risks of AI tools and systems. A supplementary NIST “Playbook” provides actionable recommendations that implement EO principles and can help organizations proactively mitigate legal risk under future laws and regulations. For example, a health care organization may uphold AI governance and non-discrimination by deploying a diverse, AI-trained compliance team.

The Connecticut Data Privacy Act (CDPA), which became effective on July 1, 2023, provides Connecticut residents with certain rights over their personal information and establishes responsibilities and privacy protection standards for businesses that process personal information. Notably, the CDPA allows businesses a 60-day cure period to correct violations without penalties through the end of 2024.

On November 13, 2023, Governor Kathy Hochul released proposed cybersecurity regulations applicable to all hospitals located within the state of New York. The Governor has included $500 million in grant funding in her FY24 budget to assist health care facilities with upgrading their systems to comply with the new requirements.

According to the Governor’s press release, the proposed regulations are intended to complement the HIPAA Security Rule by strengthening the protections on hospital networks and systems that are critical to providing patient care.

In October 2022, Advocate Aurora Health notified three million individuals of a data breach resulting from its use of tracking pixels on its website to track website visitor activity. This month, Advocate Aurora Health settled a class action stemming from that breach for $12.25 million.
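
To see why tracking pixels raise disclosure concerns, the sketch below shows the general mechanism: a pixel encodes the page a visitor viewed and a visitor identifier into a third-party image request. The endpoint, field names, and paths here are hypothetical illustrations, not Advocate Aurora Health’s actual implementation.

```javascript
// Hypothetical sketch of how a third-party tracking pixel reports
// page activity. Endpoint and parameter names are illustrative only.
function buildPixelUrl(endpoint, pagePath, visitorId) {
  // The page the visitor viewed and an identifier are placed in the
  // query string of a request to the third party's server.
  const params = new URLSearchParams({ page: pagePath, vid: visitorId });
  return `${endpoint}?${params.toString()}`;
}

// On a patient portal, the page path alone can reveal health information:
const url = buildPixelUrl(
  "https://tracker.example.com/pixel.gif",
  "/portal/appointments/oncology",
  "visitor-123"
);
// The browser would then load this URL as an invisible 1x1 image,
// e.g. <img src={url} width="1" height="1" style="display:none">,
// transmitting the page path and identifier to the third party.
```

Because the requested URL carries the page path, even a simple image request can disclose that a particular visitor viewed, say, an oncology appointment page, which is why regulators treat pixels on patient-facing pages as potential unauthorized disclosures.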

In its breach notification to patients, Advocate Aurora disclosed that the tracking pixels may have transmitted patient information to third parties, including Meta and Google.

State privacy laws are changing rapidly in the U.S. Here are summaries of seven new state laws that have been enacted and go into effect in the next few years. We anticipate that more state legislatures will continue to enact privacy laws to protect consumers due to the absence of a federal privacy law.

Under each of the acts summarized below, consumers will have the right to access their personal data, the right to correct inaccurate data, the right to data portability, the right to have their data deleted, and the right to opt out of targeted advertising. Businesses will be required to practice purpose limitation, maintain data security, obtain consumer consent for data processing, and complete regular data impact assessments. Businesses will be barred from discriminating against consumers who exercise their rights under the law and will be required to secure data processing agreements with service providers. In addition, these laws each exclude financial institutions or their affiliates that are governed by, or personal data that is collected, processed, sold, or disclosed in accordance with, Title V of the Gramm-Leach-Bliley Act; state bodies and agencies; nonprofit organizations; institutions of higher education; national securities associations registered with the SEC; and covered entities or business associates as defined in the privacy regulations of the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA).

On May 17, 2023, the U.S. Department of Health and Human Services’ Office for Civil Rights (OCR) announced a settlement with MedEvolve, Inc. for $350,000. MedEvolve provides practice and revenue cycle management and practice analytics software services to health care entities. The settlement resulted from MedEvolve’s alleged violation of the Health Insurance Portability and Accountability Act (HIPAA) after a server containing protected health information was left unsecured and openly accessible on the internet.

One of the challenging things about HIPAA (Health Insurance Portability and Accountability Act) enforcement is that both the Office for Civil Rights (OCR) and state attorneys general have jurisdiction to assess fines and penalties for HIPAA violations. The old double whammy.

States exercise that enforcement authority sparingly, but New Jersey is putting itself on the map.

Last week, Diabetes, Endocrinology & Lipidology Center Inc. (DELC) of West Virginia reached a $5,000 settlement with the Office for Civil Rights (OCR) over allegations that it failed to provide timely access to a patient’s health records. The OCR alleged that DELC waited more than two years to send a minor’s medical records to their parent.