Some writers (not from my great state of Rhode Island), when discussing the state’s new privacy law, act as though Rhode Island has been behind the times on data privacy and security. I feel a need to explain that this is just not so. Rhode Island is not a laggard when it comes to data privacy.

Rhode Island has had a data privacy law on its books for a long time, though it was not called a privacy law. The Rhode Island Identity Theft Protection Act, enacted in 2015, was designed to protect consumers’ privacy and provide data breach notification. It was later amended to include data security requirements, following in the footsteps of the then-novel Massachusetts data security regulations, making it a one-stop shop for data privacy, security, and breach notification. Still, it did not provide individuals the right to access or delete data and was not as robust as newer data privacy laws. Rhode Island was also an early state to include health information in its definition of personal information requiring breach notification in the event of unauthorized access, use, or disclosure. Many states still do not include health information in the definition of personal information for breach notification purposes.

But just so the record is clear, consumer protection has been in the DNA of Rhode Island’s laws for many years, and the new privacy law was an expansion of previous efforts to protect consumers.

The new privacy law in Rhode Island expands the privacy protections for consumers and is the latest in a wave of privacy laws being enacted in the United States. As of this writing, 19 states have new privacy laws, and Rhode Island makes it 20.

All of the privacy laws are fairly similar, except for California, which is the only state to date that provides for a private right of action in the event of a data breach (with requirements prior to the filing of a lawsuit).

That said, for those readers who will fall under the Rhode Island law and are in my home state, here are the details of the Rhode Island Data Transparency and Privacy Protection Act (RIDTPPA) of which you should be aware:

Continue Reading Rhode Island’s New Data Privacy Law

Wow! It’s hard to believe this blog marks the 400th Privacy Tip since I started writing many years ago. I hope the tips have been helpful over the years and that you have been able to share them with others to spread the word. 

I thought it would be fun to pick 10 (ok—technically, a few more than 10) Privacy Tips and re-publish them (in case you missed them) in honor of our 400th Privacy Tip milestone. We have published tips that are relevant to the hot issues of the time, but some are time-honored. It was really hard to pick, but here they are:

Continue Reading Privacy Tip #400 – Best of First 400 Privacy Tips

On May 9, 2024, Governor Wes Moore signed the Maryland Online Data Privacy Act (MODPA) into law. MODPA applies to any person who conducts business in Maryland or provides products or services targeted to Maryland residents and, during the preceding calendar year:

  1. Controlled or processed the personal data of at least 35,000 consumers (excluding personal data solely for the purpose of completing a payment transaction); or
  2. Controlled or processed the personal data of at least 10,000 consumers and derived more than 20 percent of its gross revenue from the sale of personal data.

MODPA does not apply to financial institutions subject to Gramm-Leach-Bliley or registered national securities associations. It also contains exemptions for entities governed by HIPAA.

Under MODPA, consumers have the right to access their data, correct inaccuracies, request deletion, obtain a list of those who have received their personal data, and opt out of processing for targeted advertising, the sale of personal data, and profiling in furtherance of solely automated decisions. The data controller must provide this information to the consumer free of charge once during any 12-month period unless the requests are excessive, in which case the controller may charge a reasonable fee or decline to honor the request.

MODPA prohibits a controller – defined as “a person who determines the purpose and means of processing personal data” – from selling “sensitive data.” MODPA defines “sensitive data” to include genetic or biometric data, children’s personal data, and precise geolocation data. “Sensitive data” also means personal data that includes data revealing a consumer’s:

  • Racial or ethnic origin
  • Religious beliefs
  • Consumer health data
  • Sex life
  • Sexual orientation
  • Status as transgender or nonbinary
  • National origin, or
  • Citizenship or immigration status

A MODPA controller also may not process “personal data” in ways that violate discrimination laws. Under MODPA, “personal data” is any information that is linked or can be reasonably linked to an identified or identifiable consumer but not de-identified data or “publicly available information.” However, MODPA contains an exception if the processing is for (1) self-testing to prevent or mitigate unlawful discrimination; (2) diversifying an applicant, participant, or customer pool; or (3) a private club or group not open to the public.

MODPA has a data minimization requirement as well. Controllers must limit the collection of personal data to that which is reasonably necessary and proportionate to provide or maintain the specific product or service the consumer requested.

A violation of MODPA constitutes an unfair, abusive, or deceptive (UDAP) trade practice, which the Maryland Attorney General can prosecute. Each violation may incur a civil penalty of up to $10,000, and up to $25,000 for each repeated violation. Additionally, a person committing a UDAP violation is guilty of a misdemeanor, punishable by a fine of up to $1,000, imprisonment of up to one year, or both. MODPA does not allow a consumer to pursue a UDAP claim for a MODPA violation, although it also does not prevent a consumer from pursuing any other legal remedies. MODPA will take effect October 1, 2025, with enforcement beginning April 1, 2026.

Congress is once again entertaining federal privacy legislation. The American Privacy Rights Act (APRA) was introduced by Senate Commerce Committee Chair Maria Cantwell (D-WA) and House Energy and Commerce Chair Cathy McMorris Rodgers (R-WA).

Unlike current laws, the APRA would apply to both commercial enterprises and nonprofit organizations, as well as common carriers regulated by the Federal Communications Commission (FCC). The law would have a broad scope but provide a conditional exemption for small businesses with less than $40 million in revenue and data on fewer than 200,000 consumers. However, this exemption would not apply if the small business transfers data to third parties for value. The APRA would require data minimization, i.e., it would prohibit covered entities from collecting more personal information than is strictly necessary for the stated purpose.

The APRA defines sensitive data broadly as data related to government identifiers, health, biometrics, genetics, financial accounts and payments, precise geolocation, log-in credentials, private communications, revealed sexual behavior, calendar or address book data, phone logs, photos and recordings for private use, intimate imagery, video viewing activity, race, ethnicity, national origin, religion or sex, online activities over time and across third-party websites, information about a minor under the age of 17, and other data the FCC defines as sensitive covered data by regulation. Sensitive data would require affirmative express consent before transfer to third parties. Those meeting the definition of “covered entities” would need to give clear disclosures and easy opt-out options.

Notably, the APRA is a departure from the current federal standard set by the Children’s Online Privacy Protection Act (COPPA), which places the cutoff at 13.

The APRA would require algorithmic bias impact assessments for “covered algorithms” that make consequential decisions. It would also prohibit discriminatory use of data. “Large data holders” and “covered high-impact social media companies” would face additional obligations around reporting, algorithm audits, and designated privacy/security officers.

While privacy professionals across the country will collectively groan at a law other than HIPAA using the term “covered entity,” the simplicity of a single standard rather than the current patchwork of state laws may just be worth the headache of two federal privacy laws using the same term with different definitions. However, it remains to be seen whether the APRA will make it to the floor of Congress. We’ve reported in the past about attempts at a federal standard that ended up stalling in committee.

You can read the full APRA draft here.

Last month, Nebraska passed the Nebraska Data Privacy Act (NDPA), making it the latest state to enact comprehensive privacy legislation. Nebraska joins California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Florida, Delaware, New Jersey, New Hampshire, Kentucky, and Maryland. The law will take effect on January 1, 2025.

The NDPA applies to entities that conduct business in Nebraska or produce products or services consumed by Nebraska residents and that process or sell personal data of Nebraska residents. Similar to other state consumer privacy laws, the NDPA exempts nonprofits, government entities, financial institutions, and protected health information under the Health Insurance Portability and Accountability Act.

Consumers are granted the following rights under the NDPA: rights of access, deletion, portability, and correction, and the right to opt out of targeted advertising, the sale of personal data, and automated profiling. Similar to the California Consumer Privacy Act, the NDPA defines the “sale of personal data” as “the exchange of personal data for monetary or other valuable consideration.” The NDPA also requires businesses to obtain consent before processing consumers’ sensitive data. Sensitive data includes personal data that reveals racial or ethnic origin, religious beliefs, a mental or physical health diagnosis, sexual orientation, or citizenship or immigration status, as well as genetic or biometric data processed to uniquely identify individuals, personal data collected from a known minor, and precise geolocation data.

Lastly, the NDPA will require businesses to conduct Data Protection Impact Assessments for all processing activities that involve targeted advertising, the sale of personal data, some types of profiling, the processing of sensitive data, or that otherwise present a heightened risk of harm to the consumer.

If the NDPA applies to your business, the business is subject to enforcement by the Nebraska Attorney General; however, the law provides a 30-day right to cure violations that does not sunset.

The Federal Trade Commission (FTC) and the California Attorney General teamed up against California company CRI Genetics, LLC, filing a joint complaint against the company alleging that it engaged in deceptive practices when it “deceived consumers about the accuracy of its test reports compared with those of other DNA testing companies, falsely claimed to have patented an algorithm for its genetic matching process, and used fake reviews and testimonials on its websites.” In addition, the complaint alleged that CRI “used ‘dark patterns’ in its online billing process to trick consumers into paying for products they did not want and did not agree to buy.”

CRI agreed to pay a $700,000 civil penalty to settle the action and agreed to change its marketing practices so it does not misrepresent that: its DNA testing product or service is more accurate or detailed than others; it can show the geographic location of ancestors “with a 90 percent or higher accuracy rate”; or it will show “exactly where consumers’ ancestors came from or exactly when or where they arrived,” among other claims.

The proposed order also requires CRI to disclose the costs of products and services to consumers, obtain consent from consumers about how it may share DNA information, and delete the genetic and other information of consumers who received refunds and requested that their data be deleted.

The full FTC unanimously authorized the proposed order, which will now be presented to a federal judge for approval.

We previously reported on the unfortunate data breach suffered by 23andMe last month and its implications. We never imagined how horrible it could be.

According to an October 6, 2023, Wired report, hackers involved with the 23andMe breach had, earlier that week, posted “an initial data sample on the platform BreachForums…claiming that it contained 1 million data points exclusively about Ashkenazi Jews…and start[ed] selling 23andMe profiles for between $1 and $10 per account.”

Several days later, it was reported that “another user on BreachForums claimed to have the 23andMe data of 100,000 Chinese users.”

The implications of posting account information, including potential genetic information of users, for political or hateful reasons are real and happening in real time. According to news reports, the war in Gaza “is stoking antisemitism in the U.S.” and across the world. Preliminary data from the Anti-Defamation League shows a 388% jump in antisemitic incidents in the U.S. since Hamas’ attack on Israel on October 7, 2023.

If you are a 23andMe user, it is important for your safety and well-being to find out whether your genetic data was compromised and posted by extremist threat actors. The Electronic Frontier Foundation published an article, “What to Do if You’re Concerned About the 23andMe Breach,” providing more information about the background of the breach, the selling of information, and what you can do to protect yourself further, including deleting your data.

We have published blog posts before on sharing genetic information and the risks associated with disclosing such sensitive information.

Unfortunately, our concerns have been realized. On Monday, October 9, 2023, 23andMe confirmed that, in the data security incident under investigation, customer profile information shared through its DNA Relatives feature “was compiled from individual 23andMe.com accounts without the account users’ authorization” by threat actors.

Although its investigation continues, 23andMe is requiring “that all customers reset their passwords” and is encouraging “the use of multi-factor authentication (MFA).”

The company recommends that its customers take action to keep their account and password secure. It recommends:

  • Confirm you have a strong password, one that is not easy to guess and that is unique to your 23andMe account. If you are not sure whether you have a strong password for your account, reset it.
  • Enable multi-factor authentication (MFA) on your 23andMe account.
  • Review 23andMe’s Privacy and Security Checkup page, which provides additional information on how to keep your account secure.

We will follow this incident closely. In the meantime, if you or a family member has provided genetic information to 23andMe, you may wish to consider changing your password, telling your family members to do the same, and following 23andMe’s recommendations.

Nevada Governor Joe Lombardo recently signed into law a sweeping and restrictive consumer health data privacy law. It requires covered entities (defined as any person who conducts business in the state or produces or provides products or services targeted to consumers in Nevada) to provide privacy rights to consumers who provide health data to companies not covered by other laws that apply to health care providers.

The broad law defines “consumer health data” as “personally identifiable information that is linked or reasonably capable of being linked to a consumer and that a regulated entity uses to identify the past, present or future health status of the consumer.” The definition includes a long list of data elements: health condition, status, disease, or diagnosis; social, psychological, behavioral, or medical intervention; surgeries; use or acquisition of medication; bodily functions, vital signs, or symptoms; reproductive or sexual health care; gender-affirming care; biometric data; genetic data; precise geolocation data; and any data that is derived or extrapolated from information that is not consumer health data through an algorithm, machine learning, or any other means. It excludes shopping habits and gaming information.

The law requires covered entities to post a privacy policy on the internet describing their practices regarding consumer health data; prohibits an entity from collecting or sharing consumer health data without the affirmative, voluntary consent of a consumer; requires the entity to establish a process for consumers to request access to their data (and an appeal process for responding to such requests); limits the entity’s processing of the data; prohibits entities from selling consumer health data; prohibits geofencing; and prohibits discrimination against individuals for exercising their rights under the law. The law provides for a civil penalty of not more than $5,000 per violation and includes criminal provisions. It becomes effective on March 31, 2024.
