Montana Governor Greg Gianforte has signed SB 351, the Genetic Information Privacy Act (GIPA), which “requires an entity to provide consumer information regarding the collection, use, and disclosure of genetic data; providing for limitations and exclusions; providing for enforcement authority; and providing definitions.”

GIPA covers “genetic data,” defined as:

any data, regardless of format, concerning a consumer’s genetic characteristics, which includes but is not limited to:

(i) raw sequence data that result from sequencing all or a portion of a consumer’s extracted DNA;

(ii) genotypic and phenotypic information obtained from analyzing a consumer’s raw sequence data; and

(iii) self-reported health information regarding a consumer’s health conditions that the consumer provides to an entity that the entity:

      (A) uses for scientific research or product development; and

      (B) analyzes in connection with the consumer’s raw sequence data.

GIPA applies to any entity that offers consumer genetic testing products or services (like 23andMe and Ancestry.com) directly to a consumer, or that collects, uses, or analyzes genetic data. It does not apply to genetic testing that is covered by HIPAA, which would include genetic testing done through a physician or hospital.

GIPA requires covered entities to provide clear notice to consumers about the collection, use, and disclosure of their genetic information through their privacy policy and to obtain “express consent” from the consumer for the collection, use, and disclosure of the genetic data. Separate informed consent is required for the disclosure of genetic data to a third party.

The new law codifies basic notice-and-consent principles for the collection of sensitive data from consumers, who may mistakenly believe that HIPAA applies to genetic information. (See previous Privacy Tip on disclosure of genetic information). This underscores how important it is to read an entity’s website privacy policy before you send a swab to a genetic testing company. GIPA gives the Montana Attorney General authority to enforce the law; remedies include actual damages to the consumer, attorneys’ fees and costs, and up to $2,500 per violation.

Wyoming Governor Mark Gordon signed the Wyoming Genetic Data Privacy Act into law on March 8, 2022. The law goes into effect on July 1, 2022.

The Genetic Data Privacy Act requires any business that collects genetic data from individuals to: (1) provide transparent information to consumers about the collection, use, and disclosure of genetic data before collecting it and (2) obtain express consent from an individual before collecting the genetic data. The Act also places strict restrictions on how genetic data may be disclosed and retained. The law does not apply to covered entities or business associates collecting protected health information under HIPAA.

The law provides consumers with the statutory right to request deletion of their data when it is no longer being used or needed for the purpose for which it was collected. It also provides consumers with a private right of action to seek damages from anyone who violates the Act.

The Attorney General of Wyoming has jurisdiction to enforce the law, which carries penalties of up to $2,500 for each violation, actual damages for consumers who have been harmed, and attorneys’ fees and costs.

I have written about genetic testing kits before, but this subject matter is worth repeating. I find that people don’t always understand the consequences of sending a swab to a genetic testing company. Consumer Reports recently came out with a study led by its Digital Lab experts entitled “The Privacy Problems of Direct-to-Consumer Genetic Testing,” which prompted me to revisit this as a Privacy Tip.

This is always a fun topic during my Privacy Law class, and students are often shocked when we discuss the laws that apply (or don’t apply) to this highly sensitive information.

That said, whatever you decide to do with that genetic testing kit you got for Christmas is your own personal decision. However, before you send it in, you may wish to read the genetic testing company’s privacy policy and the Consumer Reports Digital Lab experts’ report linked above. You may also wish to take into consideration your family members’ privacy, because when you submit your own genetic makeup to a private company, you are also submitting part of the genetic makeup of your whole family, as their information is part of your swab.

Everyone knows how I feel about those home genetic testing kits: most people don’t understand that when they send their DNA to a private company, it is not protected by HIPAA or any other law, and the company can legally use and disclose it, including selling it to other companies. Understand what companies are doing with your genetic data and DNA before you just pop it in the mail to them.

With that said, this week, the U.S. Department of Health and Human Services (HHS) Office of Inspector General (OIG) issued a warning (Alert) to the public about a fraud scheme involving genetic testing. 

According to the Alert, “Scammers are offering Medicare beneficiaries ‘free’ screening or cheek swabs for genetic testing to obtain their Medicare information for identity theft or fraudulent billing purposes. Fraudsters are targeting beneficiaries through telemarketing calls, booths at public events, health fairs, and door-to-door visits.”

It is disturbing that fraudsters continue to prey on our seniors, and this is just another scam targeting them.

The Alert says that if a person agrees to genetic testing, that individual is asked to confirm his or her Medicare information, and receives a cheek swab, an in-person test, or a testing kit in the mail. These tests have not been ordered by a physician and have not been determined to be medically necessary.

The fraudsters then submit a claim to Medicare for reimbursement, and when it is denied, the beneficiary is responsible for paying for the test, “which could be thousands of dollars.”

The Alert gives ways you can protect yourself, including:

  • If a genetic testing kit is mailed to you, don’t accept it unless it was ordered by your physician. Refuse the delivery or return it to the sender. Keep a record of the sender’s name and the date you returned the items.
  • Be suspicious of anyone who offers you “free” genetic testing and then requests your Medicare number. If your personal information is compromised, it may be used in other fraud schemes.
  • A physician that you know and trust should assess your condition and approve any requests for genetic testing.
  • Medicare beneficiaries should be cautious of unsolicited requests for their Medicare numbers. If anyone other than your physician’s office requests your Medicare information, do not provide it.
  • If you suspect Medicare fraud, contact the HHS OIG Hotline.

Please pass this along to the seniors in your life to help protect them from this fraud.

On June 3, 2019, the U.S. Department of Health and Human Services Office of Inspector General (OIG) issued a fraud alert to notify consumers about genetic testing fraud schemes (the Alert). According to the OIG, fraudulent actors are using the provision of free genetic testing kits to obtain Medicare information from unwitting consumers, and then using the stolen information for purposes of fraudulent billing and/or identity theft.

In the Alert, OIG advises consumers to protect themselves by:

  • Not accepting mailed genetic testing kits unless ordered by a physician;
  • Closely scrutinizing any request for Medicare information tied to the offer of free genetic testing;
  • Verifying that your physician approves any requests for genetic testing; and
  • Not providing Medicare information to anyone other than a provider’s office.

That the OIG felt compelled to issue the Alert indicates its level of concern with fraudulent scams perpetrated under the guise of free genetic testing. It is not surprising that as genetic testing advances and the options for such testing proliferate, scammers are seeking to take advantage. The Alert therefore provides a welcome reminder to consumers to closely guard Medicare and other personal information. Health care providers and plans would be well-advised to review the Alert and notify their patients about the rising incidence of this scheme.

The deservedly well-publicized arrest of the Golden State Killer last fall was a coup for law enforcement, and a marvelous use of modern technology. Sequencing the DNA profile of material left by the killer at a crime scene 40 years ago, then scouring publicly available databases for a genetic match, and ultimately making the arrest were strokes of genius by all parties involved.

The question is not “should police have done this?” Of course, yes! Instead, the larger question is two-fold: do people know that their DNA information is going to be shared with government entities, and separately, how are we going to regulate public and private actors seeking to make use of the DNA information currently held by private companies?

To the first question, it should be noted that most DNA databases are (seemingly) transparent with users about how their information could be used. It is both a good business practice and a sound legal strategy to put this information front and center, allowing users to opt IN to the things they want to participate in (including use by law enforcement, medical researchers, genealogists, and the like) as opposed to forcing them to opt OUT of the things that they don’t want to be involved with. Yet even when this is executed perfectly and upheld honestly, it remains a thorny issue: a single user agreeing to any use of their genetic data is making that choice not only for himself or herself, but also for their entire family, often without its members’ knowledge or consent.

As to the second question, it largely remains to be seen how this will be regulated. It would be nice to believe that the massive libraries of genetic information in existence will only ever be used for altruistic purposes, such as catching serial killers or curing diseases. But a failure to acknowledge the potential for misuse would be naïve in the worst of ways, and the fact that we have allowed the industry to get so far ahead of the law is cause for major concern. The only law of note currently in place, the Genetic Information Non-Discrimination Act (GINA), is far too narrow in scope to be a source of comfort; beyond that, the world relies merely on the hope that these companies will act responsibly. And as the FamilyTreeDNA scandal this week has revealed, that hope can be all too easily betrayed.

This post was authored by Kyle Prigmore, a juris doctor candidate at Roger Williams University School of Law. Kyle is not yet admitted to practice law.

I had very interesting conversations with both of my classes in the last week over the sharing of genetic information in the context of learning about the Genetic Information Non-Discrimination Act (GINA). GINA generally prohibits employers and insurers from using genetic information to discriminate in employment or insurance underwriting.

People mistakenly believe that GINA protects the privacy of all genetic information. But it doesn’t. It only applies in very specific instances. When individuals take a swab from the inside of their mouth and send it to private companies for analysis to determine their ancestry or genetic predisposition, they are sending their DNA to a company that is not regulated like a doctor’s office or hospital. If an individual gets DNA testing at a doctor’s office or hospital, the doctor or hospital can perform the analysis, but is then subject to very specific legal restrictions on what it can do with the information, including prohibitions on disclosing it to others or selling it.

Before you send that swab to a private company, take a look at its Privacy Policy so you are fully informed about what the company is doing with the information, to whom it is disclosing it, and to whom it is selling it. Try to determine how the company can aggregate your genetic information with other information and whether it can be disclosed to your life insurer, employer, or law enforcement.

Here are some interesting articles to consider before you send that swab:

https://apple.news/A6vDj8z7GQFe6psTEYRZGTw

https://www.bloomberg.com/news/articles/2019-02-01/major-dna-testing-company-is-sharing-genetic-data-with-the-fbi

https://www.gsk.com/en-gb/media/press-releases/gsk-and-23andme-sign-agreement-to-leverage-genetic-insights-for-the-development-of-novel-medicines/

And you may wish to discuss this decision with the rest of your family, because when you send your genetic information to these companies, you are in effect sending your entire family’s as well without their consent.

House bill H.R. 1313, introduced by Representative Virginia Foxx (R-N.C.), proposes to allow companies to require employees to undergo genetic testing, to allow employers to see the results, and to impose financial penalties on any employees who opt out of the requirement.

The bill, which was before the House Committee on Education and the Workforce, was supported by all 22 Republicans and opposed by all 17 Democrats on the Committee.

Supporters of the bill say the legislation would give employers the ability to offer wellness plans, promote a healthy workforce, and lower health care costs.

Critics say the bill would eviscerate the Genetic Information Non-Discrimination Act (GINA) and the Americans with Disabilities Act (ADA), which specifically prohibit employers from asking for, accessing, or using genetic information for certain actions that are considered discriminatory.

We will be watching this bill closely.

Genetic information is essentially one’s DNA sequence, which contains health and hereditary information about the individual and his or her family. It is at the core of individual privacy, and it also reveals information about family members. As technology advances, genetic testing is easier and cheaper to obtain, and there are numerous companies in the market that offer quick and cost-effective genetic testing. The ethics of genetic testing is outside the scope of this piece, but is interesting in its own right.

Vanderbilt University School of Medicine has announced that it received a four-year, $4 million grant from the National Institutes of Health to establish the Vanderbilt Center for Genetic Privacy and Identity in Community Settings, which will study privacy concerns associated with the use of genomic information.

According to Vanderbilt, the Center will “examine the likelihood that lapses in protecting genomic information allow people to be identified, how people perceive such risks, and how effective legal and policy efforts are in reducing them.” The Center’s goal is to develop policy recommendations about this complex area.

Why should we be concerned about the use of our genomic information? According to the Council for Responsible Genetics (www.councilforresponsiblegenetics.org), one reason the use of genetic information matters is genetic discrimination. The Council documented over five hundred cases in which genetic information was used to deny individuals employment or health or life insurance. The Genetic Information Non-Discrimination Act (GINA) was passed in 2008 to protect individuals from these types of discriminatory behaviors.

Genomic information is used by law enforcement to investigate crimes, and DNA is now being used to exonerate those who have been wrongfully accused and imprisoned. Often, when a person is accused of a crime, he or she is required to submit to a DNA test and has no choice, a practice the U.S. Supreme Court upheld in a 5-4 decision. The FBI’s National DNA Index System (NDIS) is a database populated with DNA samples from crime scenes, from those arrested for crimes, and from those convicted of crimes. It holds millions of samples. And it doesn’t delete the samples of those arrested but not convicted, or of victims. Privacy advocates contend that the DNA samples can be used for other purposes, since there are no rules governing how law enforcement may use the samples once it has them, and that samples collected in a law enforcement setting should have privacy protections over how they are collected, maintained, stored, used, and expunged.

Most concerning are the issues around surreptitious collection of DNA or genomic information. There is no federal law that prohibits surreptitious DNA testing. Some states have enacted legislation prohibiting the use and disclosure of genetic information, but not all. Further, when consumers send off a swab of the inside of their mouth to a private company to perform genetic testing, that company is not covered by HIPAA and, unlike your doctor or hospital, is not prohibited by law from using, selling, or further disclosing the information.

In fact, somewhere in the fine print, the individual usually has given consent allowing the company to use and disclose the information any way it sees fit.

According to the Presidential Commission for the Study of Bioethical Issues’ publication “Privacy and Progress in Whole Genome Sequencing,” one of the greatest concerns about the collection of genomic data is that “[b]ecause whole genome sequence data provide important insights into the medical and related life prospects of individuals as well as their relatives–who most likely did not consent to the sequencing procedure–these privacy concerns extend beyond those of the individual participating in whole genome sequencing…data gathered now may well reveal important information, entirely unanticipated and unplanned for…”

Another privacy concern listed is the potential for unauthorized access to and misuse of information. The example given is someone picking up a discarded coffee cup, sending the cup and the saliva on it to a commercial lab to try to determine the person’s predisposition to a neurodegenerative disease, and then using the results in a custody dispute, or exposing them on social media to embarrass the individual or “adversely affect that individual’s chance of finding a spouse, achieving standing in the community, or pursuing a desired career path” or worse, like blackmail.

You might not have control over some collection of your DNA, but you do have control over whether you give your genomic information to commercial entities. Before you do, consider the impact of sharing your DNA with commercial entities and find out what you are consenting to before you send it. Your genomic information includes information about your family members too, so your decision may affect others. Be educated on how your genomic information will be used, sold, or disclosed before you send it off and consent to its unlimited use. It may affect you or your children in the future.

On May 17, 2024, Colorado Governor Jared Polis signed, “with reservations,” Senate Bill 24-205, “Concerning Consumer Protections in Interactions with Artificial Intelligence Systems” (the Act). The first of its kind in the United States, the Act takes effect on February 1, 2026, and requires artificial intelligence (AI) developers, and businesses that use high-risk AI systems, to adhere to certain transparency and AI governance requirements.

The Governor sent a letter to the Colorado General Assembly explaining his reservations about signing the Act. He noted that the bill “targets ‘high risk’ AI systems involved in making consequential decisions, and imposes a duty on developers and deployers to avoid ‘algorithmic discrimination’ in the use of such systems.” He encouraged the legislature to “reexamine” the concept of algorithmic discrimination in the results of AI system use before the law’s effective date in 2026.

If your company does business in Colorado and either develops or deploys AI systems, your company may need to first determine whether the systems used qualify as high-risk AI systems. A “High-Risk AI System” means any AI system that, when deployed, makes or is a substantial factor in making a consequential decision. A “Consequential Decision” has a material legal or significant effect on the provision or denial of education enrollment/education opportunity, employment opportunity, financial or lending service, essential government service, health care services, housing, insurance, or a legal service.

Unlike other state consumer privacy laws, this Act does not have a threshold number of consumers to trigger applicability. Further, both the Act and the Colorado Privacy Act (CPA) (similar to the California Consumer Privacy Act (CCPA)) use the term “consumers,” but under this Act the term refers to all Colorado residents. By contrast, the CPA defines consumers as Colorado residents “acting only in an individual or household context,” excluding anyone acting in a commercial or employment context. Therefore, businesses that are not subject to the CPA may still have obligations under the Act.

The Act aims to prevent algorithmic discrimination in the development and use of AI systems. “Algorithmic discrimination” means any condition in which the use of an artificial intelligence system results in an unlawful differential treatment or impact that disfavors an individual or group of individuals based on their actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, veteran status, or other protected classification under state or federal law.

What are the requirements of the Act?

For Developers:

  • To avoid algorithmic discrimination in the development of high-risk artificial intelligence systems, developers must prepare a statement describing the “reasonably foreseeable uses and known harmful or inappropriate uses of the system,” the type of data used to train the system, the risks of algorithmic discrimination, the purpose of the system, and the intended benefits and uses of the system.
  • Additionally, the developer must provide documentation with the AI product stating how the system was evaluated to mitigate algorithmic discrimination before it was made available for use, the data governance measures utilized in development, how the system should be used (and not be used), and how the system should be monitored when used for consequential decision-making. Developers are also required to update the statement no later than 90 days after modifying the system.
  • Developers must also disclose to the Colorado Attorney General any known or reasonably foreseeable risks of algorithmic discrimination arising from the system’s intended uses without unreasonable delay, but no later than 90 days after discovery (through ongoing testing and analysis or a credible report from a business).

For Businesses:

  • Businesses that use high-risk AI systems must implement a risk management policy and program to govern the system’s deployment. The Act sets out specific requirements for that policy and program and instructs businesses to consider the size and complexity of the company itself, the nature and scope of the systems, and the sensitivity and volume of data processed by the system. Businesses must also conduct an impact assessment for the system at least annually in accordance with the Act. However, there are some exemptions from this impact assessment requirement (e.g., fewer than 50 employees, does not use its own data to train the high-risk AI system, etc.).
  • Additionally, businesses must notify consumers that they are using an AI system to make a consequential decision before the decision is made. The Act sets forth the specific content requirements of the notice, such as how the business manages known or reasonably foreseeable risks of algorithmic discrimination that may arise from the system’s deployment. If the CPA applies to the business (in addition to the Act), the company must also provide consumers the right to opt out of the processing of personal data by such AI systems for profiling purposes.
  • Businesses must also disclose to the Colorado Attorney General any known or reasonably foreseeable risks of algorithmic discrimination arising from the use of the system no later than 90 days after discovery.

The Act requires developers and businesses who deploy, offer, sell, lease, license, give, or otherwise make available an AI system that is intended to interact with consumers to disclose to each consumer who interacts with the system that the consumer is interacting with an AI system.

Although noting that the Act is “among the first in the country to attempt to regulate the burgeoning artificial intelligence industry on such a scale,” Colorado’s Governor stated in his letter to the legislature that “stakeholders, including industry leaders, must take the intervening two years before this measure takes effect to fine-tune the provisions and ensure that the final product does not hamper development and expansion of new technologies in Colorado that can improve the lives of individuals across our state.” He further noted:

“I want to be clear in my goal of ensuring Colorado remains home to innovative technologies and our consumers are able to fully access important AI-based products. Should the federal government not preempt this with a needed cohesive federal approach, I encourage the General Assembly to work closely with stakeholders to craft future legislation for my signature that will amend this bill to conform with evidence-based findings and recommendations for the regulation of this industry.”

As we have seen with state consumer privacy rights laws, this new AI law may be a model that other states will follow but, based upon the Governor’s letter to the Colorado legislature, we anticipate that there will be additional iterations of the law before it becomes effective. Stay tuned.