This week we are pleased to have a guest post by Robinson+Cole Business Transaction Group lawyer Tiange (Tim) Chen.

On February 28, 2024, the Justice Department published an Advance Notice of Proposed Rulemaking (ANPRM) to seek public comments on the establishment of a new regulatory regime that would restrict U.S. persons from transferring bulk sensitive personal data and select U.S. government data to covered foreign persons.

The ANPRM responds to a new White House Executive Order (EO), issued pursuant to the International Emergency Economic Powers Act (IEEPA), which requires the Justice Department to propose administrative regulations within six months to address potential national security threats arising from cross-border personal and government data transfers.

Covered Data Transactions

Under the ANPRM, the Justice Department may restrict U.S. persons from engaging in a “covered data transaction,” defined as:

  • (a) a “transaction”: acquisition, holding, use, transfer, transportation, exportation of, or dealing in any property in which a foreign country or national thereof has an interest;
  • (b) that involves (1) bulk U.S. sensitive personal data; or (2) government-related data; and
  • (c) that involves (1) data brokerage, (2) a vendor agreement, (3) an employment agreement, or (4) an investment agreement.

Bulk Sensitive Personal Data. According to the ANPRM, the term “sensitive personal data” includes:

(1) specifically listed categories and combinations of covered personal identifiers (not all personally identifiable information), (2) precise geolocation data, (3) biometric identifiers, (4) human genomic data, (5) personal health data, and (6) personal financial data.

Only transactions exceeding certain “bulk” volume thresholds, measured by the number of U.S. persons or U.S. devices involved, will be subject to the relevant restrictions.

Government-related Data. According to the ANPRM, the term means (1) any precise geolocation data, regardless of volume, for any geofenced location within an enumerated list, and (2) any sensitive personal data, regardless of volume, that links to current or former U.S. government, military or Intelligence Community employees, contractors, or senior officials.

Prohibited, Restricted, and Exempted Transactions

The EO and ANPRM propose a three-tier approach that differentiates how transactions are treated under the proposed rules.

Prohibited Transactions. The ANPRM generally prohibits a U.S. person from knowingly engaging in a “covered data transaction” with a country of concern or covered person.

Restricted Transactions. The ANPRM provides that “covered data transactions” relating to a vendor, employment, or investment agreement may be permissible if U.S. persons take adequate security measures consistent with rules to be promulgated by the Cybersecurity and Infrastructure Security Agency of the Department of Homeland Security.

Exempted Transactions. The ANPRM proposes to exempt certain types of transactions, including: (1) data transactions involving personal communications, information, or informational materials carved out by IEEPA, (2) transactions for official government business, (3) transactions related to financial services, payment processing, or regulatory compliance, (4) intra-entity transactions incident to business operations, and (5) transactions required or authorized by federal law or international agreements.

Licensing Regime. The EO authorizes the Justice Department to grant specific licenses (covering entity- or person-specific transactions) and general licenses (covering broad classes of transactions) for U.S. persons to engage in prohibited and restricted transactions. The Justice Department is considering establishing a licensing regime modeled on the economic sanctions licensing regime managed by the Treasury Department’s Office of Foreign Assets Control.

Countries of Concern and Covered Persons

Countries of Concern. The ANPRM proposes to identify China (including Hong Kong and Macau), Russia, Iran, North Korea, Cuba, and Venezuela as the countries of concern.

Covered Persons. The ANPRM proposes to define the “covered persons” as (1) an entity owned by, controlled by, or subject to the jurisdiction or direction of a country of concern, (2) a foreign person who is an employee or contractor of such an entity, (3) a foreign person who is an employee or contractor of a country of concern, and (4) a foreign person who is primarily resident in the territorial jurisdiction of a country of concern. The Justice Department may also designate specific persons and entities as “covered persons.”

Implementation

The regime will only become effective upon the publication of final administrative rules, and the scope of the final rules may differ significantly from the proposals published in the ANPRM. In addition, the EO affords significant discretion to the Justice Department and other agencies to issue interpretive guidance and enforcement guidelines to further clarify and refine the process and mechanisms for complying with the final rules, including potential due diligence, recordkeeping, or voluntary reporting requirements.

The Federal Trade Commission (FTC) and the California Attorney General teamed up against California company CRI Genetics, LLC, filing a joint complaint alleging that the company engaged in deceptive practices when it “deceived consumers about the accuracy of its test reports compared with those of other DNA testing companies, falsely claimed to have patented an algorithm for its genetic matching process, and used fake reviews and testimonials on its websites.” In addition, the complaint alleged that CRI “used ‘dark patterns’ in its online billing process to trick consumers into paying for products they did not want and did not agree to buy.”

CRI agreed to pay a $700,000 civil penalty to settle the action and agreed to change its marketing practices so it does not misrepresent that: its DNA testing product or service is more accurate or detailed than others; it can show the geographic location of ancestors “with a 90 percent or higher accuracy rate”; or it will show “exactly where consumers’ ancestors came from or exactly when or where they arrived,” among others.

The proposed order also requires CRI to disclose the costs of products and services to consumers, obtain consent from consumers about how it may share DNA information, and delete the genetic and other information of consumers who received refunds and requested that their data be deleted.

The full Commission unanimously authorized the proposed order, which will now be presented to a federal judge for approval.

Information governance and data retention have been important topics in the corporate world for years. As an executive, it’s crucial to ensure effective management, storage, and secure disposal of your company’s data. Having well-defined information governance and data retention policies helps maintain compliance with legal requirements and safeguards against data breaches and cyber-attacks. In this post, we’ll share 10 essential strategies for implementing a successful information governance and data retention plan, optimizing your company’s data management.

  1. Develop a Comprehensive Data Retention and Information Governance Plan: The first crucial step in creating an effective plan for data retention and information governance is to develop a comprehensive policy. This policy should clearly define the necessary retention of company data, storage locations, access permissions, disposal procedures, and be regularly updated to comply with legal requirements, changing data practices, and company growth.
  2. Employee Training for Enhanced Data Security: Ensuring the success of your data governance policy requires training all employees to understand their roles and responsibilities in safeguarding company data. Your employees are the first line of defense against cyber-attacks and data breaches.
  3. Prioritize Data Security Measures: Regular security checks and audits are essential to protect your company’s information. Your IT department should review and update security measures to ensure data remains safe. Additionally, having backup and recovery plans in place can mitigate the impact of data breaches and provide a framework for recovery.
  4. Streamline Data Management for Effective Information Governance: Streamlining data management is crucial for efficient information governance. Reduce irrelevant or outdated data by archiving non-essential data, deleting duplicates, and setting clear data and document retention schedules.
  5. Leverage Cloud Technology for Data Storage and Management: The adoption of cloud technology for data storage and management has gained rapid popularity in recent years. Cloud storage solutions offer secure, cost-effective, and scalable methods for storing and accessing data.
  6. Establish a Disaster Recovery Plan: Unforeseen catastrophic events or natural calamities can result in data loss or breaches. Having a robust disaster recovery plan ensures data remains safe and can be retrieved in the event of a system crash.
  7. Implement Data Preservation Procedures: Your Information Governance Policy should outline specific procedures that allow retrieval of archived data when required for litigation or audit purposes. This includes holding certain business records and information in separate preservation formats, protecting this data set from routine backups, and strictly controlling access to it.
  8. Ensure Accountability and Efficiency: The information governance team should be accountable and responsible for documentation management. Provide them with clear job descriptions and access to the necessary information to maximize their efficiency.
  9. Maintain Compliance with Laws and Regulations: Governing authorities have enacted laws and regulations regarding safe storage, data breaches, and access to sensitive information. Organizations must adhere to these legal frameworks, such as the GDPR or the California Consumer Privacy Act, which makes maintaining a compliance program essential.
  10. Regularly Review and Audit: Regularly reviewing and auditing your information governance policy ensures compliance with new regulations, adherence to procedures, and identifies areas for improvement. It also allows identification of document retention periods that have expired and no longer need to be stored.
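To make item 4’s retention schedules concrete, a schedule can be expressed as simple data and enforced programmatically. The sketch below is a minimal, hypothetical illustration: the record categories, retention periods, and `is_expired` helper are all assumptions for demonstration, not guidance on what any particular schedule should contain.

```python
from datetime import date, timedelta

# Hypothetical retention periods by record category, in days.
# Actual periods depend on your legal and regulatory obligations.
RETENTION_DAYS = {
    "invoices": 7 * 365,         # e.g., tax-related records
    "hr_records": 4 * 365,
    "marketing_email": 2 * 365,
}

def is_expired(category, created, today=None):
    """Return True if a record's retention period has elapsed."""
    today = today or date.today()
    period = RETENTION_DAYS.get(category)
    if period is None:
        # Unknown categories are retained pending classification.
        return False
    return created + timedelta(days=period) < today

# Flag records whose retention period has expired as of a given date.
records = [
    ("invoices", date(2015, 1, 1)),
    ("marketing_email", date(2024, 1, 1)),
]
expired = [(c, d) for c, d in records
           if is_expired(c, d, today=date(2025, 1, 1))]
```

A real implementation would also honor litigation holds (item 7) by exempting held records from deletion regardless of schedule.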

Successful information governance and data retention require strict guidelines, consistent enforcement, and centralized storage capabilities. By implementing these 10 essential strategies, executives can prioritize information governance and data retention, ensuring control over and improved access to critical digital assets. While it may seem daunting, it is achievable with the right resources. If you need assistance mapping out your plans, reach out today to discuss how we can help implement a successful information governance and data retention plan for your organization.

The holidays are upon us, including “cyber week” filled with deals for shopping for the holidays. The U.S. Public Interest Research Group (PIRG) is warning shoppers about smart toys this holiday season. In its article, “Consumer watchdog: ‘Smart toys’ put kids’ privacy at risk,” PIRG outlines the privacy risks associated with smart toys. The risks include the use of a camera, microphone, or location tracker embedded in the toy that can collect children’s personal information as well as their families’.

When purchasing a smart toy, I will venture to say that most families do not read the fine print of the privacy policy that accompanies it (or is posted on the manufacturer’s website) to understand what data the smart toy collects during use. Privacy policies are long, detailed, boring, and full of legalese. The devil is definitely in the details, and the details in the privacy policy are what you should read carefully before you gift a smart toy to any child.

Gaining a basic understanding of the data a toy collects from a child is of utmost importance before purchasing it. Read the privacy policy and make informed decisions for the protection of the child who is receiving the gift before you wrap it and give it (since, in most cases, children can’t make those decisions on their own). We all know that as soon as a toy is unwrapped by a child, it is virtually impossible to get it back! Let’s all protect kids’ privacy this holiday season and gift tech toys that are respectful of the collection and use of kids’ data, or give good old-fashioned toys that don’t collect personal information through microphones, cameras, and location-based services.

State and local agencies’ use of unmanned aerial systems (UAS, or drones) has exploded in recent years. In Alaska, the Department of Transportation and Public Facilities (Department) has started using drones for critical infrastructure inspection and avalanche monitoring and mitigation.

With Alaska’s freezing temperatures and expansive geography, drones offer the Department a chance to get a unique aerial view of bridges and other infrastructure. However, the operator typically still has to travel a distance to fly the drone over these structures. To overcome that challenge, the Department introduced UAS “docks”: systems that fully support beyond-visual-line-of-sight drone operations. The docks provide a remote landing area that allows rapid and continuous deployment of drones, significantly reducing operator and personnel time on location. An on-site pilot programs the flight path; the dock then keeps the drone and its battery warm to prevent icing, recharges the device, and uploads data to an interface through a satellite internet connection.

While these docks have certainly expanded the Department’s use of drones in the state, and increased safety and efficiency, there is another challenge that Alaska faces that could be mitigated by drones: avalanches.

Unlike in many other parts of the United States, avalanches present a dangerous, deadly threat to many residents of Alaska. The Department will use docks to deploy drones as a way of monitoring avalanches. The drones will collect snow distribution data without the need for human interaction and provide the data to avalanche specialists who are hundreds of miles away. These specialists will be able to use that data to determine where avalanche mitigation may be necessary. This dock-and-drone combination will save days of travel and the use of a manned aircraft; data can be collected and analyzed in a single day with this model.

Additionally, the FAA granted the Department approval to use a dropping mechanism from the drone that will trigger and redirect avalanches. This new mechanism and drone use will provide a new option for the Department for avalanche mitigation that requires less manpower and dramatically increases safety. Alaska’s dock-drone use case is yet another way that this technology is increasing efficiency and safety and decreasing cost, time, and labor for accomplishing many tasks that formerly took several individuals several days to perform.

This week, the Colorado Supreme Court upheld a criminal conviction that relied in part on evidence obtained pursuant to a warrant for Google search data. People v. Seymour, 2023 CO 53 (Oct. 16, 2023) (available at http://www.courts.state.co.us).

While investigating the cause of an apparent 2020 arson fire at a Denver residence that resulted in the deaths of five people, police served a “reverse-keyword warrant” on Google, requesting information about individuals who had searched for the address of the house. Google initially objected but eventually complied, providing data that allowed investigators to focus on five Google accounts. This ultimately led to the arrest of three teenagers.

One of the defendants had googled the address of the home that caught fire 14 times in the days before the fire occurred. His attorney challenged the keyword warrant as an illegal search and sought suppression of the Google search evidence. This was the first challenge of its kind to the constitutionality of keyword search warrants.

The majority opinion upheld the warrant as having been executed in good faith, cautioning however that its holding was a limited one: “Our finding of good faith today neither condones nor condemns all such warrants in the future [. . .] If dystopian problems emerge, as some fear, the courts stand ready to hear argument regarding how we should rein in law enforcement’s use of rapidly advancing technology.” In reaching its decision, the majority grappled with the sensitive nature of an individual’s Google search history, but concluded that—while there are certainly plenty of “innocuous reasons” that an individual would search for an address—the warrant at issue was sufficiently “particularized” and law enforcement took reasonable steps in utilizing this search technique in this case.

The Court’s ruling was not unanimous. Two Justices dissented on the ground that the ruling “gives constitutional cover to law enforcement seeking unprecedented access to the private lives of individuals not just in Colorado, but across the globe.” The dissenting opinion highlighted a fear that this type of warrant will become “the investigative tool of first resort” and that it is “a tantalizingly easy shortcut to generating a list of potential suspects.” Another Justice concurred in the majority’s judgment affirming the conviction but did not join in the majority’s opinion.

While keyword search warrants remain rare, this case should give readers some pause. Society needs to set clear parameters on the use of advanced search techniques (such as keyword searches and even geofence warrants, i.e., warrants asking Google to provide data about the location of users’ devices near the scene of a crime) in order to balance individual privacy rights against the needs of criminal investigations.

The Federal Aviation Administration (FAA) has been working toward one goal when it comes to drones: get more of them safely in the air. However, right now, the FAA is without a permanent leader or a stable funding package. The Senate Commerce Committee recently held a hearing to consider a nomination for FAA Administrator; hopefully that official appointment will come in the near future. Funding, however, is another issue entirely. The five-year 2018 FAA Reauthorization package expired on September 30, 2023, but authorization was extended until December 31, 2023. This extension allowed employees to keep working, but it does not provide guidelines for the priorities and deadlines in the 2023 FAA Reauthorization package passed by the U.S. House of Representatives in June. The 2023 package is currently sitting with the Senate.

The 2023 package includes funding and authority for rulemaking to clarify beyond-visual-line-of-sight (BVLOS) drone operations, one of the biggest issues in the commercial drone industry today. The package requires the FAA to issue a Notice of Proposed Rulemaking within four months of the bill’s enactment. However, passage of the bill is only the beginning of the BVLOS process for the FAA, which will then need an implementation period and an allocation of resources. We will watch closely to track the progression of the package and its impact on the drone industry in the coming months. Surely the FAA is working toward its goal of more drones safely in the skies, but the timeline for achieving that goal is still unknown.

On September 8, 2023, the California Privacy Protection Agency (CPPA) will discuss two new sets of proposed California Consumer Privacy Act (CCPA) regulations. Here is a breakdown of the two new proposed regulations and the issues up for discussion:

Auditing Requirements: If a business processes data that poses a “significant risk to consumers’ security,” then the business must complete an annual cybersecurity audit using an independent auditing professional and file a statement of compliance with the CPPA. The auditor(s) may be internal, but the findings must be reported to the board. Further, these audits must take into account multifactor authentication, encryption, and zero-trust architecture. The CPPA will discuss the “significant risk” standard at its upcoming meeting.

AI and Automated Decision-Making Risk Assessments: If a business uses AI systems to make decisions, it must conduct regular and thorough risk assessments that consider potential negative impacts to consumers from the use of such technology. The negative impacts could range from economic harm to reputational and psychological harm. Businesses that do any of the following would be subject to these risk assessment requirements:

  • Selling or sharing personal information
  • Processing sensitive personal information
  • Using automated decision-making technology in furtherance of a decision that results in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or contracting opportunities or compensation, healthcare services, or access to essential goods, services, or opportunities
  • Processing the personal information of consumers that the business has actual knowledge are less than 16 years of age
  • Processing the personal information of consumers who are employees, independent contractors, job applicants, or students using technology to monitor employees, independent contractors, job applicants, or students
  • Processing the personal information of consumers in publicly accessible places using technology to monitor consumers’ behavior, location, movements, or actions
  • Processing the personal information of consumers to train AI or automated decision-making technology

If your business is subject to the CCPA and it processes data as set forth in the proposed regulations, you should track these changes closely. If your business has not yet assessed its applicability, now is the time to do so. We will monitor these new regulations closely.

In October 2022, Advocate Aurora Health notified three million individuals of a data breach resulting from its use of tracking pixels on its website to track website visitor activity. This month, Advocate Aurora Health settled a class action stemming from that data breach for $12.25 million.

In its breach notification to patients, Advocate Aurora Health stated that it had used third-party vendors to “measure and evaluate information concerning the trends and preferences of its patients as they use our websites,” which means the health care system shared IP addresses, locations, appointment times, and MyChart communications with these third parties without the necessary consent or another permissible purpose under the Health Insurance Portability and Accountability Act (HIPAA). Upon discovery of this disclosure, Advocate Aurora Health conducted an internal investigation to determine the scope of patient information being transmitted to its third-party vendors.

After the breach notification, many lawsuits were filed and eventually consolidated into a class action complaint. The class action complaint alleged that Advocate Aurora Health’s use of tracking pixels on its website “resulted in the invasion of Plaintiffs’ and Settlement Class Members’ privacy and other alleged common law and statutory violations.”

The $12.25 million settlement will be distributed to class members and to reimburse attorneys’ fees and other expenses. A recent study in Health Affairs found that third-party tracking technologies are being used on 98.6 percent of all U.S. non-federal acute care hospital websites. If your healthcare organization falls into this category, take this settlement and the many other pending pixel class action cases as a reminder to review your website’s use of pixels and other tracking technologies and to update your website privacy policies and data collection practices for compliance.

Earlier this month, the Commissioner of Data Protection of the Dubai International Financial Centre (DIFC), a financial free zone in the United Arab Emirates (UAE), issued the first adequacy decision regarding the California Consumer Privacy Act (CCPA), which recognizes the CCPA as an equivalent to the DIFC Data Protection Law (DIFC Law No. 5 of 2020, as amended, the “DIFC DPL”).

This decision allows businesses to transfer data between the DIFC and companies located in California, in accordance with the DIFC DPL, without any additional contractual measures. In the DIFC Commissioner’s public statement about this decision, he said, “The importance of additional safeguards for imported personal data is evidenced by the factors set out in published adequacy protocols as well as the DIFC Ethical Data Management Risk Index (EDMRI) and due diligence tool. In evaluating California’s privacy law and regulations, together with implementation, enforcement, and other holistic factors, it became clear that in large part, California importers will treat personal data from DIFC ethically and fairly.” This decision will also likely serve as precedent for the DIFC to establish a similar relationship with other U.S. states. As of today, there are only 49 establishments and/or locations (countries, jurisdictions, and organizations) subject to an adequacy decision by the DIFC.

The decision comes as a result of an assessment by the DIFC Commissioner of the grounds for lawful and fair processing of data under the CCPA, the existence of data protection principles and data subjects’ rights, international and onward data transfer restrictions, measures regarding security of processing, and breach reporting and accountability. To read the full decision, click here.

However, since the CCPA does not have a provision related to the transfer of personal information outside of California or the U.S., DIFC exporters that send personal information to a California-based importer under the decision would still need to ensure that the onward transfer of such personal information is safeguarded. Additionally, this decision will be reviewed annually by the DIFC Commissioner to ensure that the CCPA’s protections still meet expectations.