The Cybersecurity & Infrastructure Security Agency (CISA), the FBI, and the U.S. Department of Health & Human Services released a Joint Advisory last week warning organizations, particularly those in the health care and public health (HPH) sectors, of ransomware and data extortion operations by the Daixin Team.

The Advisory is designed to provide information to organizations to help prevent ransomware attacks. According to the Advisory:

The Daixin Team is a ransomware and data extortion group that has targeted the HPH Sector with ransomware and data extortion operations since at least June 2022. Since then, Daixin Team cybercrime actors have caused ransomware incidents at multiple HPH Sector organizations where they have:

  • Deployed ransomware to encrypt servers responsible for healthcare services—including electronic health records services, diagnostics services, imaging services, and intranet services, and/or
  • Exfiltrated personally identifiable information (PII) and patient health information (PHI) and threatened to release the information if a ransom is not paid.

The criminals gain access to victims’ systems through virtual private network (VPN) servers, either by exploiting unpatched vulnerabilities or by using previously compromised credentials (obtained through phishing emails) on VPN servers that do not have multifactor authentication enabled. The Advisory lists indicators of compromise and mitigation steps that organizations can take to protect against Daixin. If your organization is part of the HPH sectors, prompt attention to the Advisory is warranted.

California law will soon require businesses to treat their employees and business partners as consumers under the California Consumer Privacy Act (CCPA). The CCPA and its successor legislation, the California Privacy Rights Act (CPRA), grant California consumers dignitary rights over their personal information collected and processed by commercial entities that do business in California. The CCPA applies to entities that do business in California and collect California consumers’ personal data, and that have annual gross revenues over $25 million, possess the personal information of 100,000 or more consumers, or earn more than half of their yearly income from brokering data.

Employee, Job Applicant and 1099 Contractor Data

Previously, the CCPA excluded employee data; however, this exemption is set to expire on December 31, 2022. The California State Legislature defied expectations by ending the 2022 legislative session without passing an extension. While the legislature may pass a new exemption in its next legislative session, businesses subject to the CCPA should prepare to process employee CCPA requests as of January 1, 2023.

Fortunately, most businesses already have HR processes that allow employees to access and correct their personal data. Existing OSHA and EEOC record-retention requirements will also cover most employee data, meaning that it will likely be exempt from deletion requests under the CCPA (i.e., the business may retain the data to “comply with a legal obligation”). However, companies must now also allow job applicants to know, view, delete, and correct personal information, and EEOC regulations require businesses to retain applicant records for one year. Businesses must keep close track of when that retention obligation ends and allow applicants to delete their data as soon as doing so is legally permissible.

B2B Data

The CCPA also includes an exemption for business-to-business (B2B) data collected from agents or representatives of other businesses. However, this exemption, too, is set to expire on December 31, 2022. As of January 1, 2023, California B2B contacts have the right to know, view, correct, and delete personal information. Some personal information may be exempted as necessary to “complete the transaction for which the personal information was collected, fulfill the terms of a written warranty or product recall conducted in accordance with federal law, provide a good or service requested by the consumer, or reasonably anticipated by the consumer within the context of a business’s ongoing business relationship with the consumer, or otherwise perform a contract between the business and the consumer.” However, companies will need to think outside the box when responding to these requests. Unlike employee and general consumer data, which companies typically collect in centralized systems, B2B data might be scattered across systems tracking emails, contracts, accounts payable, and countless other business processes.

How Can You Prepare?

  • Inventory Your Employee + B2B Data: Businesses should review the information they hold on employees and job applicants (as well as 1099 contractors) to confirm that their privacy notice correctly describes the categories of personal information they collect and process, and to identify any “sensitive personal information” subject to the new CPRA rights. Businesses should pay special attention to B2B data and clearly document which categories of personal data are stored and on which systems.
  • Enter into Data Processing Agreements with Service Providers: Businesses that use third-party HR software such as Workday and ServiceNow should add to their contracts data processing addendums that include the specific terms required by the CCPA. The CCPA requires these agreements with all service providers, including providers that process employees’ personal information.
  • B2B Portals or Websites: If your business collects B2B contact information via a portal or website, you may need to update your privacy policy and include specific provisions required under the CCPA/CPRA.

These are just basic steps. If you haven’t yet assessed whether the CCPA applies to your business, now is the time. And once that assessment is done, you may need to implement a compliance program to avoid fines, penalties, and private actions against your business.

President Biden recently signed an executive order directing the implementation of the new EU-U.S. Data Privacy Framework, which would allow the lawful transfer of personal data from the European Union (EU) to the United States (U.S.) while ensuring a strong set of data protection requirements and safeguards.[1] Once approved by the European Commission (EC), the new Framework would replace the “Privacy Shield” framework, which was invalidated by the Court of Justice of the European Union (CJEU) in the case commonly known as Schrems II.

It is expected that the new Framework will be effective by the end of the first quarter of 2023, after the EC’s review of the executive order and preparation of a draft adequacy decision, the issuance of a non-binding opinion by the European Data Protection Board, a vote of approval of the decision by EU member states, and formal adoption by the EC College of Commissioners.[2] On the basis of the new Framework, EU (and later European Economic Area) businesses would be able to legally transfer personal data to U.S.-based companies that have self-certified to the new Data Privacy Framework list maintained by the U.S. Department of Commerce.

There are three components to the new EU-U.S. Data Privacy Framework: (a) limits on U.S. surveillance programs; (b) sufficient redress mechanisms for individuals to pursue alleged violations; and (c) the Framework’s commercial data protection principles.

In Schrems II, the CJEU declared the Privacy Shield framework invalid because the court found there were insufficient restrictions on U.S. signals intelligence activities and inadequate redress rights for individuals who wanted to challenge what they considered to be unlawful U.S. government surveillance. Biden’s executive order addresses the CJEU’s findings by expressly mandating necessity and proportionality limits on U.S. surveillance programs and including oversight procedures to verify compliance by U.S. intelligence authorities. The executive order also includes other specifics, such as identifying what signals intelligence can be collected, how it can be used and shared, and how long it can be maintained. 

In addition, Biden’s executive order directed the Department of Justice to adopt regulations[3]  to establish a Data Protection Review Court (DPRC) for individuals to challenge U.S. government surveillance activities. The DPRC will be a second level of review in the redress mechanism, the first level being the Civil Liberties Protection Officer (CLPO) of the Office of the Director of National Intelligence.  The CLPO will also provide training to U.S. intelligence authorities and review their compliance with the executive order and U.S. intelligence priorities. The DPRC will independently review the CLPO’s determinations. 

The U.S. Department of Commerce (Department) has authority over the last component of the new Framework: the commercial data protection principles and the self-certification process. The Department is currently updating its requirements for companies to self-certify to the commercial data protection principles. While the CJEU did not question these commercial principles in Schrems II, there will still be some changes to them because references to the 1995 EU Data Protection Directive must be updated to the General Data Protection Regulation (GDPR), which went into effect after the Privacy Shield framework was adopted. These modifications include conforming definitions, such as the definition of personal data, to the GDPR.

Until the new Framework is adopted, U.S. companies should consult with legal counsel to ensure that any transfers of personal data (as defined in the GDPR) to the United States are done in compliance with law.


[1] See https://www.whitehouse.gov/briefing-room/presidential-actions/2022/10/07/executive-order-on-enhancing-safeguards-for-united-states-signals-intelligence-activities/.

[2] See https://ec.europa.eu/commission/presscorner/detail/en/qanda_22_6045.

[3] See https://www.justice.gov/opcl/redress-data-protection-review-court.

Texas Attorney General Ken Paxton filed a lawsuit against Google for alleged “blatant defiance” of Texas’s biometric privacy law, which prohibits capturing biometric identifiers without prior consumer consent. The complaint alleges that several Google products, including Google Home, Nest, and Google Photos, collect and catalog biometric identifiers such as facial structure and voice print.

Texas seeks penalties of $25,000 per count, an injunction enjoining Google from collecting biometric identifiers without Texas consumers’ consent, and an order forcing Google to destroy the biometric data it has already collected. Google plans to fight the suit and claims users were free to turn off biometric features.

Many trucks on the freeway already drive themselves, with an operator on board. These trucks operate smoothly and safely with little to no human intervention: they know when to make space for a vehicle trying to merge and when to slow down for a vehicle pulled over on the side of the road. These automated trucks are very capable drivers. So, shouldn’t we put more of these autonomous big rigs on the road? Yes.

Right now, there is a shortage of some 80,000 truck drivers in the U.S., and it is estimated that the shortage will reach about 160,000 by the end of the decade. That shortage is one piece of the supply chain problems we’ve all been experiencing. Furthermore, businesses can save money and increase safety by using autonomous vehicles (e.g., an autonomous truck won’t fall asleep at the wheel or try to go under a bridge with too low a clearance). Also, unlike city streets, freeways are all fairly similar: a freeway in Massachusetts and a freeway in Minnesota look basically the same. An intersection in San Francisco, by contrast, would be much more challenging to program for and navigate because of the numerous variables involved in the autonomous vehicle’s calculations. If the industry can save money and increase safety as this technology promises, we’ll likely see more autonomous big rigs on our freeways in the near future.

I had a really interesting discussion with my students during class this week about employers’ use of electronic means to monitor employees. When I first started teaching Privacy Law at Roger Williams Law School eight or nine years ago (time flies!), the section on workplace privacy was so brief that I had to combine it with another subject to fill the class time. That was because, back then, there were few laws relevant to employees’ privacy in the workplace: most employment in the U.S. is “at will,” and employees generally don’t have an expectation of privacy in the workplace (with a few obvious exceptions).

That tide is changing, which was very evident during the class discussion this week. Companies have always had the right to monitor employees in the workplace, including as technology advanced, and most employees understand that their employers are monitoring their use of company computers, phones, and key cards. With a more hybrid workforce, employers monitor when employees come into the office and when they log into the VPN, track email traffic, and block access to websites deemed inappropriate. Employees are for the most part aware of this type of monitoring and do not consider it intrusive.

But when the discussion turned to keystroke technology, collection of biometric information, and geolocation monitoring, my students pushed back and the conversation became lively. They found this type of monitoring much more intrusive and unreasonable, blurring the line between information an employer is entitled to and information that creeps into employees’ private lives. They found it offensive that employers would have so little trust in them and felt such employers were not treating them professionally. The prevailing sentiment in the room of twenty-somethings was distrust of employers who would stoop to such means. We debated the legitimate needs of businesses to monitor employee productivity, as well as other laws (such as wage and hour laws) that could complicate employees’ ability to work off-hours, and there was some acknowledgement of those points, but not much. This issue will only get more complicated for companies as the hybrid work model develops and the quest for talent remains challenging.

Affirming my students’ reaction to electronic monitoring, states are now supplementing federal law with statutes that require employers to give notice of (and in some states, obtain written acknowledgement of and consent to) their electronic monitoring of employees. Three states (New York, Connecticut, and Delaware) have adopted laws requiring private employers to provide notice to employees of electronic monitoring, and New Jersey requires employers to provide notice if a tracking device is used in an employee’s vehicle. Other states have similar bills in the works, including California’s proposed Workplace Technology Accountability Act. (See Linn Freedman quoted in the related SHRM article here.)

In addition, the National Labor Relations Board has indicated that it is considering treating electronic monitoring conducted without notice and consent as an unfair labor practice. All of this is to say that workplace privacy laws are popping up with greater frequency than ever before, and employers may wish to follow them closely and consider how certain monitoring technology might create legal compliance obligations as well as affect workforce satisfaction.

Last week, the California Privacy Protection Agency (CPPA) released updated California Privacy Rights Act (CPRA) draft regulations and a summary of the changes. The regulations remain in the proposal stage, and it is unclear when to expect finalized rules, although this version likely reflects near-final requirements and prohibitions.

While most of the changes from the previous incarnation are technical, the modified proposal also softens one of the more revolutionary requirements: universal opt-out signals. Previously, the draft regulations required all CPRA-subject businesses to treat browser-based opt-out settings as the consumer’s signaled choice to opt out, and to add a dynamic icon to their websites indicating whether they had responded to the signal. Under the modified rules, businesses need to respond to browser opt-out signals only if they sell or share personal information, and displaying the status icon is now optional rather than required. Instead, companies can offer consumers choices about the cookies and other tracking technologies used on their websites, which offers greater transparency for the consumer.
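For readers curious what honoring a browser opt-out signal looks like mechanically: one widely discussed universal opt-out signal, the Global Privacy Control (GPC), is transmitted as a `Sec-GPC: 1` request header. The sketch below (illustrative only; the function name and header handling are our own, and this is not legal or implementation advice) shows how a server-side handler might detect the signal:

```python
def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Under the GPC proposal, a browser with the user's opt-out preference
    enabled sends the request header `Sec-GPC: 1`. HTTP header names are
    case-insensitive, so we normalize before comparing.
    """
    for name, value in headers.items():
        if name.lower() == "sec-gpc":
            return value.strip() == "1"
    return False
```

A business that sells or shares personal information could call a check like this on incoming requests and suppress the corresponding tracking or sale for that consumer.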

The modified rules also throw businesses a bone on a few other issues. For example, the CPPA removed some statutory privacy and security requirements for business service providers because the CPRA already requires certain provisions in service contracts. The CPPA reworked other rules to “simplify implementation at this time,” though companies would still be wise to prepare for eventual compliance rather than rushing to meet the end-of-year deadline. Some of these delayed requirements include disclosing in online privacy policies the identities of third-party data processors and controllers, as well as the technical requirements for implementing the “Right to Limit” and financial incentive programs.

The updated rules clarify that enforcement actions against companies that employ “dark patterns,” or interfaces that steer consumers toward opting in (or not opting out), do not require a showing of the business’s intent. Intent is still a “factor to be considered” at the CPPA’s discretion, but offenses in this area impose strict liability on the companies using these technologies. The Board of the CPPA will meet in public sessions on October 28 and 29. See the modified rules and explanations.

The Cybersecurity & Infrastructure Security Agency (CISA) recently issued an Alert outlining the top Common Vulnerabilities and Exposures (CVEs) that have been used by the People’s Republic of China (PRC) state-sponsored cyber actors since 2020.

According to the Alert, these threat actors “continue to exploit known vulnerabilities to actively target U.S. and allied networks as well as software and hardware companies to steal intellectual property and develop access into sensitive networks.” CISA, the National Security Agency (NSA), and the FBI “assess PRC state-sponsored cyber activities as being one of the largest and most dynamic threats to U.S. government and civilian networks.”

The NSA, CISA, and the FBI “urge U.S. and allied governments, critical infrastructure, and private sector organizations to apply the recommendations listed in the Mitigations section and Appendix A to increase their defensive posture and reduce the threat of compromise from PRC state-sponsored malicious cyber actors.”

The Alert lists the top CVEs most used by Chinese state-sponsored cyber actors and provides mitigation tips to apply to reduce the risk of attack, including patching, multi-factor authentication, password and protocol management, upgrading or replacing devices at the end of their useful lives, moving toward a Zero Trust security posture, and enabling robust logging.

PRC attackers are believed to be behind some of the biggest data breaches the U.S. has seen, and they continue to be a major threat to U.S. businesses. Staying abreast of CISA Alerts is helpful in minimizing risk and avoiding becoming the victim of a state-sponsored cyber-attack.

A Dutch court ruled in favor of a Dutch national employed by a U.S. company who was fired for refusing to turn on his webcam. The ruling was part of the employee’s wrongful termination lawsuit against his former employer, Chetu, Inc., a Florida-based software company. Chetu’s company policy requires employees to have their webcams and screens visible on remote viewing software for several hours daily. The employee was fired for “refusal to work” and “insubordination” when he refused to comply and open his webcam cover.

The court sided with the employee, finding that the policy violates Article 8 of the European Convention on Human Rights, which protects the “right to respect for private and family life.” The court ordered Chetu to pay the employee’s legal fees, provide back wages, and pay a fine for violating employment law.

This ruling is a marked difference from U.S. law, which does not generally prevent businesses from observing their remote workers via webcam, email, or screen monitoring. However, the tide may be changing on this issue. The spike in remote work following COVID-19 has inspired some states, including Delaware and New Jersey, to pass laws restricting such heavy-handed remote management.

When the U.K. withdrew from the European Union (EU), its General Data Protection Regulation (GDPR) status was one of many headaches for regulators to figure out. After drawn-out negotiations over points such as requiring opt-in or opt-out models, lawmakers had settled mainly on a GDPR-like solution called the Data Protection and Digital Information Bill.

The European Commission previously granted the U.K. adequacy status, allowing personal data to pass freely between the U.K. and the EU until June 27, 2025. The Commission will likely begin reviewing that decision in 2024, at which time it will decide whether to extend the U.K.’s adequacy for a further period of up to four years; absent an extension, the current adequacy status will expire on June 27, 2025. U.K. businesses that formerly operated easily across the EU have understandably watched this space anxiously.

However, U.K. Secretary of State for Digital, Culture, Media, and Sport Michelle Donelan announced in a speech earlier this month that the government intends to replace the U.K. GDPR with another data protection law with less “red tape.” This announcement, which did not come with firm policy outlines, puts U.K. businesses back at square one in planning the technical and procedural updates needed to ensure seamless data transfers between the U.K. and the EU when the current adequacy status eventually expires.