The Center for Internet Security (CIS) announced last week that it has launched the Malicious Domain Blocking and Reporting (MDBR) service to help U.S.-based private hospitals defend against ransomware and other cyber-attacks at no cost. CIS, a not-for-profit entity, “is fully funding this for private hospitals at no cost, and with no strings attached, because it’s the right thing to do, and no one else is doing it at scale.” According to the announcement, the product is designed as a ransomware protection service and a “no-cost cyber defense for U.S. hospitals.”

CIS teamed up with Akamai to offer Akamai’s Enterprise Threat Protector software to proactively identify, block, and mitigate targeted ransomware threats. The service was, and remains, available to public hospitals and health departments through the Multi-State Information Sharing and Analysis Center (MS-ISAC); according to CIS, over 1,000 government entities have used the product through MS-ISAC, and to date MDBR has blocked almost 750 million requests for access to malicious domains. When an organization uses MDBR, the software cross-checks each domain lookup against its database of known and suspected malicious domains, so that “attempts to access known malicious domains associated with malware, phishing, ransomware, and other cyber threats will be blocked and logged.” The logged data are then analyzed, aggregated reporting is made available for the benefit of the hospital community, and remediation assistance is provided as appropriate.
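For readers curious about the mechanics, protective DNS services of this kind sit between an organization and the public internet and either answer or refuse domain lookups. Below is a minimal, purely illustrative Python sketch of that block-and-log flow; the domains, blocklist, and function names are hypothetical and are not drawn from Akamai’s product:

```python
# Illustrative sketch of protective DNS filtering, NOT Akamai's actual API.
# The blocklist entries and domains below are hypothetical examples.

BLOCKLIST = {"malware-example.com", "phish-example.net"}  # known/suspected bad domains

request_log = []  # logged lookups feed the aggregated reporting described above

def handle_lookup(domain: str) -> str:
    """Block and log lookups of known malicious domains; resolve the rest."""
    verdict = "BLOCKED" if domain in BLOCKLIST else "RESOLVED"
    request_log.append((domain, verdict))
    return verdict

print(handle_lookup("phish-example.net"))  # BLOCKED
print(handle_lookup("example.org"))        # RESOLVED
```

The real service operates at the scale of Akamai’s DNS infrastructure, but the decision at each lookup is conceptually this simple: match, block, log, report.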

CIS is now offering the service for free not only to public entities and governmental agencies, but also to private hospitals, multi-hospital systems, integrated health systems, post-acute facilities, and specialty hospitals. Sounds like a great opportunity for hospitals and facilities to add another tool to their toolboxes to combat ransomware and other cyber-attacks. For more information and to sign up, the CIS website is available here.

Renown Health, P.C. (Renown), a non-profit health system in Nevada, has settled an enforcement action with the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services over a potential violation of patients’ access rights under the Health Insurance Portability and Accountability Act of 1996 (HIPAA). The action was brought under OCR’s HIPAA Right of Access Initiative, and the Renown settlement is the 15th under that initiative.

Renown paid $75,000 and agreed to:

  • Develop and maintain written access policies and procedures to comply with HIPAA
  • Distribute updated policies and procedures related to the right-of-access to all workforce members
  • Train workforce members on the right-of-access
  • Revise its Notice of Privacy Practices to reflect the steps that patients need to take to access their protected health information (PHI), including billing records

OCR alleged that Renown failed to respond in a timely manner, as HIPAA requires, to a patient’s request that an electronic copy of her PHI, including billing records, be sent to a third party. OCR’s investigation determined that this failure to provide timely access was a potential violation of Renown’s obligations to the patient. As a result of the investigation, Renown also provided access to all of the requested records.

OCR Acting Director Robinsue Frohboese said, “Access to one’s health records is an essential HIPAA right and health care providers have a legal obligation to their patients to provide access to their health information on a timely basis.” OCR will certainly continue to pursue these types of violations throughout 2021; it announced the initiative in September 2019 to support patients’ right under HIPAA to timely access to their PHI at a reasonable cost.

To view the corrective action plan that Renown has agreed to, click here.

This week, Consumer Reports published a Model State Privacy Act. The consumer advocacy organization proposed the model legislation “to ensure that companies are required to honor consumers’ privacy.” The model legislation is similar to the California Consumer Privacy Act, but seeks to protect consumer privacy rights “by default.” Additional provisions include a broad prohibition on secondary data sharing, an opt-out of first-party advertising, and a private right of action in addition to enforcement by state Attorneys General.

While the introduction of a model privacy law is an interesting development, we also continue to track pending consumer privacy legislation in multiple states. Connecticut, Illinois, Massachusetts, Minnesota, New York, and Utah have all recently seen new privacy bills introduced, and as legislative sessions move forward into 2021, we expect even more states to follow suit.

We will continue to provide updates as these bills move forward.

PAL-V, maker of the first flying car to be allowed on the road in Europe, is now also the first to complete a full certification basis with the European Union Aviation Safety Agency (EASA). The PAL-V Liberty (the flying car) went through 10 years of testing and is now in the final phase of compliance demonstration before becoming available to customers.

PAL-V CEO, Robert Dingemanse, said, “Although we are experienced entrepreneurs, we learned that in aviation everything is exponentially stricter. Next to the aircraft, all aspects of the organization, including suppliers and maintenance parties must be certified.”

In 2009, PAL-V began working with EASA to amend the Certification Specifications for Small Rotorcraft (CS-27) as a starting point for certification of its flying car. Together they ultimately amended the complete list of more than 1,500 criteria to make it applicable to the PAL-V, and the final version of these criteria was published last week, after more than 10 years of analysis, test data, flight tests, and drive tests.

This EASA certificate is valid in Europe AND is also accepted in about 80 percent of the world’s market, including the United States and China.

WhatsApp started notifying its 2 billion users last month about an update to its privacy policy. Most of its users probably didn’t look at the details, and simply clicked “I agree” when the notice popped up on their phones. (To use the app, one must click “I agree.”) There has been a backlash from privacy advocates, which is worth noting here in case you missed that news. WhatsApp has delayed the implementation of the terms of the new privacy policy for a few months so it can address those concerns.

If you are a WhatsApp user and you clicked “I agree” to that pop-up without reading it, here is a synopsis (not comprehensive) of what you agreed to, and why it does not protect your privacy:

  • WhatsApp can share all data it collects about you with the entire Facebook network, even if you don’t have an account with other parts of the network (e.g., Instagram).
  • If you don’t accept the new terms (reported to take effect in May), you will not have full functionality of the app.
  • WhatsApp is monetizing the data it collects from you and asks for your consent to use your data to make money.
  • WhatsApp will be providing more information about the changes to the privacy policy through a banner in WhatsApp; this writer thinks you may wish to read the banner and the privacy policy a bit more carefully before you agree.
  • Although your conversations in WhatsApp are private and encrypted, WhatsApp has access to your usage data and your unique identifier, which may be linked to your identity. This is one of the reasons WhatsApp is asking you to accept the new terms.
  • Facebook is monetizing your data and increasing its revenue by using your usage of WhatsApp to push targeted ads to you on Facebook and Instagram.

The changes to the privacy policy are not really designed to protect your privacy, but rather to obtain your consent to sell your information so businesses can sell things to you. It’s not really a “privacy” policy; it is a “let me monetize your data” policy.

Some users have declared that they will not agree to the new “privacy” policy and are defecting to Signal, which, as a privacy pro, I prefer for messaging. WhatsApp users may wish to take a look at Signal’s privacy policy, which can be accessed here, and compare it to WhatsApp’s.

Becker’s Health IT reports that two batches of sensitive information belonging to Chatham County, N.C. residents have been posted online, on both the dark web and the open (“light”) web, by the ransomware group DoppelPaymer, and that the files have been accessed more than 30,000 times.

DoppelPaymer obtained the information during a cyber-attack on the County’s systems on October 28, 2020. The group then uploaded the files on November 4, 2020, and again in late January. The posting of information like this usually happens when a victim of ransomware refuses to pay the ransom demand.

The information contained in the files included “medical evaluations of children from neglect cases, personnel records of some employees and documents related to ongoing investigations with the Chatham County Sheriff’s office.”

Chatham County is working to determine its obligations “to ensure we respond in the most appropriate manner possible.”

Last week, in Tsao v. Captiva MVP Restaurant Partners, LLC (Captiva), the U.S. Court of Appeals for the 11th Circuit held that data breach claims based on an increased risk of future identity theft and the costs of potential mitigation efforts, WITHOUT any evidence of actual data misuse or harm, do not satisfy Article III standing. With this decision, the 11th Circuit joins several other circuit courts in holding that a plaintiff must establish evidence of harm to satisfy standing requirements. To date, the 1st, 2nd, 3rd, 4th, and 8th Circuits have also held that plaintiffs may not establish Article III injury-in-fact based on an increased risk of harm.

In the Captiva case, the plaintiff’s payment card data were not actually misused following the data breach, and therefore the plaintiff did not present an injury-in-fact sufficient to establish standing. Tsao’s complaint alleged only (1) a future risk of identity theft, because hackers MIGHT have accessed his payment card information, and (2) losses from mitigation efforts, such as cancelling his potentially affected credit card account and losing the benefits tied to that account, such as reward points. The court held that plaintiffs may not manufacture standing in this manner.

The decision can be found at 2021 WL 381948 (11th Cir. Feb. 4, 2021).

The state of Virginia might be the next to enact a privacy law. Senate Bill No. 1392 recently passed the Senate and is likely on its way to Governor Ralph Northam’s desk. The bill would add the Consumer Data Protection Act to the Virginia Code and includes definitions of biometric data, precise geolocation data, profiling, sensitive data, and targeted advertising. The bill’s effective date is January 1, 2023.

The bill would apply to persons who conduct business in the Commonwealth or produce products or services that are targeted to residents of the Commonwealth, and that (i) during a calendar year, control or process personal data of at least 100,000 consumers, or (ii) control or process personal data of at least 25,000 consumers and derive over 50 percent of gross revenue from the sale of personal data. The law would not apply to any state or local government agency, to financial institutions subject to the Gramm-Leach-Bliley Act, or to covered entities or business associates governed by the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health Act (HITECH Act).
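To make the two applicability thresholds concrete, here is a hypothetical Python sketch; the function and parameter names are illustrative simplifications, not statutory language, and the statute itself controls:

```python
# Hypothetical simplification of the bill's applicability thresholds as
# described above; consult the statute for the operative language.

def bill_applies(consumers_processed: int, data_sale_revenue_share: float) -> bool:
    """True if either threshold described above is met."""
    return (consumers_processed >= 100_000 or
            (consumers_processed >= 25_000 and data_sale_revenue_share > 0.50))

print(bill_applies(120_000, 0.00))  # True: meets the 100,000-consumer prong
print(bill_applies(30_000, 0.60))   # True: 25,000 consumers plus >50% of revenue
print(bill_applies(30_000, 0.10))   # False: neither prong satisfied
```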

Consumer rights include the following:

  1. The right to know whether or not a controller is processing the consumer’s personal data and the right to access such personal data;
  2. The right to correct inaccuracies in the consumer’s personal data;
  3. The right to delete personal data provided by or obtained about the consumer;
  4. The right to data portability; and
  5. The right to opt out of the processing of the personal data for purposes of (i) targeted advertising, (ii) the sale of personal data, or (iii) profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

The bill follows the data controller/data processor framework and organizes the rights and responsibilities of each party according to those roles. There is no private right of action in the bill; instead, the Attorney General would have exclusive authority to enforce violations in the name of the Commonwealth or on behalf of individual persons residing in the Commonwealth.

The Office for Civil Rights (OCR) recently announced another settlement under its Right of Access Initiative. This settlement, the sixteenth such agreement under the Initiative (and one of the most interesting), involves San Diego-based Sharp HealthCare, doing business as Sharp Rees-Stealy Medical Centers (SRMC). According to the settlement, OCR received a complaint on June 11, 2019, stating that SRMC “failed to timely respond” to a patient’s request to electronically access his medical records. OCR provided technical assistance to SRMC and closed the case.

OCR subsequently received a second, similar complaint stating that the requested medical records still had not been provided as of August 19, 2019. OCR notes in the Resolution Agreement with SRMC that SRMC did not provide access to the requested records until October 15, 2019.

In settling with SRMC, OCR stated that its investigation found that SRMC failed to respond in a timely manner to the third party’s request for the records. SRMC agreed to pay OCR $70,000 to settle the case and to enter into a standard Corrective Action Plan.

The reason this settlement is so interesting is that it is apparent from the Resolution Agreement that the request to access the patient’s medical records did not come directly from the patient, but from a third party. Covered entities are often faced with requests for medical records from third parties on behalf of patients; these third parties could be family members, executors of estates, guardians, administrators, parents, or lawyers. Under HIPAA, covered entities are not permitted simply to hand over medical records to individuals other than the patient, and requests from third parties can be tricky for many reasons. In general, covered entities are prohibited from providing a patient’s medical records without the patient’s specific authorization. Although the detailed background facts of this settlement are not known, reading between the lines it looks like the request came from the patient’s attorney.

Covered entities often receive requests for medical records from attorneys, but those requests often are not accompanied by HIPAA-compliant authorization forms that would enable the covered entity to provide the records to the attorney. Although as attorneys we are used to obtaining documents on behalf of the clients we represent, HIPAA does not allow covered entities to provide medical records to attorneys without a valid HIPAA authorization form. If an attorney provides the covered entity with a valid authorization form, the request is no different from a request made by the patient, and the covered entity must provide access to the records under HIPAA and OCR’s Right of Access Initiative. The lesson here is to treat a valid request from an attorney no differently than a request from the patient, and to provide access to the records within the time frame outlined in HIPAA. Otherwise, the attorney may file a complaint with OCR.
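As a rough illustration of that triage (and only an illustration; the function and categories below are my own simplification, not OCR guidance), the decision logic looks something like this:

```python
# Hypothetical simplification of the records-request triage described above.
# Real HIPAA compliance involves identity verification and other steps not modeled here.

def triage_records_request(requester_is_patient: bool,
                           has_valid_hipaa_authorization: bool) -> str:
    """Decide how to handle a medical-records request."""
    if requester_is_patient or has_valid_hipaa_authorization:
        # A valid authorization makes a third-party (e.g., attorney) request
        # equivalent to a request from the patient.
        return "provide access within the HIPAA time frame"
    return "do not release records; request a valid HIPAA authorization"

print(triage_records_request(False, True))   # attorney with valid authorization
print(triage_records_request(False, False))  # attorney without authorization
```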

A recent article worth reading is “This is how we lost control of our faces,” written by Karen Hao in the February 5, 2021 edition of MIT Technology Review.

The article outlines a study recently published by Deborah Raji and Genevieve Fried, titled “About Face: A Survey of Facial Recognition Evaluation,” which includes a survey of over 100 face datasets compiled “between 1976 to 2019 of 145 million images of over 17 million subjects….” It is reportedly the largest study of facial recognition technology ever conducted.

Hao posits that the study “shows just how much this enterprise has eroded our privacy. It hasn’t just fueled an increasingly powerful tool of surveillance. The latest generation of deep-learning-based facial recognition has completely disrupted our norms of consent.”

There are far too many fascinating things about Hao’s synopsis and the study itself to summarize in a blog post. Both are worth reading and contemplating as we assess facial recognition technology’s impact on our own privacy, and as we decide how we want different facets of society to respect that privacy when using facial recognition technology. The study analyzes the development and use of facial recognition technology over the past 30 years, and it offers relevant insight into how we can shape parameters around the use of facial recognition over the next 30 years and beyond.

As Raji and Fried say, “Facial recognition technologies pose complex ethical and technical challenges. Neglecting to unpack this complexity – to measure it, analyze it and then articulate it to others – is a disservice to those, including ourselves, who are most impacted by its careless deployment.”