Massachusetts Ballot Question Poses Privacy Concerns

Ballot Question 1 in Massachusetts, if passed in November, would require car manufacturers that sell cars equipped with telematics systems (systems that combine GPS with on-board diagnostics to record and map where a car is, how fast it is traveling, and other operating data) to install a standardized, open data platform beginning with model year 2022. Such a system would allow car owners to access their telematics data through a mobile app and to consent to independent repair facilities accessing those data and sending commands to the system for repair, maintenance, and diagnostic testing.

An open data platform is primarily designed to help big-data developers create applications on a common platform. It provides a baseline model for building applications and services that are interoperable across different platforms. While such a platform would open the data to many different users, it may also present security risks to those providing the information: from loss of confidentiality to a higher potential for compromise of personal information, releasing data inherently puts those data at risk.
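
To make the concept concrete, the sketch below models what an owner-consented, standardized telematics interface could look like. It is a minimal, hypothetical illustration: the class names, data fields, and 30-day consent window are assumptions for this example, not anything specified by Ballot Question 1.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical sketch of a standardized, owner-consented telematics platform.
# Class names, data fields, and the 30-day consent window are illustrative
# assumptions, not requirements taken from Ballot Question 1.

@dataclass
class TelematicsRecord:
    vin: str
    timestamp: datetime
    latitude: float
    longitude: float
    speed_kph: float
    diagnostic_codes: list[str] = field(default_factory=list)

@dataclass
class ConsentGrant:
    vin: str
    repair_facility_id: str
    expires: datetime

class OpenDataPlatform:
    def __init__(self) -> None:
        self._records: dict[str, list[TelematicsRecord]] = {}
        self._grants: list[ConsentGrant] = []

    def record(self, rec: TelematicsRecord) -> None:
        self._records.setdefault(rec.vin, []).append(rec)

    def grant_access(self, vin: str, facility_id: str, days: int = 30) -> None:
        # The owner consents (e.g., via a mobile app) to a time-limited grant.
        self._grants.append(
            ConsentGrant(vin, facility_id, datetime.now() + timedelta(days=days))
        )

    def fetch_for_repair(self, vin: str, facility_id: str) -> list[TelematicsRecord]:
        # Data is released only while an unexpired consent grant exists.
        active = any(
            g.vin == vin and g.repair_facility_id == facility_id and g.expires > datetime.now()
            for g in self._grants
        )
        if not active:
            raise PermissionError("No active owner consent for this facility")
        return self._records.get(vin, [])
```

The design point the ballot question turns on is visible in `fetch_for_repair`: an independent facility sees nothing unless the owner has an active, time-limited grant.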

Currently, Massachusetts’ Right to Repair law (signed into law in 2013) exempts telematics systems from the data that must be made accessible to car owners and independent repair facilities. This means that a car’s telematics system may only be accessed by the brand manufacturer, which may limit a car owner’s ability to choose where the system is updated or repaired.

A “yes” to the ballot question “supports requiring manufacturers that sell vehicles with telematics systems in Massachusetts to equip them with a standardized open data platform beginning with model year 2022 that vehicle owners and independent repair facilities may access to retrieve mechanical data and run diagnostics through a mobile-based application,” while a “no” opposes this initiative.

Tommy Hickey, director of the Massachusetts Right to Repair Coalition, said, “This is really a fight for Massachusetts consumers. Without this information, people may lose the choice to bring their car to an independent repair shop.” On the other side, the Coalition for Safe and Secure Data’s spokesman, Conor Yunits, said, “This ballot question will create easy opportunities for strangers, hackers and criminals to access consumer vehicles and personal driving data–including real-time location. It will put people at risk, without doing anything to improve the consumer experience.” Both sides frame the fight as one on behalf of consumers.

Size Doesn’t Matter for OCR Enforcement Actions

Small health care organizations may think they are under the radar of the Office for Civil Rights (OCR), but a settlement the OCR agreed to last week should disabuse small health care providers of that notion.

On July 23, 2020, the OCR issued a press release outlining the terms of its settlement with Metropolitan Community Health Services (Metro), doing business as Agape Health Services. Metro agreed to pay $25,000 to the OCR and to adopt a corrective action plan, including two years of monitoring, to settle an enforcement action OCR initiated against Metro.

The controversy began when Metro self-reported a data breach on June 9, 2011 pursuant to the HIPAA breach notification regulations after it discovered an “impermissible disclosure of protected health information to an unknown email account” that affected 1,263 patients.

OCR commenced an investigation and found “longstanding, systematic noncompliance with the HIPAA Security Rule. Specifically, Metro failed to conduct any risk analyses, failed to implement any HIPAA Security rule policies and procedures, and neglected to provide workforce members with security awareness training until 2016.”

As with all settlements that the OCR enters into with regulated entities, lessons can be learned from this one: review when a security risk assessment was last performed; review the business’s HIPAA compliance program, including policies and procedures that comply with the Security Rule; and provide security awareness training to the workforce.

DJI Responds to Recent Cybersecurity Report on App Vulnerabilities

This week, China-based DJI, the industry’s leading drone manufacturer, issued a public statement regarding recent reports released by cybersecurity researchers at Synacktiv and GRIMM about the security of its drones’ control app.

In two reports, the researchers claimed that an app on Google’s Android operating system that powers DJI drones collects large amounts of personal information that could be exploited by the Chinese government. The researchers claim to have discovered typical software concerns, but no specific evidence that those potential vulnerabilities have been exploited. This is not the first time DJI has been accused of lax security safeguards.

DJI responded to these claims, saying that its goal is to help ensure that its comprehensive airspace safety measures are applied consistently across its control apps. However, because recreational pilots often want to share the photos and video they take with the drone with family and friends over social media, it is up to the pilot to review the security of those social media sites. Further, DJI said, “When our systems detect that a DJI app is not the official version – for example, if it has been modified to remove critical flight safety features like geofencing or altitude restrictions – we notify the user and require them to download the most recent official version of the app from our website.”
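
DJI has not published how its version check works, but as a generic illustration, this kind of integrity check is often implemented by comparing the installed package’s hash against a manifest of official builds. The sketch below shows that general pattern with placeholder values; it is not a description of DJI’s actual implementation.

```python
import hashlib

# Generic sketch of an app integrity/version check; the manifest values are
# placeholders, and this is not a description of DJI's actual implementation.
OFFICIAL_BUILDS = {
    "4.3.37": "<sha256-of-official-package>",  # hypothetical entry
}

def is_official_build(version: str, package_bytes: bytes) -> bool:
    expected = OFFICIAL_BUILDS.get(version)
    return expected is not None and hashlib.sha256(package_bytes).hexdigest() == expected

def enforce_official_app(version: str, package_bytes: bytes) -> None:
    if not is_official_build(version, package_bytes):
        # In the pattern DJI describes, the user would be prompted to download
        # the current official version before continuing to fly.
        raise RuntimeError("Unofficial or modified app detected; update required")
```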

The report also claimed that one of DJI’s drones could restart itself without any input from the pilot. DJI responded stating, “[Our] DJI GO 4 is not able to restart itself without input from the user, and we are investigating why these researchers claim it did so. We have not been able to replicate this behavior in our tests so far.”

DJI has not yet confirmed the potential vulnerabilities identified in the reports, but it says that it has proactively offered security researchers payments of up to $30,000 (through its Bug Bounty Program) to assist in identifying and disclosing security issues with the control apps.

DJI also stated that its drone products designed for government agencies do not transmit data to DJI and are compatible only with a non-commercially available version of the DJI Pilot app. More specifically, “The software for these drones is only updated via an offline process, meaning this report is irrelevant to drones intended for sensitive government use. A recent security report from Booz Allen Hamilton audited these systems and found no evidence that the data or information collected by these drones is being transmitted to DJI, China, or any other unexpected party.”

All in all, DJI has been a part of the ongoing call for a set of industry standards for drone data security. However, until those standards have been set, we are sure to continue to see alleged flaws and risks to data collected and transmitted via drone.

Robots for Package and Food Delivery Invade the Sidewalks

Starship Technologies (Starship) debuted its food delivery robots at George Mason University in January 2019. Starship did minimal marketing for the new package and food delivery robots, but students found them, found the app, and started requesting deliveries. It took off from there.

Now, more than a year later, students dress the robots up for events and help them get through sometimes crowded sidewalks. The robots are now treated as simply more pedestrians. While a small percentage of people remain suspicious of the robots, the majority view them as a way to make their lives easier.

Following recent rollouts of delivery robot fleets at Purdue University and the University of Wisconsin, Starship plans to deploy robots on more than 100 university campuses over the next two years, and several new job postings have appeared seeking operators and support staff in college towns such as Austin, Texas, and Tuscaloosa, Alabama.

Package and food delivery with autonomous robots may be a relatively small market in the near term, but according to industry experts, it’s already attracting a large number of competitors using a wide variety of systems (e.g., FedEx’s SameDay delivery robot). The delivery robot market is expected to grow from $11.9 million in 2018 to $34 million in 2024. Perhaps your next pizza will be delivered by a Starship robot, if you happen to be strolling around a college campus any time soon.
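
For context, the cited market figures imply a compound annual growth rate of roughly 19 percent; the short calculation below shows the arithmetic.

```python
# Implied compound annual growth rate (CAGR) from the cited market figures:
# $11.9 million in 2018 growing to $34 million in 2024 (six years).
start, end, years = 11.9, 34.0, 2024 - 2018
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 19.1% per year
```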

Privacy Tip #246 – Spam, Spam, Spam: Be Extra Cautious

Security researchers are warning companies to be aware of a new resurgence of the Emotet botnet that has been reactivated after a hiatus of five months.

According to the researchers, the Emotet malware steals information and has been used to distribute the banking Trojan Trickbot. Attackers using the Emotet botnet send simple, personalized emails, often with a subject line beginning with “RE:”. The emails often contain fake invoices, purchase orders, shipping notifications or receipts, and ask the recipient to click on a link or open an attachment. When the link or attachment is opened, the Emotet malware is activated; it hijacks the email account and uses it to forward spam containing malicious links and attachments from the legitimate account to that account’s contacts. The recipients, believing the email is coming from a trusted source, click on the link or attachment, and the malware exponentially infects other email accounts and systems.
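
To illustrate the lure pattern the researchers describe, a mail gateway could flag messages that look like replies (“RE:”) to threads that do not exist and that carry a link or Office attachment. The sketch below is a simplified, hypothetical heuristic for that pattern only; it is not a substitute for real anti-malware tooling.

```python
from email.message import EmailMessage

# Simplified, hypothetical heuristic for the Emotet-style lure described above:
# a reply-looking subject with no matching thread, plus a link or Office attachment.
SUSPECT_EXTENSIONS = (".doc", ".docm", ".xls", ".xlsm", ".zip")

def looks_like_emotet_lure(msg: EmailMessage, known_thread: bool) -> bool:
    subject = (msg.get("Subject") or "").strip().lower()
    fake_reply = subject.startswith("re:") and not known_thread

    has_suspect_attachment = any(
        (part.get_filename() or "").lower().endswith(SUSPECT_EXTENSIONS)
        for part in msg.iter_attachments()
    )

    body = msg.get_body(preferencelist=("plain", "html"))
    has_link = body is not None and "http" in body.get_content().lower()

    return fake_reply and (has_suspect_attachment or has_link)
```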

Emotet is known to spread to other devices on the network and those infected devices are then added to the botnet. As of last week, security researchers confirmed that over 250,000 emails containing Emotet are being sent every day.

According to the researchers, if Emotet is detected, it is important to respond as soon as possible by isolating the infected device and removing the malware. Protection against infection centers on employee awareness: asking employees to be very cautious about opening any Word documents or Excel spreadsheets, even when they appear to come from a trusted source.

We all have noticed an increase in email traffic and spam during the pandemic. Protecting devices and networks has been challenging for security personnel with a remote workforce; educating a remote workforce about botnets is even more challenging. However, keeping your employees vigilant about emails and attachments, and engaging them as part of your first line of defense, is critically important to help reduce the spread of Emotet and other malware. As employees, we need to be aware of botnet attacks such as Emotet so we can be responsible and valuable contributors to our organization’s data security.

Connecticut Insurance Department Reminds Licensees to Comply with Data Security Law

On July 20, 2020, the Connecticut Insurance Department issued a bulletin to licensees reminding them that the Connecticut Insurance Data Security Law (“Act”) becomes effective on October 1, 2020 and providing guidance on compliance.

The Act requires “all persons who are licensed, authorized to operate or registered, or required to be licensed, authorized or registered pursuant to the insurance laws of Connecticut” to “develop, implement and maintain a comprehensive written information security program (“ISP”) that complies with” the Act “not later than October 1, 2020.” The Act generally applies to domestic insurers and health care centers, with some exemptions.

The Act requires the licensee’s ISP to be based upon a risk assessment “and contain safeguards for the protection of nonpublic information and the licensee’s information systems commensurate with the size and complexity of the licensee, its activities, including use of third-party service providers, and the sensitivity of the nonpublic information used by the licensee or in its possession, custody or control.”

The bulletin reminds licensees that, unless exempted, a licensee must perform due diligence on its third-party service providers and require those providers to implement appropriate administrative, technical and physical measures to protect the information the licensee discloses to them. Although not specified in the bulletin, licensees may wish to consider documenting such measures through security questionnaires and written contractual obligations.

All licensees (except those exempt from the law) must provide written confirmation to the Insurance Commissioner by February 15, 2021, and annually thereafter, certifying that they are in compliance with the Act. Documentation of plans for material improvements, updates or remedial efforts must be maintained by the licensee and be “available for inspection by the Insurance Department.”

The bulletin outlines in detail the obligations of licensees following a cybersecurity attack or event. Similar to the New York Department of Financial Services Cybersecurity Regulations, the Act requires licensees to notify the Insurance Commissioner “as promptly as possible, but in no event later than three (3) business days after the date of the cybersecurity event” if the licensee is domiciled in the State of Connecticut, or if the licensee believes that the event involves more than 250 residents of the State of Connecticut and notification to individuals is required by state or federal law, or the licensee believes that the event has “a reasonable likelihood of materially harming any consumer residing in Connecticut….” Notification will be made through the Insurance Commissioner’s website, which will be available by October 1, 2020.
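
Because the three-day clock runs in business days rather than calendar days, compliance teams sometimes script the deadline calculation. The sketch below is a minimal example that skips weekends only; it does not account for holidays and is not a reading of how the Act counts days.

```python
from datetime import date, timedelta

def notification_deadline(event_date: date, business_days: int = 3) -> date:
    """Minimal sketch: add N business days, skipping weekends only.
    Holidays and the Act's precise counting rules are not addressed here."""
    d = event_date
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return d

# Example: an event occurring on Thursday, October 1, 2020
print(notification_deadline(date(2020, 10, 1)))  # 2020-10-06 (the following Tuesday)
```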

The bulletin reminds licensees that the Department has the power to examine and investigate compliance with the Act and to impose penalties for noncompliance. Nonetheless, the bulletin states that because of COVID-19, the Department “intends to exercise appropriate discretion in evaluating the facts and circumstances of a licensee’s compliance…and in the imposition of sanctions for noncompliance.” The bulletin further states that the Department will not impose sanctions against a licensee that fails to file its annual certification of compliance by February 15, 2021, as long as the certification is filed by April 15, 2021. However, a licensee that is unable to file the certification on a timely basis due to COVID-19 “is urged to contact the Insurance Department Market Conduct Division” to explain why it is unable to file by the deadline.

Licensees may wish to consider prioritizing compliance with the Act now, developing and implementing their ISPs to be ready for both the October 1, 2020 compliance deadline and the February 15, 2021 certification deadline.

Fall-Out from Blackbaud Ransomware Attack

As a follow-up to last week’s post on the importance of due diligence regarding high-risk vendors’ security practices, Blackbaud, a global company providing financial and fundraising technology to not-for-profit entities, notified its customers late last week that it was the victim of a ransomware attack in mid-May. Blackbaud offers a number of products to its customers, including research tools that aggregate publicly available information on individuals’ wealth so that not-for-profits can assess donors’ giving capacity.

Blackbaud admitted that the ransomware attackers did get access to donor data and were able to remove a copy of a subset of data from Blackbaud’s hosted environment. It has further stated that it paid an undisclosed amount to the ransomware attackers and received a certificate of destruction from the attackers. Blackbaud has stated that no sensitive information, including donors’ Social Security numbers, credit card information or bank account information, was accessed or exfiltrated. According to a company spokesman, “[W]hile this sophisticated ransomware attack happened, we were able to shut it down and have no reason to believe this will result in any public disclosure of any of our customers’ data.”

Nonetheless, multitudes of not-for-profits have received notification of the incident and are struggling with how to respond. The responses have been anything but uniform. In addition, not-for-profit health care entities may have different legal requirements than other not-for-profits because of the Health Insurance Portability and Accountability Act (HIPAA).

The incident illustrated several things to consider:

  • Do you have a vendor management program in place?
  • Have you vetted or completed due diligence on your vendors’ security practices?
  • Do you have up-to-date and accurate contracts with your vendors, including a Business Associate Agreement, as applicable?
  • Do you have contractual language in place with your vendors concerning appropriate data security measures to protect your data, what happens following a security incident, notification and indemnification?
  • What are your reporting/notification obligations if one of your vendors experiences a data security incident?
  • Who can help navigate these questions?

Mapping the vendors that have access to your employees’ or customers’ data is the first step in a vendor management program. This incident is a reminder that vendors are getting attacked just as your organization is. Your company data is your responsibility, even if it is in the possession of a vendor, so prioritizing your vendor management program may be worth consideration.
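
One minimal way to start that mapping is a simple inventory of each vendor, the data it touches, and the status of diligence and contract terms. The structure below is a hypothetical sketch; the field names and checklist items are assumptions drawn from the questions above, not a prescribed format.

```python
from dataclasses import dataclass

# Hypothetical vendor-inventory record for a basic vendor management program.
# Field names and checklist items are illustrative assumptions only.
@dataclass
class VendorRecord:
    name: str
    data_categories: list[str]             # e.g., ["donor PII", "PHI"]
    risk_tier: str                         # e.g., "high", "medium", "low"
    security_questionnaire_done: bool = False
    baa_in_place: bool = False             # Business Associate Agreement, if applicable
    breach_notification_clause: bool = False
    indemnification_clause: bool = False

def open_items(vendor: VendorRecord) -> list[str]:
    """Return the diligence and contract items still outstanding for a vendor."""
    checks = {
        "security questionnaire": vendor.security_questionnaire_done,
        "Business Associate Agreement": vendor.baa_in_place,
        "breach notification clause": vendor.breach_notification_clause,
        "indemnification clause": vendor.indemnification_clause,
    }
    return [item for item, done in checks.items() if not done]
```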

FAA Waivers Can Help During COVID-19 Pandemic

The Federal Aviation Administration (FAA) has been granting drone flight waivers for certain restricted flights to help during the COVID-19 pandemic, but the FAA says that it is unlikely the waivers will extend beyond current stay-at-home restrictions. Thus far, the FAA has been granting waivers to companies using drones to deliver food and supplies in certain parts of the country, enabling people to stay home while still getting the things they need. These approvals have been granted under Part 107 (the Small Unmanned Aerial Systems (UAS) Rule), Special Government Interest Approval (e.g., a public safety aircraft), Part 137 Certification (i.e., agricultural aircraft operations) and Part 135 Certification (i.e., package delivery and beyond visual line of sight). As we look ahead, whether the FAA will continue to grant waivers as readily as it does in the current landscape is unknown.

Automated Vehicles Assist with Contactless Delivery During COVID-19 Pandemic

Several autonomous vehicle developers stopped their on-road testing to keep staff at home during the COVID-19 pandemic, but others pivoted to COVID-19 relief, not only to be useful but to gain experience. Some companies and developers in this space have taken this opportunity to deploy self-driving cars and driverless bots to help deliver goods both on the frontlines and to residents during the stay-at-home orders across the country, including pharmaceutical deliveries for those affected by the virus and quarantined at home. Additionally, many of these cars and bots have been used to deliver food, water, supplies and equipment to staff at temporary health care facilities, which has assisted in reducing contact among workers on the front lines. Others have used driverless vehicles to deliver food from food banks to senior centers and groceries to those individuals who are at higher risk. In this time of need, the autonomous driving vehicle industry has leveraged its existing capabilities to assist the community. Now, the industry, with more experience under its belt and real-life testing of its capabilities, may be able to make some headway in receiving more approvals and exemptions from the Department of Transportation in the future.

What Do Electric and Self-Driving Cars Mean for Our Electrical Grid?

The National Highway Traffic Safety Administration (NHTSA) said in a recent report, “The development of advanced automated vehicle safety technologies, including fully self-driving cars, may prove to be the greatest personal transportation revolution since the popularization of the personal automobile nearly a century ago.” However, the automobile and transportation industry is still struggling with how this revolution will take place and what it means for one of its key components – the electrical grid. Vehicle-grid integration will present an opportunity to plan and operate infrastructure more efficiently and broadly.

To address this big issue, leaders in the industry formed the Vehicle Grid Innovation Council (VGIC). One of the items for discussion is allowing more rate flexibility for electric vehicles. For example, electric vehicle owners could pay less if they charge their vehicles during low power-usage times. The group also will discuss automated vehicles using the grid; automated vehicles present an even greater challenge than manned ones. Manned electric vehicles have a driver to plug them in, but autonomous ones will not, which means wireless charging. While wireless charging has been in the works for several years, the system is far from perfected. Currently, the prototyped system can charge an all-electric vehicle in one to two hours, while a plug-in hybrid could be charged in less than an hour. A wireless charging system would include a plate that the vehicle drives over to charge, though, to increase safety, autonomous vehicles may be better served by overhead charging plates. On the flip side, some companies are working on charging robots that would seek out electric vehicles and charge them before moving on to the next “customer.” For the moment, all we know is that these new technologies will mean a big change in the way we transport goods and people, and the way we operate our electrical grid.
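
To make the rate-flexibility idea concrete, a time-of-use tariff simply prices energy by when it is drawn. The sketch below uses hypothetical off-peak and peak rates purely for illustration; actual EV tariffs vary by utility and are not specified by VGIC.

```python
from datetime import time

# Hypothetical time-of-use rates for illustration only; actual EV tariffs vary
# by utility and are not specified in the VGIC discussions described above.
OFF_PEAK_RATE = 0.12  # $/kWh, e.g., overnight
PEAK_RATE = 0.28      # $/kWh, e.g., late afternoon and evening

def charging_cost(kwh: float, start: time) -> float:
    """Price an entire charging session at the rate in effect at its start.
    A real tariff would split a session that crosses the peak boundary."""
    off_peak = start >= time(22, 0) or start < time(6, 0)
    rate = OFF_PEAK_RATE if off_peak else PEAK_RATE
    return round(kwh * rate, 2)

# Example: the same 60 kWh charge overnight versus during peak hours.
print(charging_cost(60, time(23, 30)))  # 7.2  (off-peak)
print(charging_cost(60, time(18, 0)))   # 16.8 (peak)
```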
