The Kids Online Safety Act (KOSA) of 2023 is making its way through Congress with bipartisan support. According to the bill’s sponsors, Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), KOSA would require social media companies to develop enhanced parental controls for online platforms.

Additionally, and much more controversially, KOSA creates a duty for online platforms to prevent and mitigate specific dangers to minors, including “promotion of suicide, eating disorders, substance abuse, sexual exploitation,” advertisements for certain illegal or age-restricted products, and other matters. State Attorneys General and the Federal Trade Commission would have enforcement power. This provision has drawn sharp criticism from organizations such as the Electronic Frontier Foundation and the American Civil Liberties Union, which cite concerns that enforcement would be inconsistent and politically motivated and would disproportionately impact the LGBTQ community. Further compounding these concerns, Senator Blackburn claimed in an interview that KOSA would protect children from “the transgender [sic] in this culture and that influence.”

Activists also raise concerns that the bill would drive widespread adoption of online age verification, which would require online platforms to collect and secure sensitive personal information from consumers.

The offices of Senators Blumenthal and Blackburn have released statements asserting that KOSA does not target specific communities and that LGBTQ community organizations were consulted throughout the drafting process.

Two more companies will conduct drone operations beyond visual line of sight (BVLOS). Recently, the Federal Aviation Administration (FAA) approved UPS Flight Forward and uAvionix for this type of operation in the national airspace. UPS Flight Forward plans to conduct BVLOS drone operations for small-package delivery using a ground-based surveillance system. It will conduct these flights in North Carolina, Florida, and Ohio, and it may also operate in other states from its Remote Operations Center (ROC) in Kentucky.

uAvionix plans to use the Vantis Network to test its own detect-and-avoid technology during BVLOS operations. The FAA stated that it is still working to adopt rules for “routine, scalable and economically viable” BVLOS drone operations. Those rules will likely allow more widespread BVLOS operations across the country. The FAA’s goal is to integrate drones into the national airspace in a way that allows drones and manned aircraft to coexist, rather than creating a dedicated airspace only for drones.

I was talking to a client today about a security incident, and the discussion turned to how threat actors are using increasingly sophisticated methods to attack individuals and companies. She lamented that we know more than the average individual about how threat actors carry out attacks, but she worries about her mother, who is frequently online. I suggested that she educate her mother about the different techniques being used in cyber-attacks and provide her with resources on the risks of using the Internet and how to protect herself from scams.

Perfect Privacy Tip for this week!

There are several resources that all of us can provide to our senior family members and friends to help protect them from online scams and fraud.

The Federal Trade Commission (FTC) has a great website with lots of helpful hints on how to protect yourself from scams, identity theft, and other online security threats. Subscribe to its Consumer Alerts on scams (there are very few times I will say to subscribe to a listserv, but this is one of them!). The alerts are helpful to anyone, including seniors: they notify subscribers of the newest scams being reported to the FTC and explain how those scams work so consumers can recognize them.

Consumer protection organizations, including AARP, publish lots of resources to help seniors stay safe online. One article I particularly like is “The Ultimate Internet Safety Guide for Seniors in 2023,” authored by Katarina Glamoslija of SafetyDetectives. She touts it as a one-stop shop for internet security assistance, and I agree that it is a pretty decent stop.

We all have family and friends who could use a little coaching on avoiding scams, including the seniors in our lives. Think about those people and pass this article along. You may be a hero when you help prevent them from becoming the victim of a romance scam.

On August 22, 2023, the Cybersecurity and Infrastructure Security Agency (CISA) issued four more advisories related to industrial control systems. The advisories apply to four different industrial control products, explain the risks posed by the vulnerabilities (e.g., “successful exploitation of these vulnerabilities could allow an attacker to compromise availability, integrity, and confidentiality of the targeted devices”), and describe how to mitigate them.

CISA alerts related to industrial control systems are becoming more frequent [read a previous blog post here], and industrial control operators should keep a close eye on these advisories as they are issued.

In October 2022, Advocate Aurora Health notified three million individuals of a data breach resulting from its use of tracking pixels to monitor visitor activity on its website. This month, Advocate Aurora Health settled a class action stemming from that breach for $12.25 million.

In its breach notification to patients, Advocate Aurora Health stated that it had used third-party vendors to “measure and evaluate information concerning the trends and preferences of its patients as they use our websites,” which means the health care system was sharing IP addresses, locations, appointment times, and communications within MyChart with these third parties without the necessary consent or another permissible purpose under the Health Insurance Portability and Accountability Act (HIPAA). Upon discovery of this disclosure, Advocate Aurora Health conducted an internal investigation to determine the scope of patient information that was being transmitted to its third-party vendors.
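For readers unfamiliar with the mechanism, a tracking pixel is typically an invisible image or small script that fires a request to a third-party analytics server each time a page loads, and that request carries details about the visitor and the page. The sketch below is a generic illustration in TypeScript, with a hypothetical analytics.example.com endpoint and made-up parameter names; it is not Advocate Aurora Health’s or any particular vendor’s actual implementation.

```typescript
// Generic illustration of a third-party tracking pixel. The endpoint and
// parameter names are hypothetical; real analytics vendors differ.
function fireTrackingPixel(): void {
  const payload = new URLSearchParams({
    page: window.location.href,        // e.g., an appointment-scheduling URL
    referrer: document.referrer,       // where the visitor came from
    timestamp: new Date().toISOString(),
    // Note: the visitor's IP address reaches the vendor automatically,
    // simply because the browser makes this request to the vendor's server.
  });

  // A 1x1 invisible image whose "src" delivers the data to the vendor.
  const pixel = new Image(1, 1);
  pixel.src = `https://analytics.example.com/collect?${payload.toString()}`;
  pixel.style.display = "none";
  document.body.appendChild(pixel);
}

fireTrackingPixel();
```

On a healthcare website, a request like this firing from a page such as an appointment-scheduling or patient-portal page is what can disclose the nature of a patient’s visit to the third party.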

After the breach notification, many lawsuits were filed and eventually consolidated into a single class action. The consolidated complaint alleged that Advocate Aurora Health’s use of tracking pixels on its website “resulted in the invasion of Plaintiffs’ and Settlement Class Members’ privacy and other alleged common law and statutory violations.”

The $12.25 million settlement will be used to pay class members and to reimburse attorneys’ fees and other expenses. A recent study in Health Affairs found that third-party tracking technologies are used on 98.6 percent of all U.S. non-federal acute care hospital websites. If your healthcare organization falls into this category, take this settlement, and the many other pending pixel class action cases, as a reminder to review your website’s use of pixels and other tracking technologies and to update your website privacy policies and data collection practices for compliance.

State privacy laws are changing rapidly in the U.S. Here are summaries of seven new state laws that have been enacted and will go into effect over the next few years. In the absence of a federal privacy law, we anticipate that more state legislatures will continue to enact privacy laws to protect consumers.

Under each of the acts summarized below, consumers will have the right to access their personal data, the right to correct inaccurate data, the right to data portability, the right to have their data deleted, and the right to opt out of the use of their personal data for targeted advertising. Businesses will be required to practice purpose limitation, maintain data security, obtain consumer consent for data processing, and complete regular data impact assessments. Businesses will be barred from discriminating against consumers who exercise their rights under the law and will be required to secure data processing agreements with service providers. In addition, these laws each exclude financial institutions or their affiliates that are governed by, or personal data that is collected, processed, sold, or disclosed in accordance with, Title V of the Gramm-Leach-Bliley Act; state bodies and agencies; nonprofit organizations; institutions of higher education; national securities associations registered with the SEC; and covered entities or business associates as defined in the privacy regulations of the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA).


Earlier this month, the Commissioner of Data Protection of the Dubai International Financial Centre (DIFC), a financial free zone in the United Arab Emirates (UAE), issued the first adequacy decision regarding the California Consumer Privacy Act (CCPA), recognizing the CCPA as equivalent to the DIFC Data Protection Law (DIFC Law No. 5 of 2020, as amended, the DIFC DPL).

This decision allows businesses to transfer data between the DIFC and companies located in California, in accordance with the DIFC DPL, without any additional contractual measures. In a public statement about the decision, the DIFC Commissioner said, “The importance of additional safeguards for imported personal data is evidenced by the factors set out in published adequacy protocols as well as the DIFC Ethical Data Management Risk Index (EDMRI) and due diligence tool. In evaluating California’s privacy law and regulations, together with implementation, enforcement, and other holistic factors, it became clear that in large part, California importers will treat personal data from DIFC ethically and fairly.” The decision will also likely serve as precedent for the DIFC to establish similar relationships with other U.S. states. As of today, only 49 countries, jurisdictions, and organizations are subject to an adequacy decision by the DIFC.

The decision is the result of the DIFC Commissioner’s assessment of the grounds for lawful and fair processing of data under the CCPA, the existence of data protection principles and data subjects’ rights, international and onward data transfer restrictions, measures regarding security of processing, and breach reporting and accountability. To read the full decision, click here.

However, since the CCPA does not have a provision related to the transfer of personal information outside of California or the U.S., DIFC exporters that send personal information to a California-based importer under the decision would still need to ensure that the onward transfer of such personal information is safeguarded. Additionally, this decision will be reviewed annually by the DIFC Commissioner to ensure that the CCPA’s protections still meet expectations.

CISA released a blog post last week reminding software designers that artificial intelligence (AI) tools are software and that they “must consider the security of the customers as a core business requirement, not just a technical feature, and prioritize security throughout the whole lifecycle of the product, from inception of the idea to planning for the system’s end-of-life.”

CISA’s call to make AI systems Secure by Design is timely, considering recent reports of threat actors using AI to attack and disrupt systems.

Threat actors have been exploiting software vulnerabilities for many years and continue to leverage zero-day vulnerabilities today. Software developers are torn between getting products to market quickly and embedding security in their software. This conflict must be addressed in the development of AI tools so that we learn from past mistakes and prevent threat actors from weaponizing software, including AI systems. The CISA blog post is an important read for all software developers and engineers, as well as their bosses.

At the recent Federal Aviation Administration (FAA) Drone Symposium (co-hosted by AUVSI), FAA Deputy Regional Administrator Deb Sanning discussed the impact of autonomy and AI, human/machine integration, and strategies for gaining public trust in autonomous systems such as drones. Sanning discussed this topic along with Brendan Groves from Skydio; Taylor Lochrane, the Deputy Director for Science and Technology at DOT; Lauren Haertlein from Zipline; and Margaret Nagle from Wing. What did the panel have to say about this issue? Well, in the aviation sector, “[a]utomation is making a meaningful impact in worker safety.” For example, over 30 state DOTs use drones for bridge inspections, which helps cut time and costs and reduces the likelihood of dangerous (and even deadly) outcomes. While most would agree that using an autonomous drone for these inspections makes sense, the issue of safe and responsible use of AI and robotics still lingers. The panel suggested that responsible autonomous drone use rests on 1) the obligation to mitigate potential misuse of the technology, and 2) governments serving as the final arbiter of appropriate conduct.

The core concepts behind these points, for drone manufacturers and operators as well as drone software developers using AI and machine learning, are to educate, listen, and respond. When drone companies communicate with the people of the cities and towns in which they operate, they can cultivate acceptance, build connections, and alleviate potential privacy concerns.

To promote widespread use of autonomous drones and vehicles, drone companies must engage stakeholders at all levels: the FAA, civil aviation authorities, AND mayors and community boards. Automation and societal acceptance of drones are connected: automation allows for scale, and scale allows for widespread value across a community.

I have the pleasure of presenting an advanced session on cybersecurity to tax preparers at the IRS’ National Tax Preparers Forum each year. The sessions are well attended, and I enjoy meeting attendees and talking about the crazy new techniques threat actors are using to attack small businesses. This year was no exception.

One of the tips we discussed is being aware of the data that apps are collecting and checking your privacy settings frequently. During the session, I have the attendees get their phones out, go to their privacy settings, and see which apps they are allowing to access their location, microphone, and camera. When I explain what this means, there are audible gasps in the audience. It is like having a Russian or Chinese spy on your shoulder at all times.
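The same audit applies in a web browser. For readers who are comfortable with a little code, the minimal sketch below (TypeScript, using the browser Permissions API) shows how permission states for location, camera, microphone, and notifications can be queried; support for the camera and microphone permission names varies by browser, and this is only an illustration of the idea, not a specific tool recommendation.

```typescript
// Minimal sketch: auditing permission states in a web browser via the
// Permissions API -- an analogue of reviewing app permissions on a phone.
// Not every browser supports querying the "camera" and "microphone" names.
async function auditPermissions(): Promise<void> {
  const names = ["geolocation", "camera", "microphone", "notifications"];

  for (const name of names) {
    try {
      const status = await navigator.permissions.query({
        name: name as PermissionName,
      });
      // status.state is "granted", "denied", or "prompt"
      // ("prompt" means the site must ask before each use).
      console.log(`${name}: ${status.state}`);
    } catch {
      console.log(`${name}: not queryable in this browser`);
    }
  }
}

auditPermissions();
```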

I check my privacy settings frequently. I have a reminder in my calendar on the first of every month to check my privacy settings and set them where I want them. It is interesting how they can change from month to month following software updates.

At the same time, I check to see if there are any apps on my phone that I don’t need or don’t use. If so, I delete them. You can always download them again if you are going to use them.

Finally, before you download an app and click “I agree,” READ THE PRIVACY STATEMENT so you know what data the app is collecting and can control what is collected. One useful setting allows data to be collected only while you are using the app. Get in the habit of checking your privacy settings and deleting apps you don’t use. Don’t let them add up so you gasp when you go to check your settings.