Use of PHI for GenAI and Cloud Accounts Violates Patient Privacy

By Daniel Lopez

Research conducted by cybersecurity firm Netskope has revealed that healthcare employees often expose sensitive information, including protected health information (PHI), through generative AI tools such as ChatGPT and Google Gemini and by uploading data to personal cloud storage services such as OneDrive and Google Drive.

The healthcare sector has widely adopted AI tools, with most organizations using them to some extent to enhance productivity. According to data gathered by Netskope Threat Labs, 88% of healthcare organizations have incorporated cloud-based genAI applications into their workflows, 98% use applications that integrate genAI functions, 96% use applications that leverage user data for training, and 43% are attempting to run genAI systems locally.

As more healthcare organizations adopt AI tools and make approved versions accessible to their workers, fewer employees are using personal AI accounts for work: 71% of healthcare employees still do so, down from 87% last year. When genAI applications are not HIPAA-compliant and their developers have not signed business associate agreements, using those applications with PHI violates HIPAA and exposes organizations to regulatory fines. Additionally, uploading patient information to genAI applications and cloud storage services without appropriate safeguards can damage patient trust.

Beyond financial penalties, breaches erode patient trust and damage an organization's credibility with vendors and partners. The use of AI tools therefore needs closer supervision, and organizations are advised to provide sanctioned tools to minimize "shadow AI" issues.

According to Netskope, the improper handling of HIPAA-covered information is the top security issue in the healthcare industry. PHI is most often uploaded to personal cloud applications, genAI applications, and other unapproved storage locations. Netskope reports that 81% of data policy violations involve HIPAA-covered healthcare information; the remaining 19% involve intellectual property, source code, and secrets.

Healthcare organizations need to balance the advantages of genAI against compliance with strict data policies to minimize the associated risks. Netskope advises using enterprise-grade genAI applications with strong security capabilities to protect sensitive and regulated data. Data loss prevention (DLP) tools can also be used to monitor and control access to genAI tools and prevent privacy violations. Netskope reports that 54% of healthcare organizations now have DLP policies in place, up from 31% last year. The most frequently blocked genAI applications in healthcare are Scite, DeepAI, and Tactiq; up to 44% of healthcare organizations block these applications with their DLP tools because of privacy concerns and steer users toward more secure alternatives.
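At their simplest, DLP policies of this kind match outbound content against PHI-like patterns before it leaves the network. The sketch below is purely illustrative, not Netskope's actual detection logic: the pattern set, names, and blocking rule are all assumptions, and real DLP products use far more sophisticated techniques such as exact-data matching and ML classifiers.

```python
import re

# Illustrative PHI-like patterns (assumptions for demonstration only).
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # SSN-style number
    "mrn": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),  # medical record number
    "dob": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),          # date of birth
}

def find_phi(text: str) -> list[str]:
    """Return the names of PHI-like patterns found in the text."""
    return [name for name, pattern in PHI_PATTERNS.items()
            if pattern.search(text)]

def allow_upload(text: str) -> bool:
    """Block the outbound prompt/upload if any PHI-like pattern matches."""
    return not find_phi(text)

# A prompt containing an SSN-style string would be flagged and blocked.
prompt = "Summarize the chart for patient, SSN 123-45-6789."
print(find_phi(prompt))       # → ['ssn']
print(allow_upload(prompt))   # → False
```

In practice, a check like this would sit at a proxy or endpoint agent inspecting traffic to genAI services, and flagged content would be blocked or redacted rather than silently dropped.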

Although genAI tools are undoubtedly useful in healthcare and can help enhance productivity, they raise significant security concerns. Netskope states that healthcare providers should remain cautious, implement comprehensive security procedures, enforce data protection policies, and cover these risks in their cybersecurity awareness and HIPAA training.

The report also highlights the danger of malware infections delivered through cloud applications. Threat actors increasingly use cloud applications such as GitHub, Amazon S3, OneDrive, and Google Drive to host data stealers and ransomware. Rather than breaking into networks directly, they use social engineering to trick healthcare workers into executing first-stage malware payloads, giving the attackers initial access to systems. Netskope offers the following recommendations to mitigate these risks:

  • Inspect all HTTP and HTTPS traffic to prevent phishing and malware attacks
  • Block applications that have no business use or pose a disproportionate threat to the organization
  • Use remote browser isolation when visiting higher-risk websites, such as newly registered domains

Image credit: flyalone, Adobe Stock



Daniel Lopez is the HIPAA trainer behind HIPAA Coach and the HIPAA subject matter expert for NetSec.news. Daniel has over 10 years' experience as a HIPAA coach and provides his HIPAA expertise in several publications, including Healthcare IT Journal and The HIPAA Guide. Daniel studied Health Information Management before focusing his career on HIPAA compliance and protecting patient privacy. You can follow Daniel on Twitter / X at https://twitter.com/DanielLHIPAA