Everything You Need to Know About Artificial Intelligence and Your Patient Data

The pandemic has rapidly increased implementation of artificial intelligence in health care settings, and its uses will continue to multiply.

Artificial intelligence algorithms have helped health care providers and public health officials predict where COVID-19 outbreaks will occur, aided hospitals in triaging coronavirus patients and assisted clinicians in identifying patients at high risk of developing severe symptoms from the virus. Because AI relies on patient data to learn and become more precise, it can put that data at risk, particularly when it is used outside clinical settings. Nurses can help patients better understand how AI algorithms and health apps can use their data.

Artificial Intelligence and Patient Health Data

Though definitions of artificial intelligence (AI) vary, the term refers to the ability of a computer or algorithm to reason or learn from past experience to perform tasks normally associated with intelligent beings; machine learning, in which systems improve through exposure to data, is a major subfield. Artificially intelligent algorithms can perform tasks such as pattern and speech recognition, image analysis and decision-making, and can help automate arduous tasks.

In medicine, most AI applications rely on numerical or image-based data, and their use is growing rapidly. AI can support both clinical and administrative processes and often helps predict patient outcomes. For example, virtual nurse assistants and doc bots can help diagnose patients; AI-backed software can help detect cardiac arrest, spot early cancer or flag pulmonary embolisms; and machine learning can help support clinical decision-making. Future uses of AI could include brain-computer interfaces, the proliferation of smart devices in hospitals and more monitoring of patient health through wearable devices.

But because AI learns from medical data, its use can put patient information at risk.

“AI systems require large volumes of data to be developed, trained and constantly refined,” MedCity News reported in an article on AI’s effect on patient privacy. “Where do the data come from though? If we’re talking about healthcare, the data must be about patients, which means sensitive, regulated information must be ingested, utilized and held.”

The Health Insurance Portability and Accountability Act (HIPAA) requires health care providers and their associates to protect patient information. But health systems are not the only ones offering insights to patients through AI: Many third-party mobile apps also aim to track the health of individuals. While AI used by health systems must be HIPAA-compliant, third-party health apps do not necessarily have to comply with the same guidelines. 

What Patients Should Know About Their Personal Data

HIPAA protects patient information from being used or shared without a patient’s knowledge. However, research in the International Journal of Telemedicine and Applications indicates most patients do not understand how their information is used or stored.

Patients should take time to understand how their information is used before signing up for health apps, said Patricia Dykes, RN, PhD, research program director for the Center for Patient Safety, Research and Practice at Brigham and Women’s Hospital. Health care providers and other entities subject to HIPAA must explain to patients how their information will be used, but many mobile apps outside health systems do not have to comply with HIPAA privacy rules.

“People need to be aware of—with the apps that they’re using that are collecting their data—what information are they collecting? How’s it going to be used?” said Dykes, who also serves as board chair of the American Medical Informatics Association. “As nurses and as healthcare providers, we should make patients more aware of that.… We should coach them to make sure they sufficiently understand the future use of the data.”

Under the HIPAA Privacy Rule, patient data is protected if it is:

  • Information added to a patient’s medical record by health providers
  • A conversation between providers about a patient’s treatment
  • Information about a patient in a health insurance system
  • Billing information
  • Held by health care providers, insurers or other health plans, clearinghouses or business associates subject to HIPAA regulations. Note that health contractors, including AI developers, are not always considered business associates.

Red flags patients should consider when using health apps

Apps from health providers or insurers are more likely to protect data than third-party mobile apps. But third-party mobile apps can also help patients manage health conditions, maintain an exercise routine and gain insights on their nutrition. Some apps may sell user data, so patients should strive to understand how they are authorizing mobile apps to use their information. 

“Most people don’t take the time to understand the user agreements,” Dykes said. “They usually ignore them. They just click past them because they want to use the app.”

After reading a user agreement to understand how a third-party app might use data, consumers should look for these red flags when deciding whether to download a health app, suggests the American Heart Association.

These factors may indicate that user data is being shared:

Advertisements
The presence of ads on a health app may signal that the app sells user data to advertisers. Most mobile apps that are free to users make money from advertisers.

Privacy settings
Users should check whether they can change the app's default settings to protect their privacy, such as turning off location tracking.

Key patient rights under HIPAA

Here’s what else patients should know about how HIPAA protects their data and privacy, according to the Department of Health and Human Services.

Notice: Health providers must inform patients of how they plan to use and share their information and must outline their privacy practices in writing. Most patients receive this notice during their first visit to a provider.

Sharing: Patients can decide whether they want to permit their health information to be shared for other purposes, such as marketing.

Reports: Patients can request reports on how their information was used for purposes beyond their health care.

Refusal: Refusing to sign a provider's privacy notice does not prohibit the provider from using or sharing patient data as otherwise permitted by HIPAA.

Access: Patients can ask to access their health records at any time and can also request corrections to their medical records.

Complaints: Patients can file a complaint with their insurer or the Department of Health and Human Services if they believe their health information is not being protected under HIPAA.

How Nurses Can Help Answer Questions About Your Health Information

Protecting patient information and maintaining confidentiality are vital to the nurse/patient relationship, though both have become more difficult with the rise of technology, which is reshaping relationships between patients and clinicians. Providers are no longer the sole gatekeepers of medical information; patients have more ways than ever to monitor data about their health and well-being. And patient health data is increasingly generated and recorded by AI-powered systems rather than calculated and documented by nurses and other health care providers.

Many hospitals and health systems rely on electronic health records (EHRs) with built-in systems that use algorithms to monitor patient health and predict outcomes, such as a patient's risk of falling. That means nurses are constantly incorporating new technology and data sources into patient care, and in many cases algorithms assist in, or even replace, nurse decision-making.

“These kinds of models can be very helpful—these algorithms,” Dykes said. “But I think that we always have to ask, ‘What is the science behind it? How are we going to use this in practice, and is it going to improve practice?’”

Tips for Clinicians When Discussing Data Protection

How patient data is presented to patients can affect how it is received and understood, ultimately influencing health outcomes. Dykes said nurses also have a responsibility to discuss with patients how their health data might be used.

“As nurses, I think we have an obligation just to have a discussion about data and risk so that people are aware of it, especially if you have a patient who likes to use apps and devices,” Dykes said. “To the extent that these things can help patients adhere to exercise protocols or machines that help them be more healthy, I think we should encourage them to do it, but they should also understand the risks.”

These communication strategies can help nurses discuss uses of personal information with patients:

  • Offer sample questions. Present patients with the questions they can ask about the use and management of their personal data.
  • Use teach-back strategies. The teach-back method asks patients to restate information in their own words, confirming they understand how their data will be used.
  • Write about it. Health care providers can create articles about health information privacy in newsletters or blogs for patients.
  • Use basic language. Avoiding complex terms can help patients better understand their privacy rights.
  • Ask patients if they have questions. Respond to any questions patients have about their personal information, and follow up if needed.  

The most helpful AI algorithms not only alert nurses when a patient’s health is changing but also offer suggestions for patient care based on personal data, Dykes said. 

“My concern of just providing alerts to nurses is that there are too many alerts already, and they become noise in the environment,” Dykes said. “And if they’re not actionable, then that’s probably not going to make a difference.”

Please note that this article is for informational purposes only. Individuals should consult their health care provider before following any of the information provided.