Artificial intelligence algorithms have helped health care providers and public health officials predict where COVID-19 outbreaks will occur, aided hospitals in triaging coronavirus patients and assisted clinicians in identifying patients at high risk of developing severe symptoms from the virus. Because AI relies on patient data to learn and become more precise, it can put that data at risk—particularly when it is used outside clinical settings. Nurses can help patients better understand how AI algorithms and health apps can use their data.
Artificial Intelligence and Patient Health Data
Though definitions of artificial intelligence (AI) vary, the term broadly refers to the ability of a computer or algorithm to reason or learn from past experience to perform tasks normally associated with intelligent beings. Machine learning, often used interchangeably with AI, is more precisely the subset of AI in which algorithms improve through exposure to data. Artificially intelligent algorithms can perform tasks such as pattern and speech recognition, image analysis and decision-making, and can help automate arduous tasks.
But as AI learns from medical data, that can put patient information at risk.
“AI systems require large volumes of data to be developed, trained and constantly refined,” MedCity News reported in an article on AI’s effect on patient privacy. “Where do the data come from though? If we’re talking about healthcare, the data must be about patients, which means sensitive, regulated information must be ingested, utilized and held.”
Patients should take time to understand how their information is used before signing up for health apps, said Patricia Dykes, RN, PhD, research program director for the Center for Patient Safety, Research and Practice at Brigham and Women’s Hospital. Health care providers and other entities subject to HIPAA must explain to patients how their information will be used, but many mobile apps outside health systems do not have to comply with HIPAA privacy rules.
“People need to be aware of—with the apps that they’re using that are collecting their data—what information are they collecting? How’s it going to be used?” said Dykes, who also serves as board chair of the American Medical Informatics Association. “As nurses and as healthcare providers, we should make patients more aware of that.… We should coach them to make sure they sufficiently understand the future use of the data.”
Under HIPAA, protected health information includes:
Information added to a patient’s medical record by health providers
Conversations between providers about a patient’s treatment
Information about a patient in a health insurance system
This information is protected when held by health care providers, insurers or other health plans, clearinghouses or business associates subject to HIPAA regulations. Health contractors, including AI developers, may not always be considered business associates.
Red flags patients should consider when using health apps
Apps from health providers or insurers are more likely to protect data than third-party mobile apps. Third-party mobile apps can nonetheless help patients manage health conditions, maintain an exercise routine and gain insights into their nutrition. But some apps may sell user data, so patients should understand what uses of their information they are authorizing when they sign up.
“Most people don’t take the time to understand the user agreements,” Dykes said. “They usually ignore them. They just click past them because they want to use the app.”
Under HIPAA, patients have the following rights regarding their health information:
Notice: Health providers must inform patients how they plan to use and share their information and must outline their privacy practices in writing. Most patients receive this notice during their first visit to a provider.
Sharing: Patients can decide whether they want to permit their health information to be shared for other purposes, such as marketing.
Reports: Patients can request reports on how their information was used for other purposes beyond their health care.
Refusal: Refusing to sign disclosures provided by health care providers does not prohibit those providers from using or sharing patient data as permitted by HIPAA.
Access: Patients can ask to access their health records at any time and can also request corrections to their medical records.
Complaints: Patients can file a complaint with their insurer or the Department of Health and Human Services if they believe their health information is not being protected under HIPAA.
How Nurses Can Help Answer Questions About Patients’ Health Information
Protecting patient information and maintaining confidentiality are vital to the nurse/patient relationship, but both have become more difficult with the rise of technology, which is reshaping relationships between patients and clinicians. Providers are no longer the sole gatekeepers of medical information; patients have more ways than ever to monitor data about their health and well-being. And patient data are increasingly generated and recorded by systems powered by AI, rather than calculated and documented by nurses and other health care providers.
Many hospitals and health systems rely on electronic health records (EHRs) with built-in systems that use algorithms to monitor patient health and predict outcomes, such as a patient’s risk of falling. That means nurses are constantly incorporating new technology and data sources into patient care, and in many cases algorithms assist in—or even replace parts of—nurse decision-making.
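To make the idea of an EHR prediction algorithm concrete, here is a toy fall-risk model. The feature names, weights and threshold are invented for demonstration only and are not drawn from any validated clinical tool; real EHR models are trained on large patient datasets and clinically validated before use.

```python
import math

# Invented, illustrative weights -- NOT a validated clinical instrument.
WEIGHTS = {
    "age_over_65": 1.2,
    "prior_fall": 1.8,
    "sedative_use": 0.9,
    "mobility_impaired": 1.5,
}
BIAS = -3.0  # baseline log-odds for a patient with no risk factors

def fall_risk(patient: dict) -> float:
    """Return a 0-1 risk estimate using logistic-regression-style scoring."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def needs_review(patient: dict, threshold: float = 0.5) -> bool:
    """Flag patients whose estimated risk exceeds a review threshold."""
    return fall_risk(patient) >= threshold

# Hypothetical patients expressed as 0/1 risk-factor flags.
high = {"age_over_65": 1, "prior_fall": 1, "sedative_use": 1, "mobility_impaired": 1}
low = {"age_over_65": 0, "prior_fall": 0, "sedative_use": 0, "mobility_impaired": 0}
```

A model like this also makes Dykes’s question concrete: asking “what is the science behind it?” means asking where those weights came from and whether they were validated on patients like yours.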
“These kinds of models can be very helpful—these algorithms,” Dykes said. “But I think that we always have to ask, ‘What is the science behind it? How are we going to use this in practice, and is it going to improve practice?’”
Tips for Clinicians When Discussing Data Protection
How patient data are presented can affect how patients receive and understand them, and can ultimately influence health outcomes. Dykes said nurses also have a responsibility to discuss with patients how their health data might be used.
“As nurses, I think we have an obligation just to have a discussion about data and risk so that people are aware of it, especially if you have a patient who likes to use apps and devices,” Dykes said. “To the extent that these things can help patients adhere to exercise protocols or machines that help them be more healthy, I think we should encourage them to do it, but they should also understand the risks.”
These communication strategies can help nurses discuss uses of personal information with patients:
Offer sample questions. Present patients with the questions they can ask about the use and management of their personal data.
Use teach-back strategies. Ask patients to explain in their own words how their data will be used; having them repeat the information back confirms they have understood it.
Ask patients if they have questions. Respond to any questions patients have about their personal information, and follow up if needed.
The most helpful AI algorithms not only alert nurses when a patient’s health is changing but also offer suggestions for patient care based on personal data, Dykes said.
“My concern of just providing alerts to nurses is that there are too many alerts already, and they become noise in the environment,” Dykes said. “And if they’re not actionable, then that’s probably not going to make a difference.”
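Dykes’s point about actionable alerts can be sketched in code: the difference between noise and a useful alert is whether each triggering risk factor carries a suggested next step. The factor names and suggestions below are invented examples, not clinical guidance.

```python
# Invented factor-to-action mapping, for illustration only.
SUGGESTIONS = {
    "prior_fall": "schedule a physical-therapy mobility assessment",
    "sedative_use": "review sedating medications with the pharmacist",
    "mobility_impaired": "place a bed-exit alarm and assistive devices in reach",
}

def build_alert(active_factors: list) -> dict:
    """Turn a list of active risk factors into an actionable alert.

    An alert with no suggested action is flagged as non-actionable --
    the kind Dykes describes as becoming "noise in the environment."
    """
    actions = [SUGGESTIONS[f] for f in active_factors if f in SUGGESTIONS]
    return {
        "message": "Elevated fall risk",
        "factors": active_factors,
        "suggested_actions": actions,
        "actionable": bool(actions),
    }
```

The design choice here mirrors the quote: a system could suppress or de-prioritize alerts whose `actionable` flag is false rather than adding them to the stream of notifications nurses already face.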
Please note that this article is for informational purposes only. Individuals should consult their health care provider before following any of the information provided.