Artificial Intelligence: A Boon to Palliative Care Providers and Cancer Patients?
*Corresponding author: Anju Gupta, Department of Anesthesiology, Pain Medicine and Critical Care, All India Institute of Medical Sciences, New Delhi, India. dranjugupta2009@rediffmail.com
How to cite this article: Gupta N, Gupta A. Artificial Intelligence: A Boon to Palliative Care Providers and Cancer Patients? Indian J Palliat Care 2024;30:187-8. doi: 10.25259/IJPC_218_2024
There is a tremendous burden of cancer globally.[1] Early and adequate palliative care is an essential aspect of cancer care and is known to improve overall quality of life significantly. Artificial intelligence (AI) is a transformative force in the medical field, offering significant potential to enhance health delivery systems. Its integration across all aspects of care aligns with ongoing technological advancement and presents a promising future for palliative care.
AI uses computers to perform tasks previously associated only with humans: it can predict, analyse and integrate collected information to assist clinical decision-making in cancer care. A frequent challenge faced by palliative care specialists is precise prognostication to guide treatment decisions and referrals to hospice care. Natural language processing can be used to screen text, identify keywords and find relevant documents rapidly, which may improve the efficiency and quality of palliative care.
One of the most recent AI programs, Chat Generative Pre-trained Transformer (ChatGPT), is a chatbot developed by OpenAI that can analyse and learn from human-generated text available on the internet. ChatGPT has co-authored articles, passed the United States Medical Licensing Examination and generated discharge summaries and patient transcripts.[1] These remarkable attributes of AI-based ChatGPT may help clinicians improve patients’ overall care in palliative medicine.
Despite public education initiatives and awareness campaigns, patients do not routinely access palliative care services early, owing to misconceptions, physicians’ limited personal experience and an unwillingness to discuss these services. AI-powered platforms such as ChatGPT and Google Gemini can generate patient information leaflets tailored to specific palliative care topics, including end-of-life care (EOLC). Such material can help educate patients and physicians about disease progression, improve their outlook toward accessing palliative care services and ultimately improve end-of-life outcomes.[2]
This issue of IJPC includes an interesting article by Gondode et al. that examines how the AI-based chatbots ChatGPT and Google Gemini perform in debunking various myths prevalent in palliative care. Both ChatGPT and Google Gemini showed remarkable accuracy in debunking these myths.[3] Such AI chatbots can be practical tools for dispelling palliative care myths and improving patient awareness in this domain. The findings offer a promising avenue for enhancing patient knowledge, correcting misconceptions, promoting informed decision-making in varied cultural settings and ultimately improving quality of life. The time has come for a vigilant adoption of AI in routine clinical practice, training and data-handling approaches.
Despite its perceived benefits, the use of AI is associated with ethical challenges, legal issues and data protection concerns. Furthermore, medicine is an imperfect science, and empathy and real-time patient interaction remain essential aspects of care. There is potential for ‘AI hallucination’, in which algorithmic limitations lead to unintentionally misleading judgements. ChatGPT has made errors when triaging patients against set criteria, which may result in mismanagement; complete dependence on ChatGPT-based machine learning algorithms therefore cannot be recommended at present.[4] Sensitive healthcare circumstances such as EOLC also raise ethical and safety concerns. Even for physicians, identifying the need for EOLC and counselling patients is difficult because of the complexity involved, and AI-based chatbots may not be fully equipped to handle such situations. When a wrong decision is made, who takes responsibility for the outcome is also debatable and needs further discussion. It may not be wise to depend entirely on AI-based chatbots for medical management.
Medical professionals must ensure proper training and validation of AI, considering that ChatGPT occasionally produces plausible-sounding but improper or irrational answers.[5]
There is an unmet need to define AI’s ever-expanding role in palliative care medicine. This calls for our collective efforts to address its limitations and harness its capabilities for patient-centred care, making us all part of this ongoing development.[6]
While AI presents cutting-edge opportunities for patient education and high-quality care, it is crucial to exercise caution and remain mindful of its limitations. It can serve as a valuable addition to palliative care, but it should not be seen as a replacement for physicians’ unique skills and decisions in providing patient-tailored care.
In conclusion, the literature underscores the significant benefits of AI in palliative care, including enhanced safety and individualised care. However, it is imperative that AI undergoes rigorous clinical validation in prospective, multicentre trials before its widespread adoption in routine clinical activities, ensuring patient safety and ethical practice.
References
1. Global Cancer Statistics 2022: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J Clin. 2024;74:229-63.
2. ChatGPT: The Future of Discharge Summaries? Lancet Digit Health. 2023;5:e107-8.
3. Debunking Palliative Care Myths: Assessing the Performance of Artificial Intelligence Chatbots (ChatGPT vs. Google Gemini). Indian J Palliat Care. doi: 10.25259/IJPC_44_2024
4. Natural Language Processing and Network Analysis in Patients Withdrawing from Life-sustaining Treatments: A Retrospective Cohort Study. BMC Palliat Care. 2022;21:225.
5. ChatGPT: Friend or Foe? Utility in Trauma Triage. Indian J Crit Care Med. 2023;27:563-6.