Doctors are most concerned about patients using AI tools to diagnose themselves or take medicines without medical advice.
Published Jan 13, 2026, 11:38 AM | Updated Jan 13, 2026, 11:38 AM
Synopsis: OpenAI recently announced ChatGPT Health, which lets users securely connect their medical records and wellness data to receive more personalised health-related information. The announcement has sparked global debate on how far AI should go in handling sensitive health information. Doctors believe such tools could improve health awareness and prevention, but warn that they could also encourage self-diagnosis and self-medication.
Artificial Intelligence (AI) is increasingly finding its way into healthcare, not as a replacement for doctors but as a tool to help patients navigate an overburdened system.
Recently, OpenAI announced ChatGPT Health, a new, dedicated health-focused experience that allows users to securely connect their medical records and wellness data to receive more personalised health-related information.
While the feature has been rolled out to a limited set of users in the US and is not yet available in India, the announcement has sparked global debate on how far AI should go in handling sensitive health information.
OpenAI describes ChatGPT Health as “a dedicated experience that securely brings your health information and ChatGPT’s intelligence together”, to help users feel more informed, prepared, and confident while navigating their health.
Unlike general health-related chats on AI platforms, Health operates as a separate space within ChatGPT, designed specifically for sensitive medical and wellness conversations.
According to the company, users can securely connect medical records and wellness apps, including Apple Health, MyFitnessPal, and other fitness or lab-tracking platforms, so that conversations are grounded in their own health information and context.
This allows the tool to help users understand test results, track health trends over time, prepare questions ahead of a doctor’s appointment, or make sense of lifestyle data such as sleep, activity, or nutrition patterns.
Importantly, OpenAI has emphasised that ChatGPT Health is not intended for diagnosis or treatment.
“Health is designed to support, not replace, medical care,” the company said, adding that the tool is meant to help users navigate everyday health questions and recognise patterns over time, rather than respond only in moments of illness.
The idea, OpenAI notes, is to help patients arrive at medical consultations better informed, rather than attempting to substitute professional judgement.
OpenAI’s move comes at a time when healthcare systems globally are struggling with rising patient loads, fragmented medical records, and shrinking consultation time.
According to the company, “health is already one of the most common ways people use ChatGPT,” with over 230 million people worldwide asking health and wellness questions every week.
Fidji Simo, CEO of OpenAI, shared a personal experience that highlights this need: “Last year, I was hospitalised for a kidney stone and prescribed an antibiotic. I asked ChatGPT, which had access to my health records, whether it was safe. It flagged that the medication could reactivate a serious infection I’d had before. The resident was relieved — this could have caused severe complications.”
Simo pointed out that doctors often have only a few minutes per patient and fragmented records, making it hard to see the full picture. “AI doesn’t have these constraints,” she said, “so it can support clinicians and help reduce errors.”
ChatGPT Health, OpenAI said, is designed to build on this existing use while introducing “additional, layered protections designed specifically for health” and clearer boundaries around how such information should be used.
The launch, however, raises important questions for patients and clinicians alike. OpenAI has positioned ChatGPT Health as an assistive tool rather than a clinical decision-maker, stressing again that it is not intended for diagnosis or treatment.
Even so, the announcement has prompted debate about who such tools are really meant for, how they might influence everyday health decisions, and where the line should be drawn between digital assistance and professional medical advice, particularly in countries like India, where self-medication and health misinformation are already significant concerns.
Based on OpenAI’s description, ChatGPT Health is primarily designed for patients and individuals managing their own health, particularly those dealing with ongoing conditions, complex medical histories, or large volumes of health data spread across multiple platforms.
By consolidating information from medical records, wearables, and wellness apps, the company said the tool can help users see the “full picture” of their health, something that is often difficult within traditional healthcare systems.
The feature may also appeal to people looking to take a more active role in preventive health — tracking diet, exercise, sleep or recovery — areas that are often outside the scope of routine clinical care.
OpenAI said the tool was developed in close collaboration with physicians across dozens of specialities to ensure responses prioritise safety, clarity and appropriate escalation to a clinician when needed.
At the same time, the company stressed the importance of privacy and control. ChatGPT Health operates as a separate, encrypted space, with additional protections designed specifically for sensitive health data.
Conversations within Health are not used to train OpenAI’s foundation models, and users can choose what information to connect or remove at any time. According to OpenAI, these safeguards are critical given the deeply personal nature of health information.
ChatGPT Health is open to users on Free, Go, Plus, and Pro plans, but access depends heavily on location. The rollout currently targets regions outside the European Economic Area, Switzerland, and the United Kingdom, and operates through a waitlist.
Indian users can request access by joining the waitlist and, once approved, can use the core ChatGPT Health features.
However, key limitations remain. Medical record integrations and several wellness app connections work only in the United States, and the Apple Health integration is limited to users on iOS devices.
Dr Kiran Madhala, Professor of Anaesthesiology and Critical Care Medicine at Gandhi Medical College, Secunderabad, said AI tools can track daily health data and identify patterns that doctors may otherwise miss.
“AI can record data from the past year, analyse it and give an overall picture,” he told South First. He added that such long-term tracking is difficult for doctors to do manually during short consultations.
However, Dr Madhala stressed that AI tools are only helpful if they are used correctly. “The response depends on the question you ask,” he said, pointing out that without proper medical knowledge, patients may not know how to frame the right questions or interpret the answers.
Doctors also believe AI tools could help patients by improving health awareness and prevention. Dr Vimala Manne, dermatologist and medical director of Dr Vimala’s Skin, Hair & Laser Centres in Hyderabad, said such platforms could guide people towards healthier lifestyles and encourage them to seek medical help early.
“It can help in prevention, lifestyle advice, and directing patients to the right specialist, but it should not be used in treatment,” she told South First.
According to her, many people currently rely on Google or social media for medical advice, which often leads to misinformation. An AI tool, she said, could instead act as a starting point that pushes patients to consult doctors rather than treat themselves.
Both doctors agreed that AI should act as a bridge between patients and healthcare professionals. They warned that it should never encourage people to diagnose or treat themselves without seeing a doctor.
This risk of self-diagnosis and self-medication without professional advice is what concerns doctors most.
Dr Manne said this could be especially risky in areas like dermatology and mental health. “Every person’s skin is different. Even people in the same family don’t have the same skin type,” she said.
She warned that using the wrong creams or medicines based on AI advice could lead to permanent skin damage, scarring, or pigmentation—problems that are difficult to reverse.
Mental health advice without proper evaluation is even more dangerous, she added. “A wrong diagnosis can cause serious emotional trauma,” she said.
She also raised a critical question: if something goes wrong, who is responsible for the harm, and who is held accountable?
Both doctors agreed that while AI can support doctors and educate patients, it should not directly give treatment advice. In India, where self-medication is already common, they warned that misuse of such tools could do more harm than good.
“AI can give people ideas on what to look for and what to discuss with a doctor,” Dr S Jayaraman, Senior Consultant in Pulmonary Medicine at MGM Healthcare, Chennai, told South First.
“But the final diagnosis and treatment decisions must always come from a qualified clinician,” he added.
He also cautioned about accessibility: in rural or low-literacy settings, patients may not fully understand AI-generated guidance. “These tools are helpful, but patients still need proper support to interpret the information and take the right steps,” Dr Jayaraman said.
This highlights a potential digital divide — while tech-savvy users may benefit, others could struggle to use AI effectively without clinical guidance.
Dr Jayaraman also pointed to practical applications, such as wearable devices that track sleep or vitals. AI can analyse this data to provide preliminary insights, helping patients monitor trends over time.
“It’s a way to make patients more aware of their health, but the steps to act on this information must always be guided by a doctor,” he noted, emphasising that AI’s role is to assist, not replace, clinical care.
(Edited by Muhammed Fazil.)