X-ray machines that think: How embedded AI cuts the wait for diagnosis

The system prioritises based on clinical urgency. It tackles three questions: Is this scan normal or abnormal? If abnormal, is it critical? If critical, what intervention does the patient need?

Published Feb 15, 2026 | 7:00 AM | Updated Feb 15, 2026 | 7:00 AM

Synopsis: 5C Network processes 6,000 X-rays daily through its teleradiology platform. Every scan generates feedback. A radiologist reviews the AI’s findings, approves them, or marks them as incorrect. Missed findings, overcalls, and undercalls all feed back into training.

India faces a crisis in radiology. The country has approximately 20,000 radiologists serving 1.4 billion people. That works out to roughly one radiologist for every 70,000 citizens, or about 14 per million. In comparison, the United States maintains 199 per million. The gap widens in towns and villages, where clinics operate without a single radiologist within driving distance.

This shortage forces patients into a waiting game. A person walks into a clinic, gets an X-ray, and sits. The scan travels through WhatsApp, email or cloud systems to a radiologist who might be three states away.

Hours pass. Sometimes days. The patient returns home without knowing if that chest pain signals pneumonia or if that fall fractured a bone. The scan sits in a queue with hundreds of others.

5C Network processes 10,000 scans daily

To tackle this problem, 5C Network built the AI that now runs inside BPL’s machines. The Bengaluru company operates a teleradiology platform that processes 10,000 scans daily from 2,000 clinics across India. Those scans arrive in all conditions. Some come from portable machines in villages where generators power the equipment. Others come from hospitals in Mumbai. Image quality varies.

“Most AI in healthcare trains on data from developed countries,” says Kalyan Sivasailam, who founded 5C Network. “Those datasets contain scans taken on equipment that costs lakhs, in facilities where power never cuts out. The images look pristine.”

“5C Network accumulated nine million reports over seven years. The AI trains on this collection, which grows by 1.5 million reports annually. The system uses convolutional neural networks and vision-language models that cluster similar cases together. When a new scan arrives, the system checks its nearest neighbours to understand what it sees,” he added.
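Neither the company nor the article publishes the retrieval code, but the nearest-neighbour step Sivasailam describes can be pictured in a few lines. The sketch below is illustrative only: the embedding size, the `embed_scan` placeholder and the `nearest_neighbours` helper are assumptions, not 5C Network’s actual pipeline.

```python
import numpy as np

# Illustrative sketch: compare a new scan's embedding against a library of
# previously reported cases and return the most similar ones. Names and
# dimensions are assumptions, not 5C Network's real system.

EMBEDDING_DIM = 512  # assumed size of the CNN / vision-language embedding


def embed_scan(pixels: np.ndarray) -> np.ndarray:
    """Placeholder for the trained encoder.

    A real system would run the model over the pixel data; this stub just
    returns a unit vector of the right shape so the retrieval step runs.
    """
    rng = np.random.default_rng(int(pixels.sum()) % (2**32))
    vec = rng.standard_normal(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


def nearest_neighbours(query: np.ndarray, library: np.ndarray, k: int = 5):
    """Return indices and scores of the k most similar reference cases
    by cosine similarity."""
    library_norm = library / np.linalg.norm(library, axis=1, keepdims=True)
    sims = library_norm @ (query / np.linalg.norm(query))
    top = np.argsort(-sims)[:k]
    return top, sims[top]


# In production the library would hold embeddings of millions of reported
# cases, precomputed once; here a tiny random stand-in keeps it runnable.
library = np.random.default_rng(0).standard_normal((1000, EMBEDDING_DIM))
new_scan = np.zeros((224, 224), dtype=np.float32)  # stand-in pixel array
indices, scores = nearest_neighbours(embed_scan(new_scan), library)
print(indices, scores)
```

The retrieved neighbours give the model context: similar past cases, and the findings radiologists reported on them.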

The compression challenge

Embedding AI into X-ray machines required solving a computing problem. Graphics processing units handle AI tasks, but they consume electricity, generate heat, and cost money. Deploying them in every clinic makes no economic sense.

The team compressed their models to run on 16 gigabytes of random access memory. That matches what a basic laptop contains. The compressed version operates on the machine’s central processing unit without needing a graphics card or internet connection.
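The article does not say which compression technique 5C Network used, only the target: CPU-only inference inside 16 gigabytes of memory. One common way to hit that kind of target is post-training quantization, sketched below with PyTorch as an assumption, not as a description of the company’s actual method.

```python
import torch
import torch.nn as nn

# Illustrative only: a stand-in classifier quantized for CPU inference.
# The real architecture and compression recipe are not public; dynamic
# INT8 quantization is just one widely used way to shrink a model so it
# runs on an ordinary CPU without a graphics card.

model = nn.Sequential(              # stand-in for a chest X-ray classifier
    nn.Flatten(),
    nn.Linear(224 * 224, 1024),
    nn.ReLU(),
    nn.Linear(1024, 14),            # e.g. one logit per finding
)
model.eval()

# Convert the Linear layers' weights to 8-bit integers; activations are
# quantized on the fly at inference time. This cuts memory use and speeds
# up CPU execution at a small cost in accuracy.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    scan = torch.rand(1, 1, 224, 224)   # stand-in pixel data
    logits = quantized(scan)
    print(logits.shape)                  # torch.Size([1, 14])
```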

“GPUs are expensive, there is a shortage, and they cannot be deployed everywhere,” says Manan Reshamwalla, who led the compression work at 5C Network.

The result processes scans in 15 to 20 seconds. The system first classifies each scan as normal or abnormal. If abnormal, it specifies which conditions it detects: pneumothorax, pleural effusion, fractures, consolidation, cardiomegaly. A report appears on the dashboard.
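Based on that description, the read can be pictured as a two-stage check: a normal-versus-abnormal call first, then, for abnormal scans, a list of specific findings for the dashboard report. The function name, threshold values and probability figures below are invented for illustration.

```python
# Illustrative sketch of the two-stage read described above. Thresholds,
# probabilities and the report format are assumptions, not 5C's output.

def read_scan(abnormality_prob: float, finding_probs: dict,
              abnormal_threshold: float = 0.5,
              finding_threshold: float = 0.5) -> dict:
    """Turn raw model probabilities into a dashboard-style report."""
    if abnormality_prob < abnormal_threshold:
        return {"status": "normal", "findings": []}
    detected = [name for name, p in finding_probs.items()
                if p >= finding_threshold]
    return {"status": "abnormal", "findings": detected}


# Example with made-up probabilities a model might emit.
report = read_scan(
    abnormality_prob=0.91,
    finding_probs={
        "pneumothorax": 0.07,
        "pleural_effusion": 0.82,
        "fracture": 0.03,
        "consolidation": 0.64,
        "cardiomegaly": 0.12,
    },
)
print(report)  # {'status': 'abnormal', 'findings': ['pleural_effusion', 'consolidation']}
```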

Priority based on clinical urgency

The system prioritises based on clinical urgency. It tackles three questions: Is this scan normal or abnormal? If abnormal, is it critical? If critical, what intervention does the patient need?

The AI deliberately overcalls certain findings. Lung nodules trigger alerts even when the system calculates only a small probability. Missing a nodule carries risk, so the system flags 100 suspected nodules even if 24 of them turn out to be false alarms.

“The idea is that even if there is a small chance of a nodule being present, we alert the radiologist, and they can then decide,” Sivasailam says. “It depends on how radiologists are choosing to work with AI.”

Radiologists can configure sensitivity levels. Some prefer the system to flag everything; they view their role as ruling out false positives. Others adjust the thresholds to reduce alerts.
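In practice, the behaviour described in the last three paragraphs comes down to per-finding alert thresholds that radiologists can tune. The threshold values and finding names below are invented for illustration; the article does not publish 5C Network’s actual settings.

```python
# Hypothetical per-finding alert thresholds. A lower threshold means the
# system overcalls that finding: lung nodules alert even at low probability
# because a miss is costlier than a false alarm.
DEFAULT_THRESHOLDS = {
    "lung_nodule": 0.10,    # deliberately sensitive
    "pneumothorax": 0.30,
    "cardiomegaly": 0.50,
}


def alerts(finding_probs: dict, thresholds: dict = DEFAULT_THRESHOLDS) -> list:
    """Return the findings that should be flagged to the radiologist."""
    return [name for name, p in finding_probs.items()
            if p >= thresholds.get(name, 0.5)]


# A radiologist who wants fewer alerts raises the nodule threshold; one who
# prefers to rule out false positives keeps it low.
conservative = {**DEFAULT_THRESHOLDS, "lung_nodule": 0.40}

probs = {"lung_nodule": 0.18, "pneumothorax": 0.05, "cardiomegaly": 0.62}
print(alerts(probs))                # ['lung_nodule', 'cardiomegaly']
print(alerts(probs, conservative))  # ['cardiomegaly']
```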

The feedback mechanism

5C Network processes 6,000 X-rays daily through its teleradiology platform. Every scan generates feedback. A radiologist reviews the AI’s findings, approves them, or marks them as incorrect. Missed findings, overcalls, and undercalls all feed back into training.

An annotation team of 100 people processes these corrections. Every three months, each model receives an update based on this validation loop.
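That validation loop can be thought of as structured feedback records piling up between quarterly retraining runs. The schema below is an assumption that simply mirrors the article’s own terms, missed findings, overcalls and undercalls; it is not 5C Network’s data model.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record of one radiologist review feeding the retraining loop.
# Field names mirror the article's terms; the schema itself is an assumption.


@dataclass
class ReviewFeedback:
    scan_id: str
    ai_findings: list                 # what the model reported
    radiologist_findings: list        # what the reviewer confirmed
    reviewed_on: date = field(default_factory=date.today)

    @property
    def overcalls(self) -> set:
        """Findings the AI reported that the radiologist rejected."""
        return set(self.ai_findings) - set(self.radiologist_findings)

    @property
    def missed(self) -> set:
        """Findings the radiologist added that the AI did not report."""
        return set(self.radiologist_findings) - set(self.ai_findings)


# Corrections like these would be batched by the annotation team and folded
# into the next quarterly model update.
fb = ReviewFeedback(
    scan_id="CXR-000123",
    ai_findings=["pleural_effusion", "cardiomegaly"],
    radiologist_findings=["pleural_effusion", "consolidation"],
)
print(fb.overcalls)  # {'cardiomegaly'}
print(fb.missed)     # {'consolidation'}
```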

The company reports accuracy metrics. When the system calls an X-ray normal, it proves correct 99.8 percent of the time. For osteoarthritis of the knee, accuracy reaches 98 to 100 percent. Overall accuracy across all findings sits at 99.7 to 99.8 percent.

BPL Cortex launches with chest X-rays, which account for half of all X-ray studies. Between 40 and 60 percent of chest X-rays show no abnormality. The system handles these normal cases automatically, reducing radiologist workload by roughly 20 percent.

The technology will expand to spine and extremities in coming months. Those models currently run on GPUs; conversion to CPU operation continues.

The business model serves two markets. Hospitals and diagnostic centres that need radiologist reports send scans to 5C Network’s teleradiology service, while clinics that buy BPL machines with embedded AI get immediate triage results without cloud dependency.

The system operates without an internet connection. Cloud access exists for radiologists who want to review cases remotely, but the machine functions independently. Updates arrive periodically as 5C Network refines its models, but the core operation requires nothing beyond the equipment in the room.

AI in healthcare

The development raises questions about AI’s role in healthcare. The technology does not replace radiologists but changes their workflow. Normal cases get filtered automatically. Radiologists focus on cases the AI flags as abnormal or uncertain.

Critics of AI in healthcare point to accuracy concerns and the risk of automation bias, where doctors trust machine readings without sufficient scrutiny. Proponents argue that in settings with severe doctor shortages, immediate AI triage beats waiting days for any human review.

The embedded approach also sidesteps data privacy concerns that cloud-based AI systems face. Scans never leave the machine unless the clinic chooses to send them for formal reporting.

For now, the waiting room problem persists across healthcare, but for X-rays in India, the wait just shortened to less than a minute. A person walks in, gets scanned, and walks out knowing if something requires attention or if they can return home. The machine generates the answer before the patient reaches the door.

(Edited by Sumavarsha)