Published Feb 08, 2026 | 7:00 AM | Updated Feb 08, 2026 | 7:00 AM
Representational image. Credit: iStock
Synopsis: AI-powered “8K” enhancement converts grainy 3D/4D foetal ultrasounds into strikingly photorealistic portraits, revealing eyelashes, lip curves, and finger details, which parents eagerly share online. While the tools aid early detection of abnormalities, they must comply with India’s strict PCPNDT Act, which bans any technology enabling foetal sex determination in order to prevent sex-selective abortion. Approved systems restrict the AI to the face and upper body, but broader diagnostic AI for ultrasound remains prohibited, with regulators prioritising demographic balance over advanced medical benefits.
The images appear on social media feeds like magic tricks. Parents share ultrasound scans transformed into photorealistic portraits of unborn babies. The technology enhances standard 3D and 4D ultrasounds into what vendors call “8K resolution”. Facial features emerge in detail that matches newborn photographs. Eyelashes. Finger creases. The curve of lips.
“This isn’t just an ultrasound, it’s the future,” reads one post circulating on LinkedIn. The enhancement uses AI trained on thousands of foetal scans to fill in gaps, sharpen contours, and render images that once appeared as shadowy blobs into portraits parents frame and share.
“4D ultrasound has been performed for many years,” Dr Mohammed Zeeshan Ali, Consultant Radiologist at Airoc Hospital, told South First. “Now, with the help of AI and 4D technology, clinicians enhance and contour foetal images. This is legal under the Pre-Conception and Pre-Natal Diagnostic Techniques (PCPNDT) Act, 1994. The genital area ‘must not’ be focused on during the scan.”
The technology walks a line between medical advancement and social risk. Better imaging helps detect abnormalities earlier. The same imaging, if misused, could enable the gender identification that India banned decades ago to prevent sex-selective abortion.
India enacted the PCPNDT Act in 1994 after sex ratios collapsed across several states. Families used ultrasound to identify female foetuses and abort them. The practice skewed demographics so severely that villages ended up with twice as many boys as girls.
The law prohibits the determination and disclosure of foetal sex. Radiologists who violate it lose their licences and face jail time. Operating an ultrasound machine without being a registered radiologist is itself an imprisonable offence. The regulations extend to any technology that could enable gender identification.
Dr K Prabhakar Reddy, past president of the Indian Radiology and Imaging Association, describes the objective simply: “If you end up with 10 girls and 20 boys, the entire social structure gets affected.”
The ban works through multiple control points. Clinics need permissions to install ultrasound equipment. Radiologists submit monthly compliance forms. Every scan generates documentation. The system creates a paper trail that regulators can audit.
Violations remain rare in southern states but persist in northern regions and parts of Maharashtra.
“Luckily, we do not see many violations in Andhra Pradesh and this region, including Telangana and most of southern India,” Reddy tells South First. “The bigger problem is in northern regions and parts of Maharashtra. Even though the rules exist, violations still happen.”
Sex ratios reflect enforcement patterns. States with strict monitoring maintain balanced numbers. States where violations continue show distorted demographics that affect marriage markets, crime rates, and social stability.
When Dr Zeeshan performs ultrasounds, he follows protocols designed to prevent misuse. Radiologists measure foetal development by examining bone lengths, organ sizes, and anatomical markers. The femur sits near the genitalia. Images captured for legitimate measurements could incidentally show sex.
“In India, we rarely focus on the genitalia or capture images of it,” Zeeshan explains. “Sometimes, while measuring the femur length, the probe comes close to the genital area. If images are saved, we ensure that no printed images given to patients include the genitalia.”
The practice requires constant attention. A radiologist knows what they see on screen but controls what gets saved, printed, and handed to patients. The 8K enhancement technology adds another layer of concern because it renders everything in such detail.
AI systems learn to sharpen facial features, define bone structure, and enhance tissue contrast. The same algorithms that make a baby’s face visible in stunning detail could theoretically enhance genital features if pointed in that direction.
Vendors building these systems design them to avoid the genital area. The AI focuses on the face, hands, and upper body. Central authorities approved the technology based on these safeguards. But the underlying capability exists in the models.
The PCPNDT Act was written before anyone imagined AI enhancement of medical images. The law assumes human operators make conscious decisions about what to show and what to hide. AI operates differently.
Enhancement algorithms learn patterns from training data. If that data includes anatomical features from thousands of scans, the AI absorbs those patterns. The model contains knowledge about genital development even if it never displays that information in output.
Current 8K enhancement systems avoid this by design. Developers train the AI to focus on facial features and deliberately exclude genital regions from enhancement. The systems work as intended when operated correctly.
But AI companies developing broader radiology tools face a different problem. Western biobanks contain ultrasound data that includes gender identification. Training models on this data creates systems that technically know how to identify foetal sex.
Bharadwaj KSS, founder of Endimension Technology, describes the concern: “AI tools learn from data. Biobanks already contain data that shows how gender can be identified through ultrasonography. So if AI is learning from such data, it may technically know how to identify gender, even if it is not openly stating it.”
Could users extract this information? Could the technology be repurposed? These questions lack clear answers because AI models embed knowledge in ways not fully transparent.
“AI can be trained to provide only specific outputs,” Bharadwaj says. “It is ultimately a computer program. If it is not trained or permitted to give a certain output, it simply will not. You cannot extract something that it has not been designed to reveal.”
This assumes proper implementation, and that developers can completely control what information users might derive from AI outputs. Regulators remain unconvinced.
The PCPNDT Act acknowledges that gender identification sometimes serves medical purposes. Certain congenital anomalies and sexual developmental disorders require knowing foetal sex. Radiologists report this information to gynaecologists through proper channels when medically necessary.
“There are medical situations where knowing the sex is important, such as certain congenital anomalies or sexual developmental disorders,” Reddy says. “In those cases, we are legally and ethically bound to inform the gynaecologist through proper channels. It becomes a medical and medico-legal responsibility.”
A foetus lacking vital organs or showing severe developmental problems needs evaluation that includes sex. Ambiguous genitalia require early identification so interventions can be planned. The law permits disclosure in these cases with documentation.
“If a foetus has severe congenital anomalies, such as absence of vital organs, it is our responsibility to inform the treating doctor so that appropriate decisions can be made,” Reddy explains.
The challenge lies in separating legitimate medical use from social selection. Once information exists in a system, controlling who accesses it and for what purpose becomes difficult. The 8K enhancement makes features so visible that even trained professionals must actively avoid certain areas.
Consanguineous marriages, common in Telangana and Andhra Pradesh, contribute to congenital anomalies. This creates genuine medical need for detailed foetal imaging. The same imaging capability that helps detect these problems could be turned toward gender identification if safeguards fail.
Enforcement creates operational friction. Setting up a clinic with ultrasound equipment requires multiple permissions. Radiologists fill forms every month documenting scans performed, patients treated, and compliance measures followed.
“Many radiologists have argued that the process is very cumbersome,” Reddy notes. “There are forms to be filled and submitted every month. That compliance burden is real.”
Approvals for new equipment can take three to four months. The delays affect clinics trying to expand services or upgrade technology. Some radiologists argue that better monitoring, rather than blanket paperwork requirements, would achieve the same goals with less administrative drag.
“Punishment should be proportional to the gravity of the offence, and administrative processes should be faster,” Reddy says. “Sometimes it takes three to four months just to get approvals.”
Still, the Indian Radiology and Imaging Association supports strict adherence. The organisation views the regulations as necessary protection against social harm, even when compliance creates operational challenges.
“As an organisation, we do not support sex determination in any form,” Reddy states. “We want to help society and work with the government, and we strictly adhere to the law.”
Indian authorities responded to AI uncertainty with prohibition. AI for ultrasound remains banned except for approved enhancement systems that follow strict protocols.
“That law definitely exists in India, and because of it, we do not have any AI solutions for ultrasound today,” Bharadwaj explains, referring to diagnostic AI. “At least from a regulatory perspective, AI for ultrasound is currently not allowed in India.”
His company, Endimension Technology, develops AI for X-rays, CT scans, and MRI but stops short of ultrasound. The company explored the possibility and received an unambiguous answer from regulators: not allowed.
“We did explore this, but at the regulatory level, the answer was very clear,” Bharadwaj says. “It is not allowed at all right now. Maybe in the future, things could change, but India is very sensitive on this issue, and you have to be extremely careful.”
The restriction differs from how human radiologists operate. Doctors performing ultrasounds can see genital features but choose not to disclose them except when medically necessary. The law trusts human judgment while blocking AI assistance.
“It is similar to how doctors operate today,” Bharadwaj notes. “They may technically know certain things, but they are not allowed to disclose them. Similarly, AI might technically be capable of identifying something, but it does not mean it will give that as an output.”
The difference lies in scale and reproducibility. A radiologist makes individual decisions scan by scan. An AI system, once trained, processes thousands of scans using the same learned patterns. If those patterns include gender identification, every scan poses potential risk.
The ban prevents beneficial applications of AI in obstetric ultrasound. Other countries deploy AI systems that improve detection of foetal abnormalities, guide interventions, and assist less experienced operators. Indian patients and doctors lack access to these tools.
AI could help identify heart defects, neural tube problems, and skeletal anomalies earlier and more accurately. The technology could extend specialist-level diagnostic capability to clinics in rural areas where expert sonographers never reach. These benefits remain unavailable because of risks that lie not in the technology itself but in how society might misuse it.
The 8K enhancement systems operate in a narrow space between utility and risk. They make foetal features visible in ways parents value. They help radiologists spot abnormalities. They create images detailed enough that misuse becomes technically possible, even if safeguards prevent it.
Vendors demonstrated the technology at the national radiology conference, showing radiologists what AI can now do with ultrasound data. The response mixed excitement about diagnostic capability with wariness about social implications.
“Definitely, there should be improvements,” Reddy says, referring to the PCPNDT Act. “The most important objective is to ensure that the sex ratio is balanced.”
He suggests better monitoring rather than blanket restrictions might address concerns while permitting beneficial AI applications. Faster administrative processes could reduce compliance burden without compromising enforcement. Punishment proportional to offence gravity could strengthen deterrence.
But changing the law requires political will, and no politician wants responsibility for relaxing restrictions that prevent sex-selective abortion. The current framework stays in place. The ban on diagnostic AI for ultrasound continues.
(Edited by Amit Vasudev)