Most organisations today are willing to embrace artificial intelligence (AI) for cost and operational efficiencies, but those in the healthcare field will need more convincing to follow suit.
Clinicians want technology to get to a level where it is “good enough” before adopting it into their workflow, noted Clement Tan, Singapore country lead for Medow Health AI. The Australia-based healthtech startup aims to streamline medical documentation through the use of AI.
Much of healthcare is centred on the patient, and this extends to how technology will benefit patients, Tan said in a video interview with FutureCIO.
It puts the focus on how technology can be deployed safely and ethically, with patients’ trust and safety in mind, he said.
Hence, accuracy in documentation will determine whether doctors will use the AI tool, he added, pointing to Medow’s flagship product, which it describes as an “AI medical scribe”.
Used during clinical consultations, the AI tool automatically transcribes and captures notes, reports, and referral letters. It is trained to recognise clinical nuances and phrases across 30 different medical specialties, including paediatrics, neurology, sports medicine, urology, psychiatry, and geriatrics.

The specialised model training better enables Medow AI to recognise specialty-specific phrasing and terminology, and improves transcription accuracy, Tan said.
Medical specialists typically would handwrite summarised notes, which often resulted in poor documentation following a consultation, he said. Details would be missing and nuances of conversations with individual patients would not be properly captured.
This ultimately can impact patient care, he noted.
AI accuracy matters even more
And while similar AI-powered tools existed previously, these were seldom accurate enough, he said. Doctors would not want to use a tool that generates subpar documents, only to have to spend extra time editing them.
With advancements in AI and compute resources, AI scribes now are able to produce documentation with higher accuracy, he noted.
This is critical as it determines whether doctors will want to use the tool and are able to use the patient data captured in the documents, Tan said.
And it seems that some in the medical field are willing to tap AI for specific tasks.
A 2024 study led by Singapore’s Nanyang Technological University (NTU) found that 80% of doctors who specialised in gastroenterology accepted and trusted the use of AI-powered tools for diagnosing and assessing colorectal polyps, or benign growths that could develop into cancer.
Another 70% said they trusted AI-assisted applications that could guide endoscopists on whether to remove polyps during colonoscopies, according to the survey, which polled 165 gastroenterologists and gastrointestinal surgeons in Asia-Pacific.
It also found that gastroenterologists with fewer than 10 years of clinical experience perceived AI-powered medical tools as carrying higher risks, compared to their peers with more than 10 years of experience.
NTU noted that the findings highlighted the need to address concerns and provide training to bolster confidence in AI tools.
“By fostering trust and acceptance, the medical community can fully harness the potential of AI to enhance patient care and optimise healthcare outcomes,” the report noted.
Singapore already is looking to help its healthcare sector do so, with its Ministry of Health (MOH) announcing last October that it was investing SG$200 million over five years to support the development and testing of innovations, including AI, in local public healthcare institutions.
MOH said it would identify “proven and impactful” use cases for AI and scale these into “system-wide national projects”.
Amongst the initial projects are the use of generative AI (GenAI) for routine documentation and summarisation of medical records, as well as AI for imaging. Automated record updating will be deployed across Singapore’s public healthcare system before the end of 2025.
Singapore’s national healthtech agency Synapxe in June also launched an AI-powered conversational assistant that it said would be integrated into the national digital healthcare platform, HealthHub.
Called HealthHub AI, the AI assistant offers multilingual health information via text and voice interactions, with personalised profiles so recommendations are tailored to the patient.
Synapxe is working on various AI projects for the local healthcare industry, including a predictive model to automatically identify patients who are likely to be readmitted multiple times a year, as well as Tandem, a GenAI platform that provides a sandbox in which healthcare professionals can develop and test GenAI applications.
Synapxe is a subsidiary under MOH Holdings, a holding company through which Singapore’s health ministry owns corporatised institutions in the public healthcare market.
MOH also plans to drive the use of AI for imaging to facilitate earlier detection and follow-up on clinically significant signs.
Efforts here would include evaluating how AI could improve efficiencies and turnaround time in the reading of breast cancer screening images, according to the ministry.

If determined to be effective, this initiative would be adopted for use in Singapore’s subsidised screening programme from the end of 2025, “with proper workflows and care pathways in place”, MOH said.
It noted that the AimSG platform, through which public hospitals access imaging AI models from various vendors, would facilitate the continuous monitoring of AI models to ensure model accuracy.
Healthcare operates under its own AI governance
In fact, Singapore’s healthcare sector has its own AI governance, which mandates all AI products go through checks for safety, ethics, and security before they are deployed, said Andy Ta, Synapxe’s chief data officer and director of data analytics and AI.
He highlighted the nation’s AI in Healthcare Guidelines (AIHGIe), which provide a framework on how the technology should be adopted in the local healthcare industry.
The guidelines were jointly developed by MOH, Health Sciences Authority (HSA), and Synapxe to “support patient safety and improve trust in the use of AI in healthcare”.
“The application of AI spans multiple areas, including administration, clinical decision support, and research such as drug development,” said MOH. “AI can increase system efficiency and has the potential to improve patient outcomes. However, the widespread use of AI also includes inherent risks and ethical concerns, underscoring the importance of safe and responsible design and use.”
AIHGIe aims to share best practices with AI developers, such as manufacturers, and AI users including hospitals and clinics, and supports HSA’s Regulatory Guidelines for Software as Medical Devices.
There are checks and balances in place to ensure the AI tools Synapxe develops will not cause harm and should be continuously improved, Ta said in an interview with FutureCIO.
Insights generated are backed up by data and the healthtech agency also references AI models and guidelines from other countries to develop its AI tools, he said.
In addition, humans, assisted by AI, are involved in the assessment process, where AI tools are put through a stress test and their responses checked for consistency, he noted.
The AI solutions also are continuously monitored through MLOps (machine learning operations) to ensure they operate within their intended boundaries, he said.
He added that Synapxe's objectives are to provide the capabilities to transform healthcare, so there is better preventive care, healthcare providers’ workloads can be reduced through optimised workflows, and healthcare costs can be contained.
Noting that there are some 90,000 professionals in Singapore's healthcare industry, Ta said Synapxe aims to optimise their work processes with AI and find more ways to use data to make better decisions.
Citizens also should be empowered to use technology with AI to better understand their health and manage their medical conditions, he said.
With recent advancements, AI now can make a difference by providing higher quality recommendations, either through better predictions or by identifying patterns that enable clinicians to perform more effectively, he noted.
However, technology should serve only to enhance and enable, and should never be the decision maker, he stressed.
Recognise limits of AI and need for personal responsibility
Medow Health, too, adopts the necessary safeguards to ensure its products are built ethically and responsibly, including adhering to regulatory requirements, Tan said. This means training its AI platform on the “right” materials and feeding it data inputs from different specialties and markets.
“We shape our roadmaps with the mindset that puts patients first, not tech first,” he said, adding that Medow AI has multiple layers of checks to mitigate risks such as hallucinations.
“We also advise doctors that [AI] is not a medical device, so doctors still are responsible for making sure it runs properly,” Tan said.
He further stressed the importance of training, so doctors understand the limits of AI products.
It also is rare for doctors to want AI to replace diagnosis, he said. Rather than replace medical expertise, AI should be tapped to assist doctors in their diagnosis, he added.
What doctors look for is speed and rapid learning, said Tan, when asked about initial hurdles he faced pitching Medow AI to clinics in Singapore.
“And because it is healthcare, the expectation for accuracy is higher,” he noted. “Especially in Singapore, doctors have high expectations of how the technology should perform and that it needs to really value add.”
And they adopt technology only when it is proven, he said.
The Medow AI platform has been rolled out at 35 private clinics in Singapore since the startup expanded its operations to the Asian market earlier this year. The AI tool is deployed at more than 500 clinics and hospitals across Australia.
Apart from English, Medow’s AI scribe also supports medical transcription in various languages and dialects, including Cantonese, Malay, Mandarin, and Bahasa Indonesia. Other regional languages, such as Tamil, Hindi, and Hokkien, are planned in future updates, according to Tan.
“Our goal is to reflect the way care is actually delivered, in every accent, tongue, and dialect,” he added.
He noted that it can take more effort to support certain languages that lack transcription training models, or for which existing models are not sufficiently accurate and are unable to recognise specialised terminologies.
Asked about the tool’s accuracy rate, Tan said Medow Health does not measure this because doctors edit the documents and are the ones assessing how well the reports reflect the consultation. These also can contain abbreviated terms that doctors themselves create.
He added that if doctors do not voice their thoughts out loud during the consultation, these would not be captured by the Medow AI tool, but this would not be an indication of accuracy.
He noted that doctors spent, on average, about one minute editing the AI-generated documents.
The healthtech startup is looking to add more features that allow doctors to customise their own templates, instead of having to go through Medow Health to do so, Tan said.
“Doctors want to take more control once they’re more familiar with the tool, so we want to give them the ability to make tweaks themselves,” he said, adding that such features are targeted to be released by year-end.