Instead of keeping its new MedGemma AI models locked behind an expensive API, Google is handing these powerful tools directly to medical developers.
The new arrivals, MedGemma 27B Multimodal and MedSigLIP, join Google’s growing collection of open source healthcare AI models. What makes them special is not just their technical capability, but the fact that hospitals, researchers, and developers can download them, modify them, and use them as they see fit.
Google’s AI meets real-world healthcare
The flagship MedGemma 27B model doesn’t just read medical text like its predecessors; it can also “look” at medical images and understand what it is seeing. Whether it’s a chest X-ray, a pathology slide, or months or even years of patient records, it can process all of that information together, much as a clinician would.
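For developers wondering what working with such a model looks like in practice, here is a minimal sketch of prompting a MedGemma-style multimodal model with an image and a question through the Hugging Face transformers library. The model ID, image file, and prompt are illustrative assumptions, not an official recipe from Google.

```python
# Sketch: asking a MedGemma-style multimodal model about a chest X-ray.
# The model ID and file path below are assumptions; check the official
# model card for the exact identifier and prompt format.
from transformers import pipeline
from PIL import Image

pipe = pipeline(
    "image-text-to-text",
    model="google/medgemma-4b-it",  # assumed Hugging Face model ID
    device_map="auto",
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "image": Image.open("chest_xray.png")},
            {"type": "text", "text": "Describe any abnormal findings in this chest X-ray."},
        ],
    }
]

output = pipe(text=messages, max_new_tokens=200)
# The pipeline returns the full chat; the last message is the model's reply.
print(output[0]["generated_text"][-1]["content"])
```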
The performance numbers are impressive. The 27B text model scored 87.7% on MedQA, a standard medical knowledge benchmark. That puts it within striking distance of much larger, more expensive models at roughly a tenth of the running cost. For cash-strapped healthcare systems, that is potentially transformative.
The smaller sibling, MedGemma 4B, may be more modest in size, but it is no slouch. Although small by modern AI standards, it scored 64.4% on the same test, making it one of the best performers in its weight class. More importantly, when US board-certified radiologists reviewed chest X-ray reports the model generated, they judged 81% of them accurate enough to guide actual patient care.
MedSigLIP: A featherweight powerhouse
Alongside these generative AI models, Google has released MedSigLIP. At just 400 million parameters, it is a featherweight compared with today’s AI giants, but it has been trained specifically to understand medical imaging in ways a general-purpose model cannot.
This small powerhouse was fed a diet of chest X-rays, tissue samples, photos of skin conditions, and retinal scans. The result? It can pick out the patterns and features that matter in a medical context while still handling everyday images perfectly well.
MedSigLIP builds a bridge between images and text. Show it a chest X-ray and ask it to find similar cases in a database, and it understands not only visual similarity but also medical significance.
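As a rough illustration of what that image-to-text bridge enables, the sketch below embeds a chest X-ray and a few candidate findings with a SigLIP-style dual encoder and ranks them by cosine similarity, the same mechanism that underpins similar-case search. The model ID, image file, and text labels are assumptions for illustration only.

```python
# Sketch: scoring a chest X-ray against candidate findings with a
# SigLIP-style dual encoder. Model ID and inputs are illustrative assumptions.
import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor

model_id = "google/medsiglip-448"  # assumed ID; check the official model card
model = AutoModel.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

image = Image.open("chest_xray.png")
texts = ["normal chest X-ray", "pleural effusion", "cardiomegaly"]

inputs = processor(text=texts, images=image, padding="max_length", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Image and text embeddings share one space, so cosine similarity ranks how
# well each description matches the image; the same trick powers case retrieval.
image_emb = outputs.image_embeds / outputs.image_embeds.norm(dim=-1, keepdim=True)
text_emb = outputs.text_embeds / outputs.text_embeds.norm(dim=-1, keepdim=True)
scores = (image_emb @ text_emb.T).squeeze(0)

for text, score in zip(texts, scores.tolist()):
    print(f"{score:.3f}  {text}")
```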
Healthcare professionals are putting Google’s AI models to work
The proof of any AI tool lies in whether real experts actually want to use it. Early reports suggest that physicians and healthcare companies are enthusiastic about what these models make possible.
DeepHealth in Massachusetts is testing MedSigLIP for chest X-ray analysis, finding that it helps flag potential issues that might otherwise be overlooked and acts as a safety net for overworked radiologists. Meanwhile, researchers at Chang Gung Memorial Hospital in Taiwan have found that MedGemma works well with medical literature in Traditional Chinese and answers staff questions with high accuracy.
India’s TAP Health highlights MedGemma’s reliability. Unlike general-purpose AI, which can hallucinate medical facts, MedGemma appears to understand when clinical context matters. That is the difference between a chatbot that merely sounds medical and one that actually reasons medically.
Why open source AI models are important in healthcare
Beyond generosity, Google’s decision to open these models up is also strategic. Healthcare has requirements that standard AI services cannot always meet: hospitals need to know that patient data never leaves the facility, research institutions need models that do not suddenly change behavior without warning, and developers need the freedom to fine-tune for very specific medical tasks.
By open-sourcing these models, Google addresses those concerns head-on. Hospitals can run MedGemma on their own servers, modify it to suit their specific needs, and trust that it will behave consistently over time. That stability is invaluable for medical applications where reproducibility matters.
However, Google is careful to stress that these models are not ready to replace doctors. They are tools that require human oversight, clinical correlation, and proper validation before real-world deployment. Outputs need verification, recommendations need review, and decisions still rest with qualified health professionals.
This caution makes sense. Even with impressive benchmark scores, medical AI can still make mistakes, especially with unusual cases and edge scenarios. The models excel at processing information and spotting patterns, but they cannot replace the judgment, experience, and ethical responsibility that human physicians bring.
What’s exciting about this release is not just the capability but the accessibility. Small hospitals that could never afford expensive AI services now have access to cutting-edge technology. Researchers in developing countries can build specialized tools for local health challenges. Medical schools can teach students with AI that actually understands medicine.
The models are designed to run on a single graphics card, with the smaller versions adaptable to mobile devices. That accessibility opens the door to point-of-care AI applications in settings without high-end computing infrastructure.
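To give a sense of what “a single graphics card” can mean in practice, here is a hedged sketch that loads a MedGemma-style text model with 4-bit quantization so it fits in the memory of one GPU. The model ID, quantization settings, and prompt are assumptions, not Google’s deployment guidance.

```python
# Sketch: loading a MedGemma-style text model in 4-bit so it fits on one GPU.
# Model ID and quantization settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "google/medgemma-27b-text-it"  # assumed ID; check the official model card

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,  # requires the bitsandbytes package
    device_map="auto",               # place layers on the available GPU
)

prompt = "List three differential diagnoses for acute chest pain."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```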
As healthcare continues to grapple with staff shortages, rising patient loads, and the need for more efficient workflows, AI tools like Google’s MedGemma could provide much-needed relief, not by replacing human expertise but by amplifying it and making it more accessible where it is needed most.
(Photo by Owen Beard)
See: Tencent improves testing of creative AI models with new benchmarks
Want to learn more about AI and big data from industry leaders? Check out the AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including the Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo.
Check out other upcoming Enterprise Technology events and webinars with TechForge here.