Lightweight medical LLM based on TinyLlama 1.1B, fine-tuned on 14+ medical datasets. Designed for resource-constrained deployment where large models like Mixtral are impractical.
Despite its small size, the model achieves a 13.76% relative improvement over the base TinyLlama across three medical benchmarks (overall average: 27.95% vs. 24.57% for the base model).
Available on Hugging Face.
Licensed under Apache 2.0.
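Below is a minimal inference sketch using the Hugging Face `transformers` library. The repository ID `your-org/tinyllama-medical` is a placeholder, not the actual Hub ID; substitute the real model repository when loading.

```python
# Minimal inference sketch with Hugging Face transformers.
# NOTE: "your-org/tinyllama-medical" is a placeholder repo ID, not the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/tinyllama-medical"  # placeholder; replace with the actual Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 1.1B parameters fit comfortably in fp16
    device_map="auto",          # falls back to CPU if no GPU is available
)

prompt = "Question: What are common symptoms of iron-deficiency anemia?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in fp16 keeps the memory footprint of the 1.1B-parameter model low enough for modest GPUs or CPU-only machines, which is the resource-constrained setting this model targets.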