
MELT-Mixtral-8x7B

Details

Architecture: Mixtral 8x7B Mixture-of-Experts
Parameters: 47B
Base Model: mistralai/Mixtral-8x7B-Instruct-v0.1
Relation: fine-tune
License: apache-2.0

A medical LLM fine-tuned on 14+ medical datasets. It achieves 68.2% combined accuracy across the USMLE, AIIMS, and NEET medical benchmarks, a 4.42% improvement over the base Mixtral model.
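One way to read "accuracy across USMLE, AIIMS, and NEET" is micro-averaged (pooled) accuracy: correct answers over total questions, summed across benchmarks before dividing. A minimal sketch of that arithmetic, assuming that interpretation; the per-benchmark counts below are hypothetical, since the card does not report individual splits:

```python
# Sketch of pooled (micro-averaged) accuracy across several benchmarks.
# The tallies here are HYPOTHETICAL illustrations, not numbers from this card.

def pooled_accuracy(tallies):
    """tallies: dict mapping benchmark name -> (correct, total)."""
    correct = sum(c for c, _ in tallies.values())
    total = sum(t for _, t in tallies.values())
    return correct / total

if __name__ == "__main__":
    example = {
        "USMLE": (700, 1000),  # hypothetical counts
        "AIIMS": (130, 200),
        "NEET": (80, 120),
    }
    print(f"{pooled_accuracy(example):.1%}")
```

Note that pooling weights each benchmark by its question count, which generally differs from averaging the three per-benchmark accuracies.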

Results

Combined accuracy (USMLE/AIIMS/NEET): 68.2%
Improvement over base: +4.42%