Business/Technology

Mistral Small 3 Open-Source AI Model Introduced, Outperforms OpenAI’s GPT-4o Mini

News Mania Desk / Piyal Chatterjee / 31st January 2025

Mistral, a Paris-based AI company, unveiled its Mistral Small 3 AI model on Thursday. The firm, known for its open-source large language models (LLMs), has also released the new model on Hugging Face and several other platforms. Mistral said the model was designed for speed, efficiency, and performance, and claims it can exceed the capabilities of models more than twice its size. According to the company's internal testing, the model outperformed OpenAI’s GPT-4o mini.

In a newsroom update, the French AI company detailed the new model. Mistral Small 3 is a 24-billion-parameter model optimized for low latency. The LLM ships with both a pre-trained and an instruction-tuned checkpoint to accommodate various tasks, and is offered under the Apache 2.0 license for both academic and commercial use. Mistral emphasized that this marks a shift away from the Mistral Research License (MRL), which permits only academic and research-related applications.
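
For readers who want to experiment with the release, the sketch below shows one way to load the instruction-tuned checkpoint with the Hugging Face transformers library in Python. The repository ID, dtype, and generation settings are assumptions for illustration rather than details confirmed in Mistral's announcement; the official model card on Hugging Face is the authoritative reference.

    # Minimal sketch: load Mistral Small 3's instruction-tuned checkpoint from Hugging Face.
    # NOTE: the repository ID below is an assumed name, not confirmed by the announcement.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mistral-Small-24B-Instruct-2501"  # assumed repo ID

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

    # Format a chat prompt with the tokenizer's chat template, then generate a short reply.
    messages = [{"role": "user", "content": "Explain what a 24-billion-parameter language model is."}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output_ids = model.generate(input_ids, max_new_tokens=128)
    print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))

Running a 24-billion-parameter model locally requires a suitably large GPU; smaller setups would typically rely on quantized variants or a hosted API instead.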

The firm said the model was not trained with reinforcement learning (RL) and that its training dataset contains no synthetic data (data generated by other AI models). Based on internal evaluations, the company claims Mistral Small 3 beats GPT-4o mini on latency and also outperforms the OpenAI LLM on key benchmarks such as Massive Multitask Language Understanding (MMLU) Pro and Graduate-Level Google-Proof Q&A (GPQA). The developers added that the model is competitive with the Llama 3.3 70B model despite being roughly three times smaller.
