Mistral AI
13 models available
All Mistral AI Models
Codestral
Codestral is Mistral AI's language model designed specifically for code generation tasks. It is Mistral's first code-focused generative model, released as an open-weight generative AI...
Devstral 2 2512 (Free)
Devstral 2 is a state-of-the-art open-source model by Mistral AI specializing in agentic coding. It is a 123B-parameter dense transformer model supporting a 256K context window.
Mistral 7B Instruct
A high-performing, industry-standard 7.3B-parameter model, optimized for speed and context length. Mistral 7B Instruct has multiple version variants, and this endpoint is intended to be the l...
Mistral 7B Instruct Model
Mistral 7B Instruct is a 7.3 billion parameter language model that has been specifically optimized for instruction-following tasks. It represents the latest version among multiple Mistral 7B variants ...
Mistral AI: Mistral Small
Mistral Small is a 22-billion parameter model serving as a convenient mid-point between smaller and larger Mistral options. It emphasizes reasoning capabilities, code generation, and multilingual supp...
Mistral Devstral 2512 (Free)
Mistral Large
Mistral Large is Mistral AI's flagship offering. The model excels at reasoning, code generation, JSON handling, and chat applications. It is a proprietary model with support for dozens of languages in...
Mistral Medium Model Documentation
A closed-source, medium-sized model from Mistral AI that excels at reasoning, code, JSON, chat, and more. This model performs comparably to other companies' flagship models and represents Mistral's mi...
Mistral Small Creative
Mistral Small Creative is an experimental small model designed for:
Mistral: Devstral 2 2512
**Model ID:** `mistralai/devstral-2512`
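As a sketch of how a model ID like this might be used, the snippet below builds an OpenAI-style chat-completions request payload. The base URL, API-key placeholder, and request fields are illustrative assumptions, not details taken from this listing.

```python
import json

# Hypothetical OpenAI-compatible chat-completions request for Devstral 2.
# The base URL and API key are illustrative placeholders (assumptions),
# not confirmed values from this catalog.
BASE_URL = "https://openrouter.ai/api/v1/chat/completions"  # assumed endpoint
API_KEY = "YOUR_API_KEY"  # placeholder

payload = {
    "model": "mistralai/devstral-2512",  # model ID from the listing above
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a function that reverses a string."},
    ],
    "max_tokens": 512,
}
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Serialize the request body; an HTTP client would POST this to BASE_URL.
body = json.dumps(payload)
print(body[:60])
```

In practice the serialized body would be sent with any HTTP client; the payload shape, not the transport, is the point of the sketch.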
Mistral: Mistral Nemo
A 12-billion parameter model featuring a 128k token context window, developed by Mistral in partnership with NVIDIA. The model supports multiple languages including English, French, German, Spanish, I...
Mistral: Mixtral 8x7B Instruct
Mixtral 8x7B Instruct is a pretrained generative sparse Mixture-of-Experts (MoE) model with 8 experts totaling 47 billion parameters. It has been fine-tuned by Mistral AI specifically for chat and instructi...
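To make the sparse-MoE sizing concrete, the back-of-envelope arithmetic below estimates per-expert and shared parameter counts. The ~46.7B total and ~12.9B active-per-token figures, and the 2-of-8 routing, are commonly cited Mixtral numbers used here as assumptions for the sketch.

```python
# Rough sparse-MoE sizing for Mixtral 8x7B (all inputs are assumptions):
# ~46.7B total parameters, ~12.9B active per token, 8 experts, 2 routed.
total_params = 46.7e9
active_params = 12.9e9
n_experts = 8
experts_per_token = 2

# Model the parameter count as: shared + k * expert, where k is how many
# experts contribute. Solving the two equations
#   shared + 8 * expert = total
#   shared + 2 * expert = active
expert_params = (total_params - active_params) / (n_experts - experts_per_token)
shared_params = total_params - n_experts * expert_params

print(f"per-expert parameters: {expert_params / 1e9:.2f}B")
print(f"shared parameters:     {shared_params / 1e9:.2f}B")
```

This is why the model runs much cheaper per token than its 47B total suggests: only the shared layers plus two experts are exercised on each forward pass.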
Mistral: Pixtral 12B
The first multi-modal, text+image-to-text model from Mistral AI. Its weights were released via torrent, making it openly available for research and commercial use.
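For a text+image-to-text model like this, a request typically interleaves text and image content parts in a single message. The sketch below uses the OpenAI-style `image_url` content-part convention and an assumed model ID; both are illustrative assumptions, not confirmed Pixtral API details.

```python
import json

# Hypothetical multi-modal chat message for a text+image model such as
# Pixtral 12B. The content-part schema follows the OpenAI-style convention
# (an assumption; consult the provider's actual docs for the real format).
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is shown in this image?"},
        {
            "type": "image_url",
            # Placeholder URL; a real request would point at an accessible image.
            "image_url": {"url": "https://example.com/photo.png"},
        },
    ],
}

payload = {
    "model": "mistralai/pixtral-12b",  # assumed model ID
    "messages": [message],
}
print(json.dumps(payload, indent=2)[:50])
```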