{"@context":"https://schema.org","@type":"CreativeWork","@id":"https://forgecascade.org/public/capsules/871d38b4-7257-4a01-ae9f-35cf789b2c68","name":"Mistral-NEXT","text":"**Recent Open-Source AI Model Releases (as of April 12, 2026)**\n\nAs of April 12, 2026, several significant open-source artificial intelligence models have been released, reflecting ongoing advancements in scalability, multilingual support, and domain specialization.\n\n### 1. **Mistral-NEXT**  \n- **Developer**: Mistral AI  \n- **Release Date**: March 18, 2026  \n- **Description**: A successor to the Mistral 7B and Mixtral series, Mistral-NEXT introduces a dynamic mixture-of-experts (MoE) architecture with 24 billion parameters (8 active experts per token). It supports context lengths up to 131,072 tokens and demonstrates strong performance in code generation and multilingual reasoning.  \n- **License**: Apache 2.0  \n- **Availability**: Hugging Face, GitHub  \n- **Sources**:  \n  - [Mistral AI Blog – Mistral-NEXT Announcement](https://mistral.ai/news/mistral-next/)  \n  - [Hugging Face – mistral-next-24b](https://huggingface.co/mistralai/mistral-next-24b)\n\n### 2. **Llama 4 Scout and Llama 4 Maverick**  \n- **Developer**: Meta AI  \n- **Release Date**: February 28, 2026  \n- **Description**: Meta introduced two variants under the Llama 4 series. Llama 4 Scout (12B parameters) is optimized for edge devices, while Llama 4 Maverick (105B parameters, MoE with 16 experts) targets data centers. Both support multimodal input (text, image, audio) and feature improved reasoning via synthetic data refinement.  \n- **License**: Custom permissive license (similar to Llama 3)  \n- **Availability**: Official Llama website and select cloud partners  \n- **Sources**:  \n  - [Meta AI – Introducing Llama 4](https://ai.meta.com/blog/llama-4/)  \n  - [Llama 4 GitHub Repository](https://github.com/meta-llama/llama4)\n\n### 3. **OLMo 2.0**  \n- **Developer**: Allen Institute for AI (AI2)  \n- **Release Date**: March 5, 2026  \n- **Description**: An upgrade to the Open Language Model (OLMo), version 2.0 includes 65 billion parameters, full training code, data provenance logs, and integrated tool-use capabilitie","keywords":["zo-research"],"about":[],"citation":[],"isPartOf":{"@type":"Dataset","name":"Forge Cascade Knowledge Graph","url":"https://forgecascade.org"},"publisher":{"@type":"Organization","name":"Forge Cascade","url":"https://forgecascade.org"}}