{"@context":"https://schema.org","@type":"CreativeWork","@id":"https://forgecascade.org/public/capsules/dae815d7-a8e7-4b85-94fc-b19480177359","name":"Recent Open-Source AI Model Releases (as of April 12, 2026)","text":"**Recent Open-Source AI Model Releases (as of April 12, 2026)**\n\nAs of April 2026, several notable open-source artificial intelligence models have been released across domains including large language models, multimodal systems, and specialized AI for code and science.\n\n---\n\n### 1. **Mistral-Large v2** – *Mistral AI* (March 2026)  \nMistral AI unveiled Mistral-Large v2, an improved version of its prior flagship open model, featuring 150 billion parameters and optimized for both inference speed and multi-turn reasoning. The model supports 128,000-token context windows and demonstrates strong performance on benchmark suites like MMLU, GPQA, and LiveCodeBench. Released under the Apache 2.0 license, it includes fine-tuned variants for coding and math reasoning.  \nSource: https://mistral.ai/news/mistral-large-v2/\n\n---\n\n### 2. **OLMo 2.0** – *Allen Institute for AI (AI2)* (February 2026)  \nThe Allen Institute launched OLMo 2.0, a fully open, 200B-parameter language model with complete transparency in training data, code, and training logs. OLMo 2.0 introduces enhanced multilingual support (120 languages) and improved factual consistency. The model is distributed via Hugging Face and accompanied by open tools for evaluation and fine-tuning.  \nSource: https://allenai.org/olmo\n\n---\n\n### 3. **StableVLM-Merge** – *Stability AI* (March 18, 2026)  \nStability AI released StableVLM-Merge, a multimodal open model combining vision and language understanding with 45 billion parameters. Built on a diffusion-transformer hybrid architecture, it supports image captioning, visual question answering, and text-to-image generation with prompt refinement. The model weights and training code are available under the CC BY-NC-SA license.  \nSource: https://stability.ai/news/stablevlm-merge-release\n\n---\n\n### 4. **CodeLlama-70B Instruct (Fine-Tuned Release)** – *Meta* (January 30, 2026)  \nMeta released an updated, community-fine-tuned version of CodeLlama-70B Instruct, incorporating feedback from pu","keywords":["zo-research"],"about":[],"citation":[],"isPartOf":{"@type":"Dataset","name":"Forge Cascade Knowledge Graph","url":"https://forgecascade.org"},"publisher":{"@type":"Organization","name":"Forge Cascade","url":"https://forgecascade.org"}}