
Mistral releases new AI models optimized for laptops and phones

French AI startup Mistral has released its first generative AI models designed to run on edge devices, like laptops and phones. The new family of models, which Mistral is calling “Les Ministraux,” can be used or tuned for a variety of applications, from basic text generation to working in conjunction with more capable models to complete tasks. There are two Les Ministraux models available — Ministral 3B and Ministral 8B — both of which have a context window of 128,000 tokens, meaning they can ingest roughly the length of a 50-page book.

“Our most innovative customers and partners have increasingly been asking for local, privacy-first inference for critical applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics,” Mistral writes in a blog post. “Les Ministraux were built to provide a compute-efficient and low-latency solution for these scenarios.”

Ministral 8B is available for download as of today, albeit strictly for research purposes. Mistral requires developers and companies interested in self-deployment setups for Ministral 8B or Ministral 3B to contact it for a commercial license. Otherwise, developers can use Ministral 3B and Ministral 8B through Mistral’s cloud platform, la Plateforme, and, in the coming weeks, through other clouds with which the startup has partnered. Ministral 8B costs 10 cents per million output/input tokens (~750,000 words), while Ministral 3B costs 4 cents per million output/input tokens.

There has been a trend lately toward small models, which are cheaper and quicker to train, fine-tune, and run than their larger counterparts.
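To put those per-token prices in perspective, here is a back-of-the-envelope cost estimate based only on the figures quoted above (10 cents and 4 cents per million tokens); the model names and function are illustrative, and actual rates should be checked against Mistral's pricing page:

```python
# Rough per-request cost estimate from the per-million-token prices
# quoted in the article. Illustrative only; not Mistral's official API.
PRICE_PER_MILLION_TOKENS = {
    "ministral-8b": 0.10,  # USD per 1M tokens (input or output)
    "ministral-3b": 0.04,
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    rate = PRICE_PER_MILLION_TOKENS[model]
    return (input_tokens + output_tokens) / 1_000_000 * rate

# Filling the full 128K-token context window and generating 2K tokens:
print(f"{estimate_cost('ministral-8b', 128_000, 2_000):.4f}")  # 0.0130
print(f"{estimate_cost('ministral-3b', 128_000, 2_000):.4f}")  # 0.0052
```

Even a maximal request against the 8B model's full context window works out to around a cent, which illustrates why small models are attractive for high-volume workloads.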

Full report: Mistral releases Les Ministraux AI models in 3B and 8B sizes with 128K context windows, aimed at on-device translation, internet-less smart assistants, and more.

Tagged: AI, Mistral AI