Mistral AI Floats New Mixtral 8x22B Generative AI Model as Free 281GB Download
Generative AI startup Mistral AI dropped its newest large language model as a 281GB file accessed through a magnet link posted on X without any explanation or comment. Yet the Mixtral 8x22B model, which scales each of its predecessor's eight experts from 7 billion to 22 billion parameters, may leap ahead of that earlier social media-released model, Mixtral 8x7B, a notable achievement for an open-weight LLM available for free.
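For readers who haven't grabbed a model this way before, a magnet link is just a torrent identifier, so any BitTorrent client can fetch the weights. Here is a minimal sketch using the libtorrent Python bindings; the magnet URI below is a placeholder, not the actual link Mistral posted on X.

```python
# Minimal sketch: fetching a torrent release via the libtorrent
# Python bindings. The magnet URI is a hypothetical placeholder.
import time
import libtorrent as lt

MAGNET_URI = "magnet:?xt=urn:btih:<placeholder-infohash>"  # hypothetical, not Mistral's link

params = lt.parse_magnet_uri(MAGNET_URI)
params.save_path = "./mixtral-8x22b"  # where the 281GB payload lands

session = lt.session()
handle = session.add_torrent(params)

# Poll until the download completes and the client starts seeding.
while not handle.status().is_seeding:
    status = handle.status()
    print(f"{status.progress * 100:.1f}% complete")
    time.sleep(30)
print("download finished")
```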
Mixtral 8x22B
The Mixtral 8x22B model is distinguished from the older model by sheer size. Because the sparse mixture-of-experts design shares its attention layers across experts, the total comes to roughly 141 billion parameters rather than a naive 8 x 22B = 176 billion, paired with a context window of about 65,000 tokens. Parameters are the internal variables a model uses to absorb input and compose responses, while the context window caps how much data it can process and remember at once.
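One hedged way to sanity-check that parameter count against the release itself: at 16-bit precision each weight takes two bytes, so roughly 141 billion parameters works out to about 282GB, in line with the 281GB torrent. A minimal sketch of the arithmetic, with the parameter count and precision both treated as assumptions:

```python
# Back-of-the-envelope size check; both figures below are assumptions,
# not numbers Mistral published alongside the release.
TOTAL_PARAMS = 141e9    # assumed total parameter count
BYTES_PER_PARAM = 2     # assumed 16-bit (bf16/fp16) weights

size_gb = TOTAL_PARAMS * BYTES_PER_PARAM / 1e9
print(f"estimated weights file: {size_gb:.0f} GB")  # ~282 GB, near the 281GB torrent
```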
The release comes during a hectic time for new, powerful LLMs. OpenAI's new GPT-4 Turbo with Vision and Google's Gemini 1.5 Pro are both now available to developers, with Meta all but promising that Llama 3 is mere weeks away. The frontier of frontier models is slightly overpopulated at the moment.
The approach to releasing the new model certainly fits the French startup's previous releases. Mixtral 8x7B debuted the same way, via a magnet link on X, while word that the upcoming Mistral Next had entered testing came from a single message on Discord. The understated strategy hasn't hurt the startup in landing clients and financing. Mistral emerged from stealth with a $113 million seed round last June, followed by a $415 million Series A funding round led by Andreessen Horowitz in December. And the traction goes beyond the models alone: web browser Brave made Mixtral 8x7B the default model for its generative AI assistant Leo.
Mistral Next Generative AI Model Quietly Starts Testing on Discord
Brave Installs Mistral’s Mixtral as Default LLM for Generative AI Browser Assistant Leo
Mistral AI Raises $415M and Releases New LLM as Free Torrent