Mistral, the company sometimes considered Europe’s great hope for AI, is releasing several updates to its AI assistant, Le Chat. In addition to a major web interface upgrade, the company is releasing a mobile app on iOS and Android.
As a reminder, Mistral develops its own large language models. The company’s flagship models, such as Mistral Large and its multimodal model Pixtral Large, are available for commercial use through an API or through cloud partners such as Azure AI Studio, Amazon Bedrock, and Google’s Vertex AI. It also releases a number of open-weight models under the Apache 2.0 license.
Mistral hopes to position itself as a credible alternative to OpenAI or Anthropic. In addition to developing foundation models, its Le Chat assistant competes directly with ChatGPT, Claude, Google Gemini, and Microsoft Copilot.
Mistral is finally releasing a mobile version of the assistant to better compete for a coveted spot on your phone’s home screen. The mobile app features the usual chatbot experience: you can query Mistral’s AI model and ask follow-up questions in a simple conversation-like interface.
Over the last few months, Le Chat has evolved into a competent AI assistant. In November 2024, Mistral added support for web search with citations. It also lets you generate images or interact with a free-form canvas to edit text or code in a separate window.
More recently, the company signed a wide-ranging deal with news agency Agence France-Presse (AFP) to ground results with reliable information sourcing. However, there’s no voice mode in Mistral’s mobile app for people who rely on speech to query AI assistants.
With today’s update to Le Chat, Mistral is also introducing a Pro tier for $14.99 per month, or €14.99 per month in Europe. While the company is no longer detailing the exact AI model it is using under the hood, it says the Pro plan comes with access to the “highest-performing model,” which suggests that free accounts don’t have access to the top-of-the-line model.
Other benefits include higher limits across the board and the ability to opt out of sharing your data with Mistral — the company is bringing the pay-or-consent model to AI assistants.
Up to 1,000 words per second
What makes Mistral stand out? The company doesn’t claim that it has better models than the competition. But it asserts that, from a product standpoint, it can sometimes outperform its competitors.
For instance, Mistral states that Le Chat runs on “the fastest inference engines on the planet” and can answer at up to 1,000 words per second. In our usage, it indeed felt faster than ChatGPT’s 4o model.
Mistral also claims that Le Chat generates much better images than ChatGPT or Grok. The reason Mistral performs well on this front is that it relies on Black Forest Labs’ Flux Ultra, one of the leading image-generation models.
Those are nice-to-have features, but one of the main distinguishing factors comes with Le Chat’s enterprise solution. Mistral lets you deploy Le Chat in your environment with custom models and a custom user interface.
If you work in defense or banking, you may need to be able to deploy an AI assistant on-premises. That is not currently possible with ChatGPT Enterprise or Claude Enterprise. That capability should help Mistral grow its enterprise revenue.
But first, let’s see if Mistral’s mobile app can persuade some AI enthusiasts to try its chatbot. As of this writing, ChatGPT, DeepSeek, and Google Gemini hold the #2, #3, and #6 spots, respectively, among the most downloaded iPhone apps in the U.S.