Amazon Web Services (AWS) continues to add more models to Amazon Bedrock, its managed service for building generative AI applications. The company announced that Mistral Large is now generally available for customers building AI apps. This move follows a similar one made a month ago, when AWS added Mistral 7B and Mixtral 8x7B to Bedrock.
“By bringing Mistral AI models to Amazon Bedrock, customers will have access to the most cutting-edge and advanced generative AI technologies as well as easy access to enterprise-grade tooling and features all in a secure and private environment,” Vasi Philomin, vice president of generative AI, AWS, said in a blog post.
Released in February, Mistral Large is Mistral AI’s cutting-edge text generation model for complex multilingual reasoning tasks, such as text understanding, transformation, and code generation. The model is one of the top-ranked models generally available through an API. It’s natively fluent in English, French, Spanish, German, and Italian; has a 32K-token context window; follows instructions precisely, which lets developers craft their own moderation policies; and is natively capable of function calling.
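For developers who want to try the model, a minimal sketch of calling it through the Bedrock runtime with boto3 might look like the following. The model ID, the `[INST]` prompt wrapping, and the response field names are assumptions based on Bedrock's usual conventions for Mistral models; check the Bedrock console and docs for the exact values in your account and region.

```python
import json


# Assumed model identifier; verify in the Amazon Bedrock console.
MODEL_ID = "mistral.mistral-large-2402-v1:0"


def build_request(prompt: str, max_tokens: int = 512) -> str:
    """Wrap a user prompt in Mistral's instruction format and
    serialize the JSON request body Bedrock expects."""
    return json.dumps({
        "prompt": f"<s>[INST] {prompt} [/INST]",
        "max_tokens": max_tokens,
        "temperature": 0.7,
    })


def invoke(prompt: str, region: str = "us-east-1") -> str:
    """Send the prompt to Mistral Large via the Bedrock runtime and
    return the generated text. Requires AWS credentials with Bedrock
    access configured in the environment."""
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_request(prompt),
    )
    # The response body is a streaming object; read and parse it.
    payload = json.loads(response["body"].read())
    return payload["outputs"][0]["text"]
```

Usage would be as simple as `invoke("Summarize this release note in French.")`, with the region set to any of the regions where the model is offered.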
“Our mission is to make frontier AI ubiquitous, and to achieve this mission, we want to collaborate with the world’s leading cloud provider to distribute our top-tier models,” Mistral AI chief executive Arthur Mensch remarked in a statement. “We have a long and deep relationship with AWS and through strengthening this relationship today, we will be able to provide tailor-made AI to builders around the world.”
All Mistral models are available starting today in the US East (N. Virginia), US West (Oregon), and Europe (Paris) regions. With the addition, Mistral AI joins Bedrock’s other model providers, including Anthropic, AI21 Labs, Cohere, Meta, Stability AI, and Amazon.
However, this is not where the AWS-Mistral partnership ends. Amazon revealed that Mistral AI will use its Trainium and Inferentia silicon chips to build and deploy future foundation models.
And one more thing: Amazon Bedrock is now available in France. Developers in the country can access all supported LLMs and foundation models while knowing that their data remains safe and secure within French borders.