MPT-30B: Raising the bar for open-source foundation models

Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 Tensor Core GPUs.
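
As a minimal sketch of how such a model might be used, the snippet below loads an MPT-style checkpoint with its 8k context window via Hugging Face Transformers. The repo id "mosaicml/mpt-30b" and the max_seq_len override are assumptions based on MosaicML's published checkpoints, not details taken from this announcement.

```python
# Hypothetical usage sketch: loading MPT-30B with an 8k context window.
# The repo id and config override are assumptions, not part of the original post.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

# MPT models ship custom modeling code, so trust_remote_code is required.
config = AutoConfig.from_pretrained("mosaicml/mpt-30b", trust_remote_code=True)
config.max_seq_len = 8192  # match the 8k-token training context length

model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-30b",
    config=config,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-30b")
```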
