MPT-30B: Raising the bar for open-source foundation models
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k-token context length on NVIDIA H100 Tensor Core GPUs.
