A decoder-style transformer pre-trained from scratch on 1T tokens of English text and code. This model was trained by MosaicML.
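This description matches the card for MosaicML's MPT-7B. A minimal sketch of loading it with Hugging Face transformers, assuming the repo id is `mosaicml/mpt-7b` (MPT publishes custom modeling code on the Hub, hence `trust_remote_code=True`):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; the listing's description matches MosaicML's MPT-7B card.
repo_id = "mosaicml/mpt-7b"

# MPT reuses the EleutherAI GPT-NeoX-20B tokenizer.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# The MPT architecture lives in custom code shipped with the checkpoint,
# so loading it through AutoModel requires trust_remote_code.
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("MosaicML trained a decoder-style transformer on", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```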
Cross-referenced across 55 tracked directories
Popularity Rank: #649
Listed In: 1 / 55
Adoption Stage: Emerging
First Seen: 3/13/2026 (recently added to the ecosystem)
Related models in the directory:
A 30-billion-parameter LLaMA-based model made by HuggingFace for chat conversations.
An open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
A 7B-parameter causal decoder-only model built by TII and trained on 1,500B tokens of RefinedWeb enhanced with curated corpora.