A 30B-parameter LLaMA-based model made by HuggingFace for chat conversations.
Cross-referenced across 55 tracked directories
Popularity Rank: #1971
Listed In: 1 / 55
Adoption Stage: Emerging
Listed For: 3d
Recently added to the ecosystem
An open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
A decoder-style transformer pre-trained from scratch on 1T tokens of English text and code. This model was trained by MosaicML.
A 7B-parameter causal decoder-only model built by TII and trained on 1,500B tokens of RefinedWeb enhanced with curated corpora.