A 7B-parameter causal decoder-only model built by TII and trained on 1,500B tokens of RefinedWeb enhanced with curated corpora.
Cross-referenced across 55 tracked directories.

Popularity Rank: #2582
Listed In: 1 / 55
Adoption Stage: Emerging
First Seen: Mar 13, 2026 (recently added to the ecosystem)
A 30B-parameter LLaMA-based model made by HuggingFace for chat conversations.
An open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.