>_Skillful

baidu (@baidu)

5 Published Tools · 0 Total Stars · 0 Weekly Downloads

Published Tools

2 Skills · 5 Agents across 3 categories

文心快码 Baidu Comate

Baidu Comate

Coding mate, Pair you create. Your AI Coding Assistant with Autocomplete & Chat for Java, Go, JS, Python & more

Skill · VS Code Extension
348K/wk · 1 dir

Baidu: ERNIE 4.5 21B A3B Thinking

baidu

ERNIE-4.5-21B-A3B-Thinking is Baidu's upgraded lightweight MoE model, refined to boost reasoning depth and quality for top-tier performance in logical puzzles, math, science, coding, text generation, and expert-level academic benchmarks.

Agent · LLM Model
1 dir
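
For readers browsing these model cards, here is a minimal sketch of how a chat model like this is typically called through an OpenAI-compatible endpoint. The base URL, the model slug, and the API_KEY environment variable are illustrative assumptions, not details taken from this listing.

```python
# Minimal sketch: query a listed chat model via an OpenAI-compatible API.
# The endpoint and the slug "baidu/ernie-4.5-21b-a3b-thinking" are
# assumptions for illustration; substitute whatever your provider documents.
import os
import requests

API_BASE = "https://openrouter.ai/api/v1"  # assumed provider endpoint

resp = requests.post(
    f"{API_BASE}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
    json={
        "model": "baidu/ernie-4.5-21b-a3b-thinking",  # assumed slug
        "messages": [
            {"role": "user", "content": "Prove that sqrt(2) is irrational."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```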

Baidu: ERNIE 4.5 21B A3B

baidu

A sophisticated text-based Mixture-of-Experts (MoE) model featuring 21B total parameters with 3B activated per token, delivering exceptional multimodal understanding and generation through heterogeneous MoE structures and modality-isolated routing. Supporting an extensive 131K token context length, the model achieves efficient inference via multi-expert parallel collaboration and quantization, while advanced post-training techniques including SFT, DPO, and UPO ensure optimized performance across …

Agent · LLM Model
1 dir
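
Several cards here quote two parameter counts (e.g. 21B total, 3B activated per token). A toy top-k routing sketch shows where that gap comes from; every size and weight below is made up for illustration and is not ERNIE's actual router.

```python
# Toy sketch of the "few experts active per token" idea behind the
# 21B-total / 3B-active figures above. Shapes and top_k are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 8, 2

experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router_w                # score every expert
    top = np.argsort(logits)[-top_k:]    # keep only the best k
    gates = np.exp(logits[top])
    gates /= gates.sum()                 # softmax over the chosen experts
    # Only top_k of n_experts weight matrices are touched per token:
    # that is why "active" parameters are far fewer than total parameters.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
print(moe_forward(token).shape)  # (64,)
```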

Baidu: ERNIE 4.5 VL 28B A3B

baidu

A powerful multimodal Mixture-of-Experts chat model featuring 28B total parameters with 3B activated per token, delivering exceptional text and vision understanding through its innovative heterogeneous MoE structure with modality-isolated routing. Built with scaling-efficient infrastructure for high-throughput training and inference, the model leverages advanced post-training techniques including SFT, DPO, and UPO for optimized performance, while supporting an impressive 131K context length and …

Agent · LLM Model
1 dir
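
For the vision-language (VL) variants, requests mix text and image parts. Below is a sketch of the OpenAI-style content-parts payload many gateways accept for vision models; the model slug and image URL are assumptions, and the payload can be POSTed exactly like the earlier chat sketch.

```python
# Sketch of a mixed text+image request in the OpenAI-style
# content-parts format commonly accepted for vision models.
payload = {
    "model": "baidu/ernie-4.5-vl-28b-a3b",  # assumed slug
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/chart.png"},
                },
            ],
        }
    ],
}
```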

Baidu: ERNIE 4.5 VL 424B A47B

baidu

ERNIE-4.5-VL-424B-A47B is a multimodal Mixture-of-Experts (MoE) model from Baidu’s ERNIE 4.5 series, featuring 424B total parameters with 47B active per token. It is trained jointly on text and image data using a heterogeneous MoE architecture and modality-isolated routing to enable high-fidelity cross-modal reasoning, image understanding, and long-context generation (up to 131k tokens). Fine-tuned with techniques like SFT, DPO, UPO, and RLVR, this model supports both “thinking” and non-thinking modes.

Agent · LLM Model
1 dir
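
How the “thinking” and non-thinking modes are switched is provider-specific. The sketch below shows one plausible shape for such a toggle; the "reasoning" field and the model slug are hypothetical examples, not flags documented by this listing.

```python
# Hypothetical sketch: the same request in "thinking" vs non-thinking mode.
# The "reasoning" field is an assumed, provider-specific toggle.
thinking_request = {
    "model": "baidu/ernie-4.5-vl-424b-a47b",  # assumed slug
    "messages": [{"role": "user", "content": "Outline a 3-step proof."}],
    "reasoning": {"enabled": True},  # hypothetical switch
}
non_thinking_request = {**thinking_request, "reasoning": {"enabled": False}}
```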

Baidu: ERNIE 4.5 300B A47B

baidu

ERNIE-4.5-300B-A47B is a 300B parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool parameters…

Agent · LLM Model
1 dir
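
The FP8 and 2-bit formats mentioned above trade precision for memory. A generic low-bit quantization sketch illustrates the core idea (a small set of integer levels plus a scale and offset); it is a textbook illustration, not Baidu's actual scheme.

```python
# Toy 2-bit weight quantization: map floats to 4 integer levels
# plus a per-tensor scale/offset, then reconstruct approximately.
import numpy as np

def quantize_2bit(w):
    """Quantize a weight tensor to 4 levels {0,1,2,3} with a per-tensor scale."""
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / 3.0                       # 2 bits -> 4 levels
    q = np.round((w - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_2bit(q, scale, lo):
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale + lo

w = np.random.default_rng(1).standard_normal((4, 4)).astype(np.float32)
q, scale, lo = quantize_2bit(w)
w_hat = dequantize_2bit(q, scale, lo)
print("max abs error:", np.abs(w - w_hat).max())
```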

san-store

errorrik

Application State Management for San

Skill · AI Tool
1 dir