io.github.egoughnour/massive-context-mcp
Handles 10M+ token contexts with chunking, sub-queries, and local Ollama inference.
@adonisjs/limiter
GitHub Actions
Rate limiting package for the AdonisJS framework
@rajat-rastogi/maestro
rajat-rastogi
Conduct your codebase — autonomous AI coding orchestrator
intelliwaketssveltekitv25
denjpeters
A professional SvelteKit component library with 41+ reusable UI components and utilities. Built with Svelte 5, TypeScript, and Tailwind CSS v4.
Astack
astack-tech
🤖 A composable framework for building AI applications.
n8n-nodes-openwebui
mauroprojetos
Custom n8n node for integrating OpenWebUI as an LLM provider
zradlicz/particle-mcp-server
Facilitates AI-driven management of Particle IoT devices through natural language commands.
langchain-twitter
zbmain
No description available
agentprofiles-cli
lun3lson
Manage configuration profiles for LLM agent tools
@mate_tsaava/pr-review
mate_tsaava
AI-powered code review CLI for Azure DevOps pull requests
llama-index-llms-cohere
Your Name <[email protected]>
llama-index llms cohere integration
@domdhi/claude-code-tts
domdhi
Neural TTS hook system for Claude Code. Reads Claude's responses aloud as they finish.
keystone-cli
mhingston5
A local-first, declarative, agentic workflow orchestrator built on Bun
@macnishio/zoho-mcp-server1
macnishio
Zoho MCP Server for Claude Desktop
N8n Builder
spences10
🪄 MCP server for programmatic creation and management of n8n workflows. Enables AI assistants to build, modify, and manage workflows without direct user intervention, via a comprehensive set of tools and resources for interacting with n8n's REST API.
xiangmy21/iotdb-mcp-server-TreeModel
Facilitates database interaction and business intelligence through IoTDB with SQL query execution capabilities.
llama-index-llms-groq
Your Name <[email protected]>
llama-index llms groq integration
streamline-mcp
derzathon
Streamline: The intelligent model optimizer and execution engine for MCP
mcp-local-llm
himmussel
MCP server that uses a local LLM to respond to queries - Binary distribution
llama-index-llms-together
Your Name <[email protected]>
llama-index llms together integration