n8n Launches Chat Hub for Unified AI Chat and Agent Management
BERLIN — n8n has released Chat Hub, a centralized AI chat interface that lets users interact with multiple large language models, n8n agents and custom-built agents from a single pane of glass. The feature is available now in n8n 2.1.0 beta and is designed to give organizations a managed entry point for all AI-related tasks instead of scattered, unmanaged chat sessions.
According to n8n’s official announcement, Chat Hub addresses a persistent reality in the AI market: while foundation models differ in capability and provider, they all present users with essentially the same chat interface. Rather than forcing teams to juggle separate windows for ChatGPT, Claude, Gemini, Grok, Mistral and others, the new hub consolidates access and adds native integration with n8n’s automation and agent-building capabilities.
Centralized Interface and New User Role
Chat Hub functions as both a multi-model chat client and an operational control plane. Users can query several AI models simultaneously or switch between them, interact directly with n8n-powered agents, and create their own specialized agents using the platform’s workflow tools. The interface is built to feel familiar to anyone who has used consumer AI chat services, while adding enterprise-friendly governance.
A key addition is the “Chat user” role, which allows individuals to access the chat interface and converse with models and agents without gaining permission to edit or view underlying n8n workflows. This separation of concerns is intended to let non-technical staff or external collaborators use AI capabilities safely while keeping automation logic under the control of technical teams.
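The separation the “Chat user” role implies can be sketched as a simple permission check. This is purely illustrative: the role and permission names below (other than “Chat user” itself) are assumptions, not n8n’s actual access-control model.

```python
# Toy permission model illustrating the role separation described above.
# Only "chat_user" comes from the announcement; the "editor" role and the
# specific permission names are hypothetical stand-ins.
PERMISSIONS = {
    "chat_user": {"chat"},  # may converse, but cannot see or edit workflows
    "editor": {"chat", "view_workflows", "edit_workflows"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())
```

Under this sketch, a chat user can converse (`can("chat_user", "chat")`) but any workflow action is denied, which is exactly the governance boundary the new role is meant to draw.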
Documentation published by n8n describes Chat Hub as “a centralized AI chat interface where you can access multiple AI models, interact with n8n agents, and create your own agents.” The company positions the feature as a way to bring order to the proliferation of AI tools inside organizations, replacing ad-hoc usage with a governed, auditable environment.
Technical Availability and Integration
The beta is rolling out with n8n version 2.1.0. Existing self-hosted and cloud users can enable the feature through the current beta channel. Because n8n is an open-source workflow automation platform, Chat Hub inherits the same flexibility that lets users connect hundreds of data sources and services. Agents built inside Chat Hub can therefore trigger complex automations, retrieve live data, or orchestrate multi-step business processes directly from natural-language conversation.
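Conversationally triggered automations of this kind typically reach n8n over a webhook. The sketch below shows what one turn of such an exchange might look like from a client’s perspective; the endpoint URL, payload field names (`sessionId`, `chatInput`) and response shape are assumptions for illustration, not a documented Chat Hub API.

```python
import json
import urllib.request

# Hypothetical endpoint of a chat-triggered n8n workflow; the path and
# field names below are assumptions, not confirmed by n8n documentation.
N8N_CHAT_WEBHOOK = "https://n8n.example.com/webhook/chat-agent"

def build_chat_payload(session_id: str, message: str) -> dict:
    """Build the JSON body for one conversational turn (field names assumed)."""
    return {"sessionId": session_id, "chatInput": message}

def send_chat_message(session_id: str, message: str) -> str:
    """POST one user message to the agent's webhook and return the reply text."""
    body = json.dumps(build_chat_payload(session_id, message)).encode()
    req = urllib.request.Request(
        N8N_CHAT_WEBHOOK,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["output"]
```

Carrying a stable `sessionId` across turns is what lets the workflow on the n8n side keep conversational context while it fetches data or runs multi-step automations.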
The launch comes as enterprises increasingly look for ways to standardize AI adoption. With dozens of frontier models now available through commercial APIs, many companies report fragmented usage patterns that create both security risks and duplicated spending. By providing a single organizational chat surface that routes to whichever model or agent is most appropriate, n8n aims to simplify governance while preserving choice of underlying AI provider.
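The routing idea can be pictured as a dispatcher that inspects a request and picks a backend. The keyword-based sketch below is purely conceptual: the model names and routing rules are invented for illustration and say nothing about how Chat Hub actually selects a model or agent.

```python
# Illustrative only: a toy dispatcher that picks a model by task keywords.
# All model names and keyword rules here are hypothetical.
ROUTES = [
    ({"code", "debug", "refactor"}, "code-model"),
    ({"summarize", "translate"}, "text-model"),
]
DEFAULT_MODEL = "general-model"

def route(prompt: str) -> str:
    """Return the name of the backend model best matching the prompt."""
    words = set(prompt.lower().split())
    for keywords, model in ROUTES:
        if words & keywords:  # any rule keyword appears in the prompt
            return model
    return DEFAULT_MODEL
```

The governance benefit comes from the single entry point: whatever the real selection logic, every request passes through one surface where access, logging and spend can be controlled.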
Impact on Developers, Teams and the Automation Landscape
For developers and automation engineers already using n8n, Chat Hub offers a faster way to expose AI agents to end users without building custom front-end applications. Business users gain a conversational interface that can execute real work through n8n workflows rather than simply returning static answers.
The introduction also reflects broader industry movement toward agentic AI platforms. Rather than treating chat as an isolated productivity tool, n8n is embedding chat inside its automation fabric so that conversations can trigger actions across CRMs, databases, APIs and internal tools.
What’s Next
n8n has not yet published a firm timeline for moving Chat Hub out of beta, but the feature is already documented in the official n8n Docs and available for testing in the 2.1.0 beta release. The company is expected to gather community feedback on performance, multi-model orchestration and role-based access controls before declaring general availability.
Further enhancements, such as expanded model support, improved agent memory, and additional enterprise security features, are likely to appear in subsequent beta iterations and the eventual stable release.
