Agent Chat UI
Agent Chat UI is a Next.js application that provides a chat interface for interacting with any LangGraph server whose state includes a `messages` key.
Note
🎥 Watch the video setup guide here.
Setup
Tip
Don't want to run the app locally? Use the deployed site here: agentchat.vercel.app!
First, either run the npx command or clone the repository:
npx create-agent-chat-app
or
git clone https://github.com/langchain-ai/agent-chat-ui.git
cd agent-chat-ui
Install dependencies:
pnpm install
Run the app:
pnpm dev
The app will be available at http://localhost:3000.
Usage
Once the app is running (or if using the deployed site), you'll be prompted to enter:
- Deployment URL: The URL of the LangGraph server you want to chat with. This can be a production or development URL.
- Assistant/Graph ID: The name of the graph, or the ID of the assistant, to use when fetching and submitting runs via the chat interface.
- LangSmith API Key: (only required for connecting to deployed LangGraph servers) Your LangSmith API key to use when authenticating requests sent to LangGraph servers.
After entering these values, click Continue. You'll then be redirected to a chat interface where you can start chatting with your LangGraph server.
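For example, when connecting to a LangGraph server running locally in development, the values might look like this (per the note above, the API key can be left blank for local servers):
Deployment URL: http://localhost:2024
Assistant/Graph ID: agent
LangSmith API Key: (leave blank for local development)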
Environment Variables
You can bypass the initial setup form by setting the following environment variables:
NEXT_PUBLIC_API_URL=http://localhost:2024
NEXT_PUBLIC_ASSISTANT_ID=agent
To use these variables:
- Copy the `.env.example` file to a new file named `.env`
- Fill in the values in the `.env` file
- Restart the application
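For example, from the repository root:
# Copy the example file, then fill in the values in .env
cp .env.example .env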
When these environment variables are set, the application will use them instead of showing the setup form.
Hiding Messages in the Chat
You can control the visibility of messages within the Agent Chat UI in two main ways:
1. Prevent Live Streaming:
To stop messages from being displayed as they stream from an LLM call, add the `langsmith:nostream` tag to the chat model's configuration. The UI normally uses `on_chat_model_stream` events to render streaming messages; this tag prevents those events from being emitted for the tagged model.
Python Example:
from langchain_anthropic import ChatAnthropic

# Add tags via the .with_config method
# (the model name is illustrative; use any model your server supports)
model = ChatAnthropic(model="claude-3-5-sonnet-latest").with_config(
    config={"tags": ["langsmith:nostream"]}
)
TypeScript Example:
import { ChatAnthropic } from "@langchain/anthropic";

// Add tags via the .withConfig method
// (the model name is illustrative; use any model your server supports)
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-latest" })
  .withConfig({ tags: ["langsmith:nostream"] });
Note: Even if streaming is hidden this way, the message will still appear after the LLM call completes if it's saved to the graph's state without further modification.
2. Hide Messages Permanently:
To ensure a message is never displayed in the chat UI (neither during streaming nor after being saved to state), prefix its `id` field with `do-not-render-` before adding it to the graph's state, and add the `langsmith:do-not-render` tag to the chat model's configuration. The UI explicitly filters out any message whose `id` starts with this prefix.
Python Example:
# Assumes `messages` is the list of messages from the graph state
result = model.invoke(messages)
# Prefix the ID before saving to state so the UI filters the message out
result.id = f"do-not-render-{result.id}"
return {"messages": [result]}
TypeScript Example:
// Assumes `messages` is the list of messages from the graph state
const result = await model.invoke(messages);
// Prefix the ID before saving to state so the UI filters the message out
result.id = `do-not-render-${result.id}`;
return { messages: [result] };
This approach guarantees the message remains completely hidden from the user interface.
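Putting the two techniques together, the sketch below shows a minimal LangGraph graph (Python) with a node that both suppresses streaming and hides the saved message. The state shape, node name, and model name are illustrative assumptions, not part of the Agent Chat UI API:
from typing import Annotated
from typing_extensions import TypedDict

from langchain_anthropic import ChatAnthropic
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


# Tag the model so the UI neither streams nor renders its output
model = ChatAnthropic(model="claude-3-5-sonnet-latest").with_config(
    config={"tags": ["langsmith:nostream", "langsmith:do-not-render"]}
)


def hidden_step(state: State) -> dict:
    result = model.invoke(state["messages"])
    # Prefix the ID so the saved message is filtered out of the UI
    result.id = f"do-not-render-{result.id}"
    return {"messages": [result]}


graph = (
    StateGraph(State)
    .add_node("hidden_step", hidden_step)
    .add_edge(START, "hidden_step")
    .add_edge("hidden_step", END)
    .compile()
)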