
Agent Chat UI

Agent Chat UI is a Vite + React application that lets you chat with any LangGraph server exposing a messages key through a simple chat interface.

Note

🎥 Watch the video setup guide here.

Setup

Tip

Don't want to run the app locally? Use the deployed site here: agentchat.vercel.app!

First, run the npx command, or clone the repository manually:

npx create-agent-chat-app

or

git clone https://github.com/langchain-ai/agent-chat-ui.git

cd agent-chat-ui

Install dependencies:

pnpm install

Run the app:

pnpm dev

The app will be available at http://localhost:5173.

Usage

Once the app is running (or if using the deployed site), you'll be prompted to enter:

  • Deployment URL: The URL of the LangGraph server you want to chat with. This can be a production or development URL.
  • Assistant/Graph ID: The name of the graph, or the ID of the assistant, to use when fetching and submitting runs via the chat interface.
  • LangSmith API Key: (only required when connecting to a deployed LangGraph server) Your LangSmith API key, used to authenticate requests sent to the server.

After entering these values, click Continue. You'll then be redirected to a chat interface where you can start chatting with your LangGraph server.

Hiding Messages in the Chat

You can control the visibility of messages within the Agent Chat UI in two main ways:

1. Prevent Live Streaming:

To stop messages from being displayed as they stream from an LLM call, add the langsmith:nostream tag to the chat model's configuration. The UI normally uses on_chat_model_stream events to render streaming messages; this tag prevents those events from being emitted for the tagged model.

Python Example:

from langchain_anthropic import ChatAnthropic

# Add tags via the .with_config method
# (the model name below is illustrative; use whichever model you need)
model = ChatAnthropic(model="claude-3-5-sonnet-latest").with_config(
    config={"tags": ["langsmith:nostream"]}
)

TypeScript Example:

import { ChatAnthropic } from "@langchain/anthropic";

// (the model name below is illustrative; use whichever model you need)
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-latest" })
  // Add tags via the .withConfig method
  .withConfig({ tags: ["langsmith:nostream"] });

Note: Even if streaming is hidden this way, the message will still appear after the LLM call completes if it's saved to the graph's state without further modification.
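If you want such a model call's output to stay out of the chat entirely, one option is to write it to a separate state key rather than appending it to messages. A minimal sketch (the State shape and the summary key are hypothetical, not part of the Agent Chat UI):

```typescript
// Sketch: keep a hidden model call's output out of the rendered chat by
// storing it under a separate state key instead of appending it to messages.
interface State {
  messages: unknown[];
  summary?: string; // hypothetical key, not read by the chat UI
}

function recordSummary(state: State, modelOutput: { content: string }): Partial<State> {
  // Only `summary` is updated; `messages` is left untouched, so nothing new
  // appears in the chat after the LLM call completes.
  return { summary: modelOutput.content };
}
```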

2. Hide Messages Permanently:

To ensure a message is never displayed in the chat UI (neither during streaming nor after being saved to state), prefix its id field with do-not-render- before adding it to the graph's state, and add the langsmith:do-not-render tag to the chat model's configuration. The UI explicitly filters out any message whose id starts with this prefix.

Python Example:

# messages: the list of messages from the graph state
result = model.invoke(messages)
# Prefix the ID before saving to state
result.id = f"do-not-render-{result.id}"
return {"messages": [result]}

TypeScript Example:

// messages: the list of messages from the graph state
const result = await model.invoke(messages);
// Prefix the ID before saving to state
result.id = `do-not-render-${result.id}`;
return { messages: [result] };

This approach guarantees the message remains completely hidden from the user interface.
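Conceptually, the UI-side filtering behaves like the sketch below (illustrative only, not the actual Agent Chat UI source): any message whose id starts with the reserved prefix is dropped before rendering.

```typescript
// Messages whose id starts with the reserved prefix are never rendered.
const HIDDEN_PREFIX = "do-not-render-";

interface ChatMessage {
  id?: string;
  content: string;
}

function visibleMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.filter((m) => !m.id?.startsWith(HIDDEN_PREFIX));
}
```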
