Why optimistic updates matter
Without optimistic updates, the typical flow is:
- User types a message and clicks send
- The message is sent to the server
- The UI waits for the server to acknowledge
- Only then does the message appear in the chat
With optimistic updates, the flow becomes:
- User types a message and clicks send
- The message immediately appears in the UI
- The server processes the request in the background
- When the server responds, the optimistic state is silently replaced with the real state
Using optimistic values with useStream
The useStream hook supports optimistic updates through the optimisticValues option on the submit method. You pass a function that receives the previous state and returns the optimistic state to display immediately.
Define a TypeScript interface matching your agent’s state schema and pass it as a type parameter to useStream for type-safe access to state values. In the examples below, replace typeof myAgent with your interface name.
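A minimal sketch of the pattern. The Message and ChatState types here are illustrative stand-ins; real code would import Message from "@langchain/langgraph-sdk" and useStream from "@langchain/langgraph-sdk/react":

```typescript
// Stand-in types for illustration; replace with your agent's actual state schema.
type Message = { id: string; type: "human" | "ai"; content: string };
interface ChatState {
  messages: Message[];
}

// The function passed as optimisticValues: predict the post-submit state.
function withOptimisticMessage(
  prev: ChatState | undefined,
  newMessage: Message,
): ChatState {
  return {
    ...prev,
    messages: [...(prev?.messages ?? []), newMessage],
  };
}

// Inside a component, the call would look roughly like:
// const stream = useStream<ChatState>({ apiUrl: "...", assistantId: "agent" });
// stream.submit(
//   { messages: [newMessage] },
//   { optimisticValues: (prev) => withOptimisticMessage(prev, newMessage) },
// );
```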
How optimistic values work
The optimisticValues function follows a simple lifecycle:
- Predict—the optimisticValues function receives the current state (prev) and returns what the state should look like after the submission
- Display—useStream immediately uses the optimistic state for rendering, so stream.messages includes the new message right away
- Reconcile—when the server sends back the first real state update, the optimistic values are discarded and replaced with the server’s state
The optimistic state is only used until the first server event arrives. After
that, all rendering is driven by the server’s actual state. This means any
discrepancy between your optimistic prediction and the server’s state is
automatically corrected.
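The lifecycle above can be sketched as a pure state-selection rule. The types and the stateToRender helper here are illustrative, not part of the SDK:

```typescript
type Message = { id: string; content: string };
interface ChatState { messages: Message[] }

// Predict: the optimisticValues callback returns the expected next state.
const optimisticValues = (prev: ChatState | undefined): ChatState => ({
  ...prev,
  messages: [...(prev?.messages ?? []), { id: "temp-1", content: "Hi!" }],
});

// Reconcile rule: the optimistic state is used only until the first real
// server update arrives; after that the server's state wins unconditionally.
function stateToRender(
  serverState: ChatState | undefined,
  optimistic: ChatState,
): ChatState {
  return serverState ?? optimistic;
}

const optimistic = optimisticValues({ messages: [] });
// Before the server responds: render the prediction.
const beforeAck = stateToRender(undefined, optimistic);
// After the server responds: render the server's state, discarding the prediction.
const afterAck = stateToRender({ messages: [{ id: "srv-1", content: "Hi!" }] }, optimistic);
```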
The optimisticValues function
The function signature is:
- prev is the current graph state (or undefined if there is no state yet, e.g., the first message in a new thread)
- Returns the predicted next state
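In TypeScript terms, a sketch of that shape. StateType is a placeholder for your own state interface, and OptimisticValues is an illustrative alias, not an SDK export:

```typescript
// Placeholder state type for illustration.
interface StateType { messages: { id: string; content: string }[] }

// optimisticValues receives the previous state (possibly undefined)
// and returns the predicted next state.
type OptimisticValues<State> = (prev: State | undefined) => State;

const example: OptimisticValues<StateType> = (prev) => ({
  ...prev,
  messages: [...(prev?.messages ?? []), { id: "tmp", content: "hello" }],
});
```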
Handling first messages
When starting a new thread, prev will be undefined. Always handle this case. The prev?.messages ?? [] pattern ensures the code works regardless of whether there are existing messages.
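A sketch of this pattern, with stand-in types in place of your real state schema:

```typescript
type Message = { id: string; type: "human" | "ai"; content: string };
interface ChatState { messages: Message[] }

const newMessage: Message = { id: "temp-1", type: "human", content: "Hello!" };

// Works for both a brand-new thread (prev === undefined) and an existing one.
const optimisticValues = (prev: ChatState | undefined): ChatState => ({
  ...prev,
  messages: [...(prev?.messages ?? []), newMessage],
});
```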
Combining with message editing
Optimistic updates pair naturally with message editing. When a user edits a previous message, you can optimistically show the conversation truncated to that point with the updated message.
Optimistic state for custom graph values
If your graph state includes custom values beyond messages, you can optimistically update those too.
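For example, assuming a hypothetical state with a pendingCount field alongside messages (both types are illustrative, not from the SDK):

```typescript
type Message = { id: string; type: "human" | "ai"; content: string };

// Hypothetical custom state: messages plus an extra counter field.
interface AgentState {
  messages: Message[];
  pendingCount: number;
}

const newMessage: Message = { id: "temp-2", type: "human", content: "Ship it" };

// Optimistically append the message and bump the custom counter.
const optimisticValues = (prev: AgentState | undefined): AgentState => ({
  ...prev,
  messages: [...(prev?.messages ?? []), newMessage],
  pendingCount: (prev?.pendingCount ?? 0) + 1,
});
```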
Visual indicators for optimistic messages
You may want to visually distinguish optimistic messages from confirmed ones. A common approach is to show a subtle “sending…” indicator.
Detecting whether a message is optimistic depends on your implementation. One approach is to compare the current stream.messages against a known “confirmed” set. However, in most cases the reconciliation happens so quickly (under a second) that visual differentiation is unnecessary.
Error handling and rollback
If the server request fails, useStream reverts the optimistic state and the message disappears from the UI. You should handle this gracefully. The onError callback fires when the server rejects the submission or the connection fails. Since the optimistic state is automatically rolled back, you just need to inform the user.
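A sketch of surfacing the failure to the user via onError. Real code would pass these options to useStream from "@langchain/langgraph-sdk/react"; the StreamOptions type and notify callback here are stand-ins for illustration:

```typescript
// Illustrative shape of the options object accepted by useStream.
type StreamOptions = {
  apiUrl: string;
  assistantId: string;
  onError?: (error: unknown) => void;
};

function buildOptions(notify: (msg: string) => void): StreamOptions {
  return {
    apiUrl: "http://localhost:2024", // assumed local dev server
    assistantId: "agent",
    onError: (error) => {
      // The optimistic state has already been rolled back; just tell the user.
      notify("Message failed to send. Please try again.");
      console.error(error);
    },
  };
}
```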
Best practices
Follow these guidelines for smooth optimistic updates:
- Generate stable IDs client-side—use UUIDs for optimistic messages so the server can match and reconcile them correctly.
- Keep the optimistic function pure—no side effects, no async operations. It should be a simple state transformation.
- Handle undefined prev state—always account for the case where there is no existing state (new threads).
- Don’t over-predict—only optimistically update values you can reliably predict. Don’t try to guess the AI’s response.
- Combine with loading indicators—while the user’s message appears instantly, show a typing indicator or spinner for the upcoming AI response.
- Test with slow connections—throttle your network to verify the optimistic UX feels right when reconciliation takes longer than usual.

