Integration with langserve / RemoteRunnable? #737
-
Hi all, I've switched to using the new langgraph agents, which I have found to be much better than before, so I would like to stick with that.

```python
memory = SqliteSaver.from_conn_string(":memory:")
agent_chain = builder.compile(checkpointer=memory)
add_routes(app, agent_chain, path="/agent")
```

But how do I pass something like

```
{
  "configurable": {
    "thread_id": thread_id,
    "user_id": user_id,
  }
}
```

when I call stream from the RemoteRunnable client? The stream method doesn't seem to support passing in a config:

```js
import { RemoteRunnable } from "@langchain/core/runnables/remote";

const remoteChain = new RemoteRunnable({
  url: "http://127.0.0.1:8100/agent",
});
const stream = await remoteChain.stream({
  messages: {"user": "im looking for a car"}
}, {
  "configurable": {
    "thread_id": '5678',
    "user_id": '1234',
  }
});
```

but I get

```
INFO: 127.0.0.1:37426 - "POST /agent/stream HTTP/1.1" 422 Unprocessable Entity
```

Should I switch to using the LangGraph SDK instead?

```js
import { Client } from "@langchain/langgraph-sdk";

const assistant = await client.assistants.create({
  graphId: "agent",
  config: { configurable: { model_name: "openai" } },
});
```

In which case I have a whole lot more questions about how to use that, like: what's an assistant? Are there any docs for this?
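For readers hitting the same 422: a minimal server-side sketch of one possible fix, assuming the LangServe setup above. langserve's `add_routes` accepts a `config_keys` whitelist and a `per_req_config_modifier` callback; whether these resolve this particular 422 is not confirmed in the thread, and the `x-user-id` header below is purely illustrative.

```python
from fastapi import FastAPI, Request
from langserve import add_routes
from langgraph.checkpoint.sqlite import SqliteSaver

app = FastAPI()

# `builder` is assumed to be the StateGraph builder from the original post.
memory = SqliteSaver.from_conn_string(":memory:")
agent_chain = builder.compile(checkpointer=memory)


def modify_config_per_request(config: dict, request: Request) -> dict:
    # Illustrative only: derive a per-user value on the server instead of
    # trusting the client, e.g. from a hypothetical "x-user-id" header.
    config.setdefault("configurable", {})
    user_id = request.headers.get("x-user-id")
    if user_id:
        config["configurable"]["user_id"] = user_id
    return config


add_routes(
    app,
    agent_chain,
    path="/agent",
    # Tell LangServe which config fields clients may send in the request body;
    # a client-supplied "configurable" that the route does not accept is one
    # common reason for a 422 on /agent/stream.
    config_keys=["configurable"],
    per_req_config_modifier=modify_config_per_request,
)
```

On the client side, the JS `RemoteRunnable` call in the question already passes the config as the second argument to `stream`, which is the expected shape; the 422 suggests the server rejected the request rather than the client failing to send it.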
Replies: 2 comments 1 reply
-
Hi @darthShana! We're starting to add docs on the latter option here: https://langchain-ai.github.io/langgraph/cloud/quick_start/
We should have JS SDK docs for you soon as well. Thank you for your patience!
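To make that "latter option" more concrete, here is a rough sketch of the assistants → threads → runs flow described in the quickstart linked above, using the Python SDK (`langgraph_sdk`); the deployment URL is a placeholder and exact parameter names may vary by SDK version. An assistant is essentially a deployed graph paired with a configuration to run it with, and a thread takes over the role of the manual `thread_id` checkpointing from the LangServe setup.

```python
from langgraph_sdk import get_client

# Placeholder URL for a locally running LangGraph deployment.
client = get_client(url="http://localhost:8123")

# An assistant = a graph (by id) plus the config it should run with.
assistant = await client.assistants.create(
    graph_id="agent",
    config={"configurable": {"model_name": "openai"}},
)

# Threads hold checkpointed state, so you don't hand-roll thread_id bookkeeping.
thread = await client.threads.create()

# Stream a run of the assistant on that thread.
async for chunk in client.runs.stream(
    thread["thread_id"],
    assistant["assistant_id"],
    input={"messages": [{"role": "user", "content": "im looking for a car"}]},
    stream_mode="updates",
):
    print(chunk.event, chunk.data)
```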
-
@darthShana how did you pass the config thread_id to langgraph with the checkpointer?
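For anyone with the same question: outside of LangServe, the documented LangGraph pattern is to pass `thread_id` inside the `configurable` section of the config at call time; the checkpointer then keys saved state by that id. A minimal sketch, reusing the `builder` from the original post:

```python
from langgraph.checkpoint.sqlite import SqliteSaver

# `builder` is the StateGraph builder from the original post.
memory = SqliteSaver.from_conn_string(":memory:")
graph = builder.compile(checkpointer=memory)

# thread_id (and any other custom keys) go under "configurable".
config = {"configurable": {"thread_id": "5678", "user_id": "1234"}}

# Every call that reuses this config continues the same checkpointed conversation.
result = graph.invoke({"messages": [("user", "im looking for a car")]}, config=config)
```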