
Remote Agentic AI Backend with LangServe, LangGraph consumed as RemoteRunnable in NextJS with AI/RSC #50

Open
elvenking opened this issue Jun 2, 2024 · 2 comments


elvenking commented Jun 2, 2024

First of all, thank you @developersdigest for this excellent YouTube tutorial and repo.

Please take the following as an idea and a potential feature request.

I feel it is not ideal to mix all the AI logic into the web app. I would like to see a stable example of separating the concerns of a "complex AI backend" from the consuming NextJS web app.

Imagine the following setup for an Answer Engine:

Backend:

  • Python - as it is the lingua franca of LLM work and home to the better part of the LangChain libraries
  • LangServe (with LCEL streaming) as API
  • LangGraph for complex state & process management and tool invocation
  • Persistence of history (graph) e.g. in EdgeDB or Zep
  • Rate limiting
  • Semantic Caching
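
As a rough illustration of the last two backend bullets, here is a framework-free Python sketch of a rate-limiting and caching layer wrapped around the agent call. Everything here is a stand-in: a production setup would wire this into LangServe/FastAPI middleware, and a real semantic cache would compare embeddings rather than normalized strings.

```python
import time
from typing import Callable, Dict

class TokenBucket:
    """Simple token-bucket rate limiter: refills `rate` tokens/second up to `capacity`."""
    def __init__(self, rate: float, capacity: int):
        self.rate, self.capacity = rate, capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

def cached(answer_fn: Callable[[str], str]) -> Callable[[str], str]:
    """Exact-match stand-in for a semantic cache (a real one would compare embeddings)."""
    cache: Dict[str, str] = {}
    def wrapper(prompt: str) -> str:
        key = " ".join(prompt.lower().split())  # crude normalization, not semantics
        if key not in cache:
            cache[key] = answer_fn(prompt)
        return cache[key]
    return wrapper

limiter = TokenBucket(rate=5, capacity=5)
answer = cached(lambda q: f"answer({q})")  # hypothetical stand-in for the LangGraph agent

def handle(prompt: str) -> str:
    """What a request handler in the API layer might do before reaching the agent."""
    if not limiter.allow():
        raise RuntimeError("rate limited")
    return answer(prompt)
```

The point of the sketch is only the layering: rate limiting sits in front, the cache sits between the handler and the agent, and the agent itself stays oblivious to both.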

Frontend:

  • NextJS app
  • Streaming AI backend consumed as RemoteRunnable of LangChain
  • Vercel AI SDK with AI/RSC for streaming React components from AI backend
  • Ability to stream intermediate agentic results / graph state from the remote backend
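
The last bullet above - streaming intermediate agentic results from a remote graph - can be sketched in framework-free Python. All event names and state shapes below are hypothetical stand-ins, not LangGraph's actual wire format: the agent yields state snapshots as it moves through nodes, and the API layer encodes each one as a server-sent event that the NextJS side could consume incrementally.

```python
import json
from typing import Dict, Iterator

def run_agent(question: str) -> Iterator[Dict]:
    """Stand-in for a remote LangGraph agent: yields intermediate
    node states as it progresses, then the final answer."""
    yield {"event": "node", "node": "search", "state": {"query": question}}
    yield {"event": "node", "node": "summarize", "state": {"docs": 3}}
    yield {"event": "final", "answer": f"Answer to: {question}"}

def sse_stream(question: str) -> Iterator[str]:
    """Encode each chunk as a server-sent event so the frontend
    can render intermediate results before the final answer arrives."""
    for chunk in run_agent(question):
        yield f"data: {json.dumps(chunk)}\n\n"

events = list(sse_stream("what is an answer engine?"))
```

In a real setup the generator would be the graph's own streaming interface and the SSE framing would come from the server framework; the sketch just shows that "stream graph state" reduces to "yield snapshots, encode, flush".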

There are multiple requests for such a setup, but nobody has come up with a stable solution.

I have filed a similar request in the Vercel AI SDK repo:
vercel/ai#1506

@developersdigest
Owner

Hi @elvenking - thanks for your thoughtful feature request here.

I am working on a feature to scope and interact with third-party tools. My aim is for it to be framework- and programming-language-agnostic with these features. I am starting by building out '@' commands, which you can use to pull up in the UI your configured workflows/agents/tools/specific LLM providers + models.

A small sneak peek is here: https://x.com/Dev__Digest/status/1795626592544399571

You could imagine being able to:

'@langserve - tool to do xyz'
'@Langgraph - agent do xyz'
'@CrewAI - do xyz'

etc.

In terms of changing the entire backend to be Python, I have no plans to do that currently.

In terms of database support, that will hopefully be coming in early summer - but my plan with most features moving forward is to build them in a way where most things are optional.

I hope to be able to support most of what you are asking for here - it might not be exactly what you were looking for, but I hope it is in the direction you are interested in seeing a project implement.

@elvenking
Author

The '@' commands look cool :)

Thank you for the update.
