[Proposal] Enhancements to generative queries #1247

Open
databyjp opened this issue Aug 19, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@databyjp
Contributor

Proposal: Could we create new functions that wrap generative capabilities, with:

  • Mandatory prompt and model parameters
  • An optional search_response parameter accepting Weaviate search results

This would allow a user to:

  1. Prompt an LLM (without additional retrieved data)
  2. Perform RAG from a Weaviate search response
  3. Perform RAG from multiple Weaviate search responses
  4. Pre-process to formulate a custom LLM prompt

Syntax proposal:

import weaviate
from weaviate.classes.config import Generative
from weaviate.classes.generate import generate_text

client = weaviate.connect_to_local()

gen_model = Generative.aws(
    model="cohere.command-text-v14",
    region="us-east-1"
)


# 💡 >>> SCENARIO 1 <<< Standalone LLM prompt

response = generate_text(
    model=gen_model,
    prompt="What is the capital of France?",
)


# 💡 >>> SCENARIO 2 <<< RAG with a Weaviate response

wiki = client.collections.get("Wiki")
search_response = wiki.query.hybrid("African or European swallow")

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut?",
    search_response=search_response
)


# 💡 >>> SCENARIO 3 <<< RAG with TWO Weaviate responses!

wiki = client.collections.get("Wiki")
scripts = client.collections.get("Scripts")

wiki_response = wiki.query.hybrid("African or European swallow")
scripts_response = scripts.query.hybrid("African or European swallow")

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut?",
    search_response=[wiki_response, scripts_response]
)


# 💡 >>> SCENARIO 4 <<< RAG with transformed text

wiki = client.collections.get("Wiki")
search_response = wiki.query.hybrid("African or European swallow")

context = "\n\n".join([f'{o.properties["title"]}: {o.properties["chunk"]}' for o in search_response.objects])

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut? Answer based on the following information:\n\n" + context,
)