
Make ask prompt configurable #417

Open · wants to merge 1 commit into base: main
Conversation

ldoguin (Contributor) commented Aug 27, 2024

Adds a configurable prompt for each model in the ask command, and adds an example under examples/writer-agent. This would enable building AI agents with cbsh.

Here is an example that specifies answers must always be in JSON format:

👤 Laurent Doguin 🏠 capella in ☁️ cbsh.commits._default
> ask --prompt "you are a helpful bot, always answering in the JSON text format" "list all films by Steven Spielberg with their name, release date and main cast"
{
  "films": [
    {
      "name": "Jaws",
      "release_date": "1975",
      "main_cast": ["Roy Scheider", "Robert Shaw", "Richard Dreyfuss"]
    },
    {
      "name": "E.T. the Extra-Terrestrial",
      "release_date": "1982",
      "main_cast": ["Henry Thomas", "Drew Barrymore", "Dee Wallace"]
    },
    {
      "name": "Jurassic Park",
      "release_date": "1993",
      "main_cast": ["Sam Neill", "Laura Dern", "Jeff Goldblum"]
    },
    {
      "name": "Schindler's List",
      "release_date": "1993",
      "main_cast": ["Liam Neeson", "Ben Kingsley", "Ralph Fiennes"]
    },
    {
      "name": "Saving Private Ryan",
      "release_date": "1998",
      "main_cast": ["Tom Hanks", "Matt Damon", "Tom Sizemore"]
    },
    {
      "name": "Catch Me If You Can",
      "release_date": "2002",
      "main_cast": ["Leonardo DiCaprio", "Tom Hanks", "Christopher Walken"]
    },
    {
      "name": "The Terminal",
      "release_date": "2004",
      "main_cast": ["Tom Hanks", "Catherine Zeta-Jones", "Stanley Tucci"]
    }
  ]
}

@ldoguin ldoguin requested a review from Westwooo August 27, 2024 15:55
    context: Vec<String>,
    model: String,
) -> Result<String, ShellError> {
    let config = aws_config::load_from_env().await;
    let client = aws_sdk_bedrockruntime::Client::new(&config);

    let tpl_value = template.unwrap_or("Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"".to_string());
    let mut rendered_tpl = tpl_value.replacen("{}", &*question, 1);
    rendered_tpl = rendered_tpl.replacen("{}", &*context.join(" "), 1);
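A standalone check of the `replacen` rendering above illustrates the pitfall discussed in the review below: when the user-supplied template contains no `{}` placeholders, both `replacen` calls are no-ops and the question and context are silently dropped. This is a minimal sketch with hypothetical inputs, not code from the PR:

```rust
fn main() {
    let question = "list all films by Steven Spielberg";
    let context = vec!["some retrieved context".to_string()];

    // A custom prompt, as passed via `--prompt`, with no "{}" placeholders.
    let template =
        "you are a helpful bot, always answering in the JSON text format".to_string();

    // Mirrors the PR's rendering logic: replace the first "{}" with the
    // question and the second with the joined context.
    let mut rendered = template.replacen("{}", question, 1);
    rendered = rendered.replacen("{}", &context.join(" "), 1);

    // Nothing was replaced: the question and context are lost.
    assert_eq!(
        rendered,
        "you are a helpful bot, always answering in the JSON text format"
    );
    println!("{}", rendered);
}
```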
Review comment (Contributor):
This won't work, you'll just end up with the prompt without the question, since there are no "{}" to replace. You'd need to do something like:

        let request_body = if let Some(tpl) = template {
            format!("{} - {}", tpl, question)
        } else {
            format!("Please answer this question: \\\"{}\\\"", question)
        };

        let question_with_ctx = if !context.is_empty() {
            format!(
                "{}. Using the following context: \\\"{}\\\"",
                request_body,
                context.join(" ")
            )
        } else {
            request_body
        };

And when using a custom prompt do we want to support users passing in context as well, or should they be mutually exclusive?
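One possible answer to that question, sketched here as a hypothetical `build_prompt` helper (not part of the PR), is to make them mutually exclusive and reject the combination outright rather than guess how a custom prompt and context should be merged:

```rust
/// Hypothetical helper: builds the final prompt, treating a custom
/// template and retrieved context as mutually exclusive.
fn build_prompt(
    template: Option<String>,
    question: &str,
    context: &[String],
) -> Result<String, String> {
    match (template, context.is_empty()) {
        // Custom prompt plus context: reject rather than guess.
        (Some(_), false) => Err("--prompt cannot be combined with context".to_string()),
        // Custom prompt only: prepend it to the question.
        (Some(tpl), true) => Ok(format!("{} - {}", tpl, question)),
        // Default prompt, no context.
        (None, true) => Ok(format!("Please answer this question: \\\"{}\\\"", question)),
        // Default prompt with the context appended.
        (None, false) => Ok(format!(
            "Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"",
            question,
            context.join(" ")
        )),
    }
}

fn main() {
    // Combining a custom prompt with context is rejected.
    assert!(build_prompt(Some("be terse".into()), "q", &["ctx".into()]).is_err());
    // A custom prompt alone is prepended to the question.
    assert_eq!(
        build_prompt(Some("be terse".into()), "q", &[]).unwrap(),
        "be terse - q"
    );
}
```

Erroring early keeps the behavior predictable; the alternative is to document that context is ignored whenever `--prompt` is given.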

    context: Vec<String>,
    model: String,
) -> Result<String, ShellError> {
    let url = format!(
        "https://generativelanguage.googleapis.com/v1beta/models/{}:generateContent?key={}",
        model, self.api_key
    );

    let tpl_value = template.unwrap_or("Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"".to_string());
Review comment (Contributor):
Same issue here as with the bedrock client.
