Make ask prompt configurable #417

Open · wants to merge 1 commit into base: main

9 changes: 9 additions & 0 deletions examples/writer-agent/bot1_role.txt
@@ -0,0 +1,9 @@
Your name is Paul. You are a very creative screenwriter. Your expertise is superhero action short films.
You will provide a script for a new short film.

Your task is to collaborate with Lisa to create an exciting script. Always iterate on Ms. Lisa's critique and complete the assignment.
Assignment: write a 100-word short script about a new superhero in town, called "Pie Man", and his adventures.

You will ALWAYS converse in this structure:
Response: Here is where you respond to Ms. Lisa
Story: Here is where you write your script
11 changes: 11 additions & 0 deletions examples/writer-agent/bot2_role.txt
@@ -0,0 +1,11 @@
You are a script editor and your name is Lisa. You are an expert reviewer working for a company like Marvel.

Your task is to collaborate with Paul to create exciting scripts for short films.
Always iterate on Mr. Paul's text and complete the assignment.

Assignment: Be very critical of Mr. Paul and his writing to help him write the best piece of text. The short film has to be an action film and has to entertain the viewer.

You will ALWAYS converse in this structure:

Response: Here is where you respond to Mr. Paul
Critique: Here is where you write your critique of Mr. Paul's text
21 changes: 21 additions & 0 deletions examples/writer-agent/writer_agent.nu
@@ -0,0 +1,21 @@
# Load the role prompts for the writer (Paul) and the editor (Lisa).
let bot1_role = open bot1_role.txt;
let bot2_role = open bot2_role.txt;

def bot [] {
    mut response_bot1 = "";
    mut response_bot2 = "";
    for x in 1..6 {
        print ($"****************** ITERATION ($x) ***************")
        # The writer iterates on the editor's latest critique.
        let rep = ask --prompt $bot1_role $response_bot1
        $response_bot1 = $rep
        $response_bot2 = $rep
        print ($"WRITER:\n ($response_bot1)")

        # The editor critiques the writer's latest script.
        let rep2 = ask --prompt $bot2_role $response_bot2
        $response_bot1 = $rep2
        $response_bot2 = $rep2
        print ($"EDITOR:\n ($response_bot2)")
    }
}

bot
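
One way to run the example (a sketch, assuming a couchbase-shell session started in examples/writer-agent): sourcing the script defines `bot` and immediately runs the six writer/editor rounds.

    # Sourcing executes the whole script, including the trailing call to `bot`.
    source writer_agent.nu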
9 changes: 8 additions & 1 deletion src/cli/ask.rs
@@ -44,6 +44,12 @@ impl Command for Ask {
"the chat model to ask the question",
None,
)
.named(
"prompt",
SyntaxShape::String,
"the prompt used by the model",
None,
)
.category(Category::Custom("couchbase".to_string()))
}

@@ -87,6 +93,7 @@ pub fn ask(
     let span = call.head;

     let question: String = call.req(engine_state, stack, 0)?;
+    let prompt_template: Option<String> = call.get_flag(engine_state, stack, "prompt")?;
     let context: Vec<String> = match call.opt(engine_state, stack, 1)? {
         Some(ctx) => ctx,
         None => {
@@ -176,7 +183,7 @@ pub fn ask(
     let rt = Runtime::new().unwrap();
     let answer = match rt.block_on(async {
         select! {
-            answer = client.ask(question.clone(), context.clone(), model) => {
+            answer = client.ask(question.clone(), prompt_template.clone(), context.clone(), model) => {
                 match answer {
                     Ok(a) => Ok(a),
                     Err(e) => Err(e),
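
For reference, a hypothetical invocation of the new flag from the shell: the flag carries the prompt template and the positional argument is the required question, per the signature above (the prompt file and question text are illustrative).

    # Hypothetical usage: load a role file as the prompt template, then ask a question.
    ask --prompt (open bot1_role.txt) "Introduce Pie Man in two sentences"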
10 changes: 5 additions & 5 deletions src/client/bedrock_client.rs
@@ -108,18 +108,18 @@ impl BedrockClient {
     pub async fn ask(
         &self,
         question: String,
+        template: Option<String>,
         context: Vec<String>,
         model: String,
     ) -> Result<String, ShellError> {
         let config = aws_config::load_from_env().await;
         let client = aws_sdk_bedrockruntime::Client::new(&config);

+        let tpl_value = template.unwrap_or("Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"".to_string());
+        let mut rendered_tpl = tpl_value.replacen("{}", &*question, 1);
+        rendered_tpl = rendered_tpl.replacen("{}", &*context.join(" "), 1);

Contributor commented:
This won't work, you'll just end up with the prompt without the question, since there are no "{}" to replace. You'd need to do something like:

        let request_body = if let Some(tpl) = template {
            format!("{} - {}", tpl, question)
        } else {
            format!(
                "Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"",
                question,
                context.join(" ")
            )
        };

        let question_with_ctx = if !context.is_empty() {
            format!(
                "{}. Using the following context: \\\"{}\\\"",
                request_body,
                context.join(" ")
            )
        } else {
            request_body
        };

And when using a custom prompt do we want to support users passing in context as well, or should they be mutually exclusive?

         let question_with_ctx = if !context.is_empty() {
-            format!(
-                "Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"",
-                question,
-                context.join(" ")
-            )
+            rendered_tpl
         } else {
             question
         };
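
Regarding the reviewer's suggestion above, here is a self-contained sketch of the proposed prompt-building rule: a custom template is prepended to the question, and retrieved context, when present, is appended either way. The function and variable names are illustrative, not this PR's API, and whether a custom prompt and context should be mutually exclusive is left open.

    // Sketch of the reviewer's suggested rule; names are hypothetical.
    fn build_request_body(template: Option<&str>, question: &str, context: &[String]) -> String {
        // With a custom template, prepend it to the question; otherwise use the default wrapper.
        let request_body = match template {
            Some(tpl) => format!("{} - {}", tpl, question),
            None => format!("Please answer this question: \"{}\"", question),
        };
        // Append context whenever any was retrieved.
        if context.is_empty() {
            request_body
        } else {
            format!(
                "{}. Using the following context: \"{}\"",
                request_body,
                context.join(" ")
            )
        }
    }

    fn main() {
        let ctx = vec!["Pie Man guards the bakery district.".to_string()];
        // Custom template plus context: both end up in the request body.
        println!("{}", build_request_body(Some("You are a strict script editor."), "Review this draft", &ctx));
        // No template and no context: just the default wrapper around the question.
        println!("{}", build_request_body(None, "Review this draft", &[]));
    }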
11 changes: 5 additions & 6 deletions src/client/gemini_client.rs
@@ -129,20 +129,19 @@ impl GeminiClient {
     pub async fn ask(
         &self,
         question: String,
+        template: Option<String>,
         context: Vec<String>,
         model: String,
     ) -> Result<String, ShellError> {
         let url = format!(
             "https://generativelanguage.googleapis.com/v1beta/models/{}:generateContent?key={}",
             model, self.api_key
         );

+        let tpl_value = template.unwrap_or("Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"".to_string());

Contributor commented:
Same issue with this as the bedrock client

+        let mut rendered_tpl = tpl_value.replacen("{}", &*question, 1);
+        rendered_tpl = rendered_tpl.replacen("{}", &*context.join(" "), 1);
         let question_with_ctx = if !context.is_empty() {
-            format!(
-                "Please answer this question: \\\"{}\\\". Using the following context: \\\"{}\\\"",
-                question,
-                context.join(" ")
-            )
+            rendered_tpl
         } else {
             question
         };
7 changes: 4 additions & 3 deletions src/client/llm_client.rs
@@ -37,13 +37,14 @@ impl LLMClients {
     pub async fn ask(
         &self,
         question: String,
+        template: Option<String>,
         context: Vec<String>,
         model: String,
     ) -> Result<String, ShellError> {
         match self {
-            Self::OpenAI(c) => c.ask(question, context, model).await,
-            Self::Gemini(c) => c.ask(question, context, model).await,
-            Self::Bedrock(c) => c.ask(question, context, model).await,
+            Self::OpenAI(c) => c.ask(question, template, context, model).await,
+            Self::Gemini(c) => c.ask(question, template, context, model).await,
+            Self::Bedrock(c) => c.ask(question, template, context, model).await,
         }
     }

4 changes: 2 additions & 2 deletions src/client/openai_client.rs
@@ -130,15 +130,15 @@ impl OpenAIClient {
     pub async fn ask(
         &self,
         question: String,
+        template: Option<String>,
         context: Vec<String>,
         model: String,
     ) -> Result<String, ShellError> {
         let mut messages: Vec<ChatCompletionRequestMessage> = vec![];

         // Primes the model to respond appropriately
         messages.push(
             ChatCompletionRequestSystemMessageArgs::default()
-                .content("You are a helpful assistant.")
+                .content(template.unwrap_or("You are a helpful assistant.".to_string()))
                 .build()
                 .unwrap()
                 .into(),