Tool Support w/o strings #62
Not yet integrated or complete, but the basic idea is here. I think I could make the proc macro (now renamed) do most of the work. It isn't pretty, but it shows that tool definitions don't necessarily need to be written by hand. I would argue that stringifying the tool output is for the user to decide, not for the library to insist on.
Updated, with the tool-calling JSON now completely autogenerated.
Made some updates so that the tool can now be called: https://github.com/rseymour/func_me This allows for tool calling without any "stringly typed" work going into the JSON. I'm going to create an ollama-rs example next, and the boilerplate request/JSON logic should be reduced when I do. But as you can see, the function name is written only once, and the LLM calls / tool calls update automatically.
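For reference, the tool definition JSON being autogenerated here follows the shape ollama described in its tool-support announcement. The tool name and fields below are illustrative, not func_me's actual output:

```json
{
  "type": "function",
  "function": {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
      "type": "object",
      "properties": {
        "city": { "type": "string", "description": "Name of the city" }
      },
      "required": ["city"]
    }
  }
}
```

The "stringly typed" pain point is exactly this block: without a macro, the function name, parameter names, and types all have to be repeated by hand as strings.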
So now that I've got this working, the next step is wiring it into an ollama-rs example.
Further updates. I created a branch where I added a trait called `Toolbox`. The `Toolbox` trait has two functions:

```rust
pub trait Toolbox: Send + Sync {
    fn get_impl_json(&self) -> Value;
    fn call_value_fn(&self, tool_name: &str, tool_args: Value) -> Value;
}
```

Both of those are (almost) autogenerated by the macros in `func_me`.
The neat thing here is that the trait implementation is (almost) autogenerated as well. Caveat re: autogen of the trait: when I first did this, I wanted to just create static functions on a struct type. After looking at the ollama-rs code I realized I'd need to implement a trait to make it work. This chunk of code forwards the trait methods to those static functions:

```rust
impl Toolbox for MyToolBox {
    fn get_impl_json(&self) -> Value {
        MyToolBox::get_impl_json()
    }
    fn call_value_fn(&self, tool_name: &str, tool_args: Value) -> Value {
        MyToolBox::call_value_fn(tool_name, tool_args)
    }
}
```

There's also a lot of error handling and cleanup that may be needed, and for now this only runs the first function called (as seen in the other issue). Thanks for entertaining my commentary and code; hopefully this makes some sense.
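To make the dispatch pattern concrete, here is a minimal, std-only sketch. `Value` is swapped for `String` so it compiles without serde_json, and the `add` tool and its JSON are illustrative stand-ins, not func_me's actual macro output:

```rust
// Std-only sketch of the Toolbox dispatch pattern.
// `Value` stands in for serde_json::Value in this example.
type Value = String;

pub trait Toolbox: Send + Sync {
    fn get_impl_json(&self) -> Value;
    fn call_value_fn(&self, tool_name: &str, tool_args: Value) -> Value;
}

struct MyToolBox;

impl MyToolBox {
    // Hypothetical tool: adds two integers passed as "a,b".
    fn add(args: &str) -> String {
        let mut parts = args.split(',');
        let a: i64 = parts.next().unwrap().trim().parse().unwrap();
        let b: i64 = parts.next().unwrap().trim().parse().unwrap();
        (a + b).to_string()
    }
}

impl Toolbox for MyToolBox {
    fn get_impl_json(&self) -> Value {
        // In func_me this JSON is generated by the proc macro; hand-written here.
        r#"{"tools":[{"name":"add"}]}"#.to_string()
    }
    fn call_value_fn(&self, tool_name: &str, tool_args: Value) -> Value {
        // Dispatch on the tool name the LLM asked for.
        match tool_name {
            "add" => MyToolBox::add(&tool_args),
            other => format!("unknown tool: {other}"),
        }
    }
}

fn main() {
    let toolbox = MyToolBox;
    println!("{}", toolbox.get_impl_json());
    println!("{}", toolbox.call_value_fn("add", "2,3".to_string()));
}
```

The point of the trait indirection is that a chat loop can hold a `Box<dyn Toolbox>` and route any model-issued tool call by name, without knowing the concrete functions at compile time.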
@rseymour Is this still being developed, and are you planning to create a PR to this repo?
@mcmah309 If folks were interested I could definitely put in a PR. I see some work has been done around function calling, but this is still unique. Would you want it hidden behind a feature? Right now it's two proc macros. Using func_me with ollama-rs is also totally doable now without PR-ing against ollama-rs: https://github.com/rseymour/func_me

Up to you; I'm fine adding or closing this out. Doing a proper PR may take a few weeks on my end, busy with the holidays and work.
Merging it into this library would be great if possible!
In late July 2024, tool support was added to ollama: https://ollama.com/blog/tool-support . About a year ago, function calling was added to ChatGPT, and just this month (2024-08) Structured Outputs were added. I added function-calling support to a Rust library, but I was not happy with the stringly-typed, JSON-schema-esque part.
I think I have worked out a way to mash together schemars, proc macros, serde, and some other bits and pieces to take something like an annotated plain Rust function and spit out the associated schema with a function like `tool_my_code`, and ideally create some sort of tool dictionary that, either behind the scenes or in Rust code outside of the interaction, can be used to actually execute the `my_code` function with the appropriate args. Ideally with no extra strings or `json!()` calls. I think I could implement it entirely separate from ollama-rs, but for maximum effectiveness it might need to go in here: https://github.com/pepperoni21/ollama-rs/blob/0.2.0/src/generation/functions/tools/mod.rs#L24 . At that point, though, the concerns of a trait might mess with the introspective nature of the proc macro (which would rewrite functions and add code auto-magically).
Just gauging interest, seeing if anyone else is interested in using functions without needing to write strings (and ideally also support when StructuredOutput comes to the open models). I think it's a point where other languages are more stringy, and rust can show just how strict we can make LLM calling (for better or worse).
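To illustrate the desired shape of the API described above: an attribute macro reads the function signature and doc comment, so no schema strings are written by hand. This is a non-compiling sketch; the attribute name and generated items below are hypothetical, not func_me's real API:

```rust
// Hypothetical sketch: `#[add_to_toolbox]` is a made-up attribute name.

/// Adds two numbers together.
#[add_to_toolbox]
fn my_code(a: i64, b: i64) -> i64 {
    a + b
}

// The macro would generate something like:
//   fn tool_my_code() -> serde_json::Value   // JSON schema derived via schemars
// and register `my_code` in a tool dictionary keyed by name, so a model's
// tool call {"name": "my_code", "arguments": {"a": 1, "b": 2}} can be
// deserialized with serde and dispatched without any json!() calls.
```

The types in the signature carry all the information the JSON schema needs, which is what makes the "no extra strings" goal plausible in Rust specifically.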
Thanks again for this nice library.