
ollama + streamText + tool issue #4700

Closed
adeel-ali-atp opened this issue Feb 5, 2025 · 8 comments
Labels: ai/ui, bug (Something isn't working)


adeel-ali-atp commented Feb 5, 2025

Description

Hi, I've run into what seems like a strange issue.

When I call a tool using generateText it works fine, but the same code with streamText fails to produce the same response.

I tried adding an onStepFinish callback, and also tried forcing the tool call via the system prompt.

Code example

import { streamText, tool } from "ai";
import { createOllama } from "ollama-ai-provider";
import { z } from "zod";
import { MODELS } from "./constant.js";

const ollama = createOllama();

const getWeatherTool = tool({
  description: "Get the current weather in the specified city",
  parameters: z.object({
    city: z.string().describe("The city to get the weather for"),
  }),
  execute: async ({ city }) => {
    return `The weather in ${city} is 25°C and sunny.`;
  },
});

const askAQuestion = async (prompt: string) => {
  const { textStream, text, steps } = await streamText({
    model: ollama(MODELS.GRANITE_3_1_DENSE),
    prompt,
    tools: {
      getWeather: getWeatherTool,
    },
    maxSteps: 2,
  });

  // for await (const text of textStream) {
  //   process.stdout.write(text);
  // }

  console.dir(await steps, { depth: null });
};

askAQuestion("What's the weather in London?").catch(console.error);

AI provider

ollama-ai-provider v1.2.0

Additional context

Package.json

"devDependencies": {
"tsx": "^4.19.2",
"typescript": "^5.7.3"
},
"dependencies": {
"@types/node": "^22.13.1",
"ai": "^4.1.17",
"dotenv": "^16.4.7",
"ollama": "^0.5.12",
"ollama-ai-provider": "^1.2.0",
"zod": "^3.24.1"
}

@adeel-ali-atp adeel-ali-atp added the bug Something isn't working label Feb 5, 2025

lgrammel commented Feb 5, 2025

With the new `ai` and `@ai-sdk/react` releases I have reworked the resubmit mechanism. Please check whether those new versions work for you (and also consider using message parts). Details can be found here: #4670

@lgrammel lgrammel added the ai/ui label Feb 5, 2025
@lgrammel lgrammel self-assigned this Feb 5, 2025

adeel-ali-atp commented Feb 5, 2025

Do you think it is related to the UI? When using a tool with streamText, it is not showing steps either.

Implementation with generateText:

[screenshot]

Terminal log: it made the tool call and worked fine.

[screenshot]

Implementation with streamText:

[screenshot]

Terminal log: it made the tool call and worked fine.

[screenshot]


lgrammel commented Feb 6, 2025

When you use streamText, you need to consume the stream before the promises (like steps) are resolved. Your await steps blocks before ever consuming the stream.
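The ordering lgrammel describes can be illustrated with a small self-contained toy model (plain TypeScript, no AI SDK): the aggregate promise only resolves once the stream has been drained, so awaiting it before consuming the stream hangs. This is a sketch of the general pattern, not the AI SDK's internals.

```typescript
// Toy model of a streamText-like result (NOT the AI SDK itself):
// the `text` promise resolves only after the stream is fully consumed.
function makeResult() {
  const chunks = ["It is ", "25°C ", "in London."];
  let resolveText!: (t: string) => void;
  const text = new Promise<string>((resolve) => (resolveText = resolve));

  async function* generate() {
    let full = "";
    for (const chunk of chunks) {
      full += chunk;
      yield chunk;
    }
    resolveText(full); // only runs once the consumer drains the stream
  }

  return { textStream: generate(), text };
}

async function main() {
  const result = makeResult();

  // Consume the stream first...
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }

  // ...and only then await the aggregate promise. Reversing the order
  // would hang: `text` never resolves if nobody reads the stream.
  console.log("\nfull text:", await result.text);
}

main().catch(console.error);
```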


adeel-ali-atp commented Feb 6, 2025

Still the same @lgrammel

I used the newly added consumeStream() feature. No tool call is being made and the response is empty text.

[screenshot]


lgrammel commented Feb 6, 2025

Can you check for errors? https://sdk.vercel.ai/docs/ai-sdk-core/generating-text#onerror-callback

If that does not resolve it, please file an issue with the ollama provider: https://github.com/sgomez/ollama-ai-provider
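For reference, wiring up the callback linked above looks roughly like this. This is a fragment against `ai` ^4.1, reusing the model/tool names from the original snippet; it assumes a running Ollama server, so it is a sketch rather than a runnable program.

```typescript
const result = streamText({
  model: ollama(MODELS.GRANITE_3_1_DENSE),
  prompt,
  tools: { getWeather: getWeatherTool },
  maxSteps: 2,
  // onError surfaces errors that are otherwise forwarded into the
  // stream (to avoid crashing the server) rather than thrown.
  onError({ error }) {
    console.error("streamText error:", error);
  },
});

// Drain the stream so the steps/text promises can settle.
await result.consumeStream();
```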

adeel-ali-atp commented:

Yes, I can confirm there is nothing in the onError event. The onFinish event yields the same response as above.

[screenshots]

Thanks man! I'll file it there.


lgrammel commented Feb 6, 2025

Do you get anything when you log in onChunk?


adeel-ali-atp commented Feb 6, 2025

It's a known issue and they have provided a workaround in their own repo. You can close this issue. I was able to get the tool response after following the steps.

Passing simulateStreaming: true in the ollama model settings makes streaming work.

Thanks for the guidance.

[screenshot]
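For anyone landing here, the workaround looks roughly like this. A sketch against ollama-ai-provider ^1.2: simulateStreaming makes the provider issue a non-streaming request and replay the full response as a stream, which lets tool calls come through with streamText. The model tag is an example standing in for the original MODELS.GRANITE_3_1_DENSE constant.

```typescript
import { createOllama } from "ollama-ai-provider";

const ollama = createOllama();

// Workaround: tool calls may not survive true streaming with some models,
// so ask the provider to fetch the whole response and simulate the stream.
const model = ollama("granite3.1-dense", { simulateStreaming: true });
```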

@lgrammel lgrammel closed this as completed Feb 6, 2025