fix: use config values in analyze() #10
Conversation
Walkthrough

The changes involve updating the `analyze()` method in `packages/core/src/core/llm-client.ts` so that `temperature` and `maxTokens` are taken from `this.config` instead of hardcoded defaults.

Changes

`packages/core/src/core/llm-client.ts`: `analyze()` now reads `temperature` and `maxTokens` from `this.config`.
Actionable comments posted: 0
🔭 Outside diff range comments (1)
packages/core/src/core/llm-client.ts (1)
Line range hint `198-211`: Handle potential null reference in Anthropic client.

The optional chaining (`?.`) here could result in a silent failure if the Anthropic client is not initialized. Consider throwing an error explicitly if the client is not available, similar to how it's done in `executeAnthropicCompletion`.

```diff
- const response = await this.anthropic?.messages.create({
+ if (!this.anthropic) throw new Error("Anthropic client not initialized");
+ const response = await this.anthropic.messages.create({
```
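If it helps to see the suggestion in context, here is a minimal, self-contained sketch of the explicit-guard pattern. `ExampleClient`, `requireAnthropic`, and the model name are illustrative assumptions, not the actual code in llm-client.ts:

```typescript
import Anthropic from "@anthropic-ai/sdk";

class ExampleClient {
  private anthropic?: Anthropic;

  constructor(apiKey?: string) {
    // The SDK client is only constructed when credentials are available,
    // mirroring how this.anthropic may remain undefined.
    if (apiKey) {
      this.anthropic = new Anthropic({ apiKey });
    }
  }

  // Explicit guard: fail loudly instead of letting optional chaining
  // silently resolve to undefined.
  private requireAnthropic(): Anthropic {
    if (!this.anthropic) {
      throw new Error("Anthropic client not initialized");
    }
    return this.anthropic;
  }

  async complete(prompt: string) {
    const client = this.requireAnthropic();
    // Model name and parameters are placeholders for illustration only.
    return client.messages.create({
      model: "claude-3-5-sonnet-latest",
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    });
  }
}
```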
🧹 Nitpick comments (1)
packages/core/src/core/llm-client.ts (1)
Line range hint `200-204`: Consider skipping the system message when not provided.

Currently, an empty string is sent as the system message when not provided. Consider skipping the system message entirely when it's not provided to avoid unnecessary message overhead.

```diff
- messages: [
-   {
-     role: "assistant",
-     content: system || "",
-   },
+ messages: [
+   ...(system ? [{
+     role: "assistant",
+     content: system,
+   }] : []),
```
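As a rough illustration of the conditional-spread approach the diff suggests, a self-contained sketch; `buildMessages` and `ChatMessage` are hypothetical names, not part of the actual file:

```typescript
type ChatMessage = { role: "assistant" | "user"; content: string };

// Build the message list, adding the leading system/assistant entry only
// when a non-empty value was provided, so no empty placeholder is sent.
function buildMessages(prompt: string, system?: string): ChatMessage[] {
  return [
    ...(system ? [{ role: "assistant" as const, content: system }] : []),
    { role: "user" as const, content: prompt },
  ];
}

// buildMessages("Summarize this")             -> one entry (user only)
// buildMessages("Summarize this", "Be terse") -> two entries (assistant + user)
```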
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/core/src/core/llm-client.ts (1 hunks)
🔇 Additional comments (1)
packages/core/src/core/llm-client.ts (1)
`191-192`: LGTM! Using config values instead of hardcoded defaults.

The change properly uses configuration values from `this.config` instead of hardcoded defaults, making the code more maintainable and consistent with the rest of the implementation.
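For context, a minimal sketch of the pattern being approved. The field names `temperature` and `maxTokens` come from the PR description, but the class shape and fallback logic below are illustrative assumptions, not the actual llm-client.ts implementation:

```typescript
interface LLMClientConfig {
  temperature: number;
  maxTokens: number;
}

class LLMClient {
  constructor(private config: LLMClientConfig) {}

  // Defaults flow from this.config; callers may still override per call.
  analyze(prompt: string, options: Partial<LLMClientConfig> = {}) {
    const temperature = options.temperature ?? this.config.temperature;
    const maxTokens = options.maxTokens ?? this.config.maxTokens;
    // Sketch of the outgoing request shape, not a real API call.
    return { prompt, temperature, max_tokens: maxTokens };
  }
}

// const client = new LLMClient({ temperature: 0.7, maxTokens: 2048 });
// client.analyze("Classify this text"); // picks up the config defaults
```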
When calling `LLMClient.analyze()` the `temperature` and `maxTokens` default values are hardcoded. This PR updates the code so the values are taken from `this.config`.

Summary by CodeRabbit