
Commit 418c2ce

change import order

qew21 committed Dec 26, 2024
1 parent e8f2410 commit 418c2ce
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion rdagent/oai/llm_conf.py

@@ -24,7 +24,7 @@ class LLMSettings(ExtendedBaseSettings):
     # Behavior of returning answers to the same question when caching is enabled
     use_auto_chat_cache_seed_gen: bool = False
     """
-    `_create_chat_completion_inner_function` provdies a feature to pass in a seed to affect the cache hash key
+    `_create_chat_completion_inner_function` provides a feature to pass in a seed to affect the cache hash key
     We want to enable a auto seed generator to get different default seed for `_create_chat_completion_inner_function`
     if seed is not given.
     So the cache will only not miss you ask the same question on same round.
2 changes: 1 addition & 1 deletion rdagent/scenarios/data_science/scen/__init__.py

@@ -1,4 +1,4 @@
-from .kaggle import KaggleScen
 from .scen import DataScienceScen
+from .kaggle import KaggleScen
 
 __all__ = ["DataScienceScen", "KaggleScen"]
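Whatever the import order inside the package, `__all__` controls which names `from ... import *` re-exports. A small self-contained sketch of that mechanism, using an in-memory stand-in module (the `scen_pkg` name and stub classes are illustrative, not the real package):

```python
import sys
import types

# Build a throwaway module that mimics the package's public surface.
mod = types.ModuleType("scen_pkg")
mod.DataScienceScen = type("DataScienceScen", (), {})
mod.KaggleScen = type("KaggleScen", (), {})
mod.__all__ = ["DataScienceScen", "KaggleScen"]
sys.modules["scen_pkg"] = mod

# A star-import picks up exactly the names listed in __all__.
ns = {}
exec("from scen_pkg import *", ns)
assert "DataScienceScen" in ns and "KaggleScen" in ns
```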
