
Add support for retrieving user preferences and memories using Mem0 #1209

Open

Dev-Khant wants to merge 37 commits into main from intergrate-mem0

Changes from 36 commits

Commits (37)
2607ac3
Integrate Mem0
Dev-Khant Aug 17, 2024
67991a3
Update src/crewai/memory/contextual/contextual_memory.py
Dev-Khant Aug 19, 2024
602497a
Merge branch 'main' into intergrate-mem0
joaomdmoura Sep 22, 2024
c7d326a
Merge branch 'main' into intergrate-mem0
Dev-Khant Sep 23, 2024
1f66c6d
pending commit for _fetch_user_memories
Dev-Khant Sep 23, 2024
7371a45
update poetry.lock
Dev-Khant Sep 23, 2024
66a1ecc
fixes mypy issues
Dev-Khant Sep 23, 2024
3517d53
fix mypy checks
Dev-Khant Sep 23, 2024
6e13c0b
Merge branch 'main' into intergrate-mem0
joaomdmoura Sep 23, 2024
bb90718
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 1, 2024
5d6eb6e
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 1, 2024
9a756bb
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 14, 2024
1dd20f9
New fixes for user_id
Dev-Khant Oct 14, 2024
3ab6084
remove memory_provider
Dev-Khant Oct 14, 2024
50758c9
handle memory_provider
Dev-Khant Oct 15, 2024
3fff7c4
checks for memory_config
Dev-Khant Oct 15, 2024
45e307e
add mem0 to dependency
Dev-Khant Oct 15, 2024
6316936
Update pyproject.toml
Dev-Khant Oct 15, 2024
b810697
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 15, 2024
57fcae8
update docs
Dev-Khant Oct 15, 2024
bb17567
update doc
Dev-Khant Oct 15, 2024
c55ccac
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 18, 2024
9a4952d
bump mem0 version
Dev-Khant Oct 18, 2024
9d0459d
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 21, 2024
cda4c07
merge main
Dev-Khant Oct 24, 2024
9044c5a
Merge branch 'main' into intergrate-mem0
bhancockio Oct 25, 2024
6be3061
fix api error msg and mypy issue
Dev-Khant Oct 25, 2024
7f4917e
mypy fix
Dev-Khant Oct 25, 2024
5929d35
resolve comments
Dev-Khant Oct 26, 2024
47b34ec
Merge branch 'main' into intergrate-mem0
Dev-Khant Oct 26, 2024
e9c2bc8
fix memory usage without mem0
Dev-Khant Oct 26, 2024
8d3145b
Merge branch 'main' into intergrate-mem0
Dev-Khant Nov 1, 2024
ee7d530
Merge branch 'main' into intergrate-mem0
bhancockio Nov 4, 2024
d2746e1
Merge branch 'main' into intergrate-mem0
bhancockio Nov 7, 2024
c63f56e
mem0 version bump
Dev-Khant Nov 7, 2024
34cc0d9
lazy import mem0
Dev-Khant Nov 7, 2024
ea54fb9
Merge branch 'main' into intergrate-mem0
Dev-Khant Nov 13, 2024
3 changes: 2 additions & 1 deletion docs/concepts/crews.mdx
@@ -22,7 +22,8 @@ A crew in crewAI represents a collaborative group of agents working together to
| **Max RPM** _(optional)_ | `max_rpm` | Maximum requests per minute the crew adheres to during execution. Defaults to `None`. |
| **Language** _(optional)_ | `language` | Language used for the crew, defaults to English. |
| **Language File** _(optional)_ | `language_file` | Path to the language file to be used for the crew. |
| **Memory** _(optional)_ | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). Defaults to `False`. |
| **Memory** _(optional)_ | `memory` | Utilized for storing execution memories (short-term, long-term, entity memory). |
| **Memory Config** _(optional)_ | `memory_config` | Configuration for the memory provider to be used by the crew. |
| **Cache** _(optional)_ | `cache` | Specifies whether to use a cache for storing the results of tools' execution. Defaults to `True`. |
| **Embedder** _(optional)_ | `embedder` | Configuration for the embedder to be used by the crew. Mostly used by memory for now. Default is `{"provider": "openai"}`. |
| **Full Output** _(optional)_ | `full_output` | Whether the crew should return the full output with all tasks outputs or just the final output. Defaults to `False`. |
42 changes: 42 additions & 0 deletions docs/concepts/memory.mdx
@@ -18,6 +18,7 @@ reason, and learn from past interactions.
| **Long-Term Memory** | Preserves valuable insights and learnings from past executions, allowing agents to build and refine their knowledge over time. |
| **Entity Memory** | Captures and organizes information about entities (people, places, concepts) encountered during tasks, facilitating deeper understanding and relationship mapping. Uses `RAG` for storing entity information. |
| **Contextual Memory**| Maintains the context of interactions by combining `ShortTermMemory`, `LongTermMemory`, and `EntityMemory`, aiding in the coherence and relevance of agent responses over a sequence of tasks or a conversation. |
| **User Memory** | Stores user-specific information and preferences, enhancing personalization and user experience. |

## How Memory Systems Empower Agents

@@ -92,6 +93,47 @@ my_crew = Crew(
)
```

## Integrating Mem0 for Enhanced User Memory

[Mem0](https://mem0.ai/) is a self-improving memory layer for LLM applications, enabling personalized AI experiences.

To include user-specific memory, get your API key [here](https://app.mem0.ai/dashboard/api-keys) and refer to the [docs](https://docs.mem0.ai/platform/quickstart#4-1-create-memories) for adding user preferences.


```python Code
import os
from crewai import Crew, Process
from mem0 import MemoryClient

# Set environment variables for Mem0
os.environ["MEM0_API_KEY"] = "m0-xx"

# Step 1: Record preferences based on past conversation or user input
client = MemoryClient()
messages = [
{"role": "user", "content": "Hi there! I'm planning a vacation and could use some advice."},
{"role": "assistant", "content": "Hello! I'd be happy to help with your vacation planning. What kind of destination do you prefer?"},
{"role": "user", "content": "I am more of a beach person than a mountain person."},
{"role": "assistant", "content": "That's interesting. Do you like hotels or Airbnb?"},
{"role": "user", "content": "I like Airbnb more."},
]
client.add(messages, user_id="john")

# Step 2: Create a Crew with User Memory

crew = Crew(
agents=[...],
tasks=[...],
verbose=True,
process=Process.sequential,
memory=True,
memory_config={
"provider": "mem0",
"config": {"user_id": "john"},
},
)
```
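
As a quick sanity check, you can inspect what Mem0 has stored for the user before kicking off the crew. This is a minimal sketch assuming the hosted Mem0 `MemoryClient` API (`get_all` and `search`); it is not part of the crewAI change itself:

```python Code
import os
from mem0 import MemoryClient

os.environ["MEM0_API_KEY"] = "m0-xx"
client = MemoryClient()

# List everything Mem0 currently stores for this user
print(client.get_all(user_id="john"))

# Search for memories relevant to a query, similar to what crewAI's
# contextual memory does internally at task time
for result in client.search("accommodation preferences", user_id="john"):
    print(result.get("memory"))
```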


## Additional Embedding Providers

6 changes: 3 additions & 3 deletions poetry.lock

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions pyproject.toml
@@ -39,6 +39,7 @@ Repository = "https://github.com/crewAIInc/crewAI"
[project.optional-dependencies]
tools = ["crewai-tools>=0.13.4"]
agentops = ["agentops>=0.3.0"]
mem0 = ["mem0ai>=0.1.29"]

[tool.uv]
dev-dependencies = [
2 changes: 2 additions & 0 deletions src/crewai/agent.py
@@ -252,9 +252,11 @@ def execute_task(

if self.crew and self.crew.memory:
contextual_memory = ContextualMemory(
self.crew.memory_config,
self.crew._short_term_memory,
self.crew._long_term_memory,
self.crew._entity_memory,
self.crew._user_memory,
)
memory = contextual_memory.build_context_for_task(task, context)
if memory.strip() != "":
24 changes: 22 additions & 2 deletions src/crewai/crew.py
@@ -27,6 +27,7 @@
from crewai.memory.entity.entity_memory import EntityMemory
from crewai.memory.long_term.long_term_memory import LongTermMemory
from crewai.memory.short_term.short_term_memory import ShortTermMemory
from crewai.memory.user.user_memory import UserMemory
from crewai.process import Process
from crewai.task import Task
from crewai.tasks.conditional_task import ConditionalTask
@@ -71,6 +72,7 @@ class Crew(BaseModel):
manager_llm: The language model that will run manager agent.
manager_agent: Custom agent that will be used as manager.
memory: Whether the crew should use memory to store memories of it's execution.
memory_config: Configuration for the memory to be used for the crew.
cache: Whether the crew should use a cache to store the results of the tools execution.
function_calling_llm: The language model that will run the tool calling for all the agents.
process: The process flow that the crew will follow (e.g., sequential, hierarchical).
@@ -94,6 +96,7 @@ class Crew(BaseModel):
_short_term_memory: Optional[InstanceOf[ShortTermMemory]] = PrivateAttr()
_long_term_memory: Optional[InstanceOf[LongTermMemory]] = PrivateAttr()
_entity_memory: Optional[InstanceOf[EntityMemory]] = PrivateAttr()
_user_memory: Optional[InstanceOf[UserMemory]] = PrivateAttr()
_train: Optional[bool] = PrivateAttr(default=False)
_train_iteration: Optional[int] = PrivateAttr()
_inputs: Optional[Dict[str, Any]] = PrivateAttr(default=None)
@@ -114,6 +117,10 @@ class Crew(BaseModel):
default=False,
description="Whether the crew should use memory to store memories of it's execution",
)
memory_config: Optional[Dict[str, Any]] = Field(
default=None,
description="Configuration for the memory to be used for the crew.",
)
short_term_memory: Optional[InstanceOf[ShortTermMemory]] = Field(
default=None,
description="An Instance of the ShortTermMemory to be used by the Crew",
@@ -126,7 +133,11 @@ class Crew(BaseModel):
default=None,
description="An Instance of the EntityMemory to be used by the Crew",
)
embedder: Optional[Any] = Field(
user_memory: Optional[InstanceOf[UserMemory]] = Field(
default=None,
description="An instance of the UserMemory to be used by the Crew to store/fetch memories of a specific user.",
)
embedder: Optional[dict] = Field(
default=None,
description="Configuration for the embedder to be used for the crew.",
)
@@ -238,13 +249,22 @@ def create_crew_memory(self) -> "Crew":
self._short_term_memory = (
self.short_term_memory
if self.short_term_memory
else ShortTermMemory(crew=self, embedder_config=self.embedder)
else ShortTermMemory(
crew=self,
embedder_config=self.embedder,
)
)
self._entity_memory = (
self.entity_memory
if self.entity_memory
else EntityMemory(crew=self, embedder_config=self.embedder)
)
if hasattr(self, "memory_config") and self.memory_config is not None:
self._user_memory = (
self.user_memory if self.user_memory else UserMemory(crew=self)
)
else:
self._user_memory = None
return self

@model_validator(mode="after")
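As an illustrative sketch of what the `create_crew_memory` change means for callers (agents and tasks are elided with `...` placeholders, in the same style as the docs example above):

```python
from crewai import Crew, Process

# memory=True alone: the private _user_memory slot stays None after validation
crew_default = Crew(agents=[...], tasks=[...], memory=True, process=Process.sequential)

# memory=True plus a memory_config: create_crew_memory() builds UserMemory(crew=...),
# which ContextualMemory later queries when the provider is "mem0"
crew_with_mem0 = Crew(
    agents=[...],
    tasks=[...],
    memory=True,
    process=Process.sequential,
    memory_config={"provider": "mem0", "config": {"user_id": "john"}},
)
```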
3 changes: 2 additions & 1 deletion src/crewai/memory/__init__.py
@@ -1,5 +1,6 @@
from .entity.entity_memory import EntityMemory
from .long_term.long_term_memory import LongTermMemory
from .short_term.short_term_memory import ShortTermMemory
from .user.user_memory import UserMemory

__all__ = ["EntityMemory", "LongTermMemory", "ShortTermMemory"]
__all__ = ["UserMemory", "EntityMemory", "LongTermMemory", "ShortTermMemory"]
47 changes: 42 additions & 5 deletions src/crewai/memory/contextual/contextual_memory.py
@@ -1,13 +1,25 @@
from typing import Optional
from typing import Optional, Dict, Any

from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory
from crewai.memory import EntityMemory, LongTermMemory, ShortTermMemory, UserMemory


class ContextualMemory:
def __init__(self, stm: ShortTermMemory, ltm: LongTermMemory, em: EntityMemory):
def __init__(
self,
memory_config: Optional[Dict[str, Any]],
stm: ShortTermMemory,
ltm: LongTermMemory,
em: EntityMemory,
um: UserMemory,
):
if memory_config is not None:
self.memory_provider = memory_config.get("provider")
else:
self.memory_provider = None
self.stm = stm
self.ltm = ltm
self.em = em
self.um = um

def build_context_for_task(self, task, context) -> str:
"""
@@ -23,6 +35,8 @@ def build_context_for_task(self, task, context) -> str:
context.append(self._fetch_ltm_context(task.description))
context.append(self._fetch_stm_context(query))
context.append(self._fetch_entity_context(query))
if self.memory_provider == "mem0":
context.append(self._fetch_user_context(query))
return "\n".join(filter(None, context))

def _fetch_stm_context(self, query) -> str:
@@ -32,7 +46,10 @@ def _fetch_stm_context(self, query) -> str:
"""
stm_results = self.stm.search(query)
formatted_results = "\n".join(
[f"- {result['context']}" for result in stm_results]
[
f"- {result['memory'] if self.memory_provider == 'mem0' else result['context']}"
for result in stm_results
]
)
print("formatted_results stm", formatted_results)
return f"Recent Insights:\n{formatted_results}" if stm_results else ""
@@ -65,7 +82,27 @@ def _fetch_entity_context(self, query) -> str:
"""
em_results = self.em.search(query)
formatted_results = "\n".join(
[f"- {result['context']}" for result in em_results] # type: ignore # Invalid index type "str" for "str"; expected type "SupportsIndex | slice"
[
f"- {result['memory'] if self.memory_provider == 'mem0' else result['context']}"
for result in em_results
] # type: ignore # Invalid index type "str" for "str"; expected type "SupportsIndex | slice"
)
print("formatted_results em", formatted_results)
return f"Entities:\n{formatted_results}" if em_results else ""

def _fetch_user_context(self, query: str) -> str:
"""
Fetches and formats relevant user information from User Memory.
Args:
query (str): The search query to find relevant user memories.
Returns:
str: Formatted user memories as bullet points, or an empty string if none found.
"""
user_memories = self.um.search(query)
if not user_memories:
return ""

formatted_memories = "\n".join(
f"- {result['memory']}" for result in user_memories
)
return f"User memories/preferences:\n{formatted_memories}"
42 changes: 32 additions & 10 deletions src/crewai/memory/entity/entity_memory.py
@@ -11,21 +11,43 @@ class EntityMemory(Memory):
"""

def __init__(self, crew=None, embedder_config=None, storage=None):
storage = (
storage
if storage
else RAGStorage(
type="entities",
allow_reset=True,
embedder_config=embedder_config,
crew=crew,
if hasattr(crew, "memory_config") and crew.memory_config is not None:
Collaborator: Why are we tying together entity memory with user memory? Shouldn't they be separate?

Collaborator: The root issue is that we are an agnostic platform and this PR directly couples CrewAI memory with mem0.

Contributor Author: Hey, thanks for reviewing the PR!

In this case, we're using Mem0 within entity_memory when the user opts for Mem0. So, it's not only the user memory being utilized but also the entity memory.

Let me know if we only want to use user_memory when user opts for Mem0 and keep other memories as is.

self.memory_provider = crew.memory_config.get("provider")
else:
self.memory_provider = None

if self.memory_provider == "mem0":
try:
from crewai.memory.storage.mem0_storage import Mem0Storage
except ImportError:
raise ImportError(
"Mem0 is not installed. Please install it with `pip install mem0ai`."
)
storage = Mem0Storage(type="entities", crew=crew)
else:
storage = (
storage
if storage
else RAGStorage(
type="entities",
allow_reset=False,
embedder_config=embedder_config,
crew=crew,
)
)
)
super().__init__(storage)

def save(self, item: EntityMemoryItem) -> None: # type: ignore # BUG?: Signature of "save" incompatible with supertype "Memory"
"""Saves an entity item into the SQLite storage."""
data = f"{item.name}({item.type}): {item.description}"
if self.memory_provider == "mem0":
data = f"""
Remember details about the following entity:
Name: {item.name}
Type: {item.type}
Entity Description: {item.description}
"""
else:
data = f"{item.name}({item.type}): {item.description}"
super().save(data, item.metadata)

def reset(self) -> None:
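The `Mem0Storage` adapter imported above lives in `src/crewai/memory/storage/mem0_storage.py` and is not part of this excerpt. As a rough, hypothetical sketch of what such an adapter could look like, built on the hosted `MemoryClient` API; the names and details here are assumptions, not the PR's actual implementation:

```python
# Hypothetical sketch only: the real Mem0Storage added by this PR may differ.
from typing import Any, Dict, List, Optional

from mem0 import MemoryClient


class Mem0StorageSketch:
    def __init__(self, type: str, crew: Any = None):
        self.memory_type = type  # e.g. "short_term", "entities", "user"
        memory_config = getattr(crew, "memory_config", None) or {}
        self.user_id = memory_config.get("config", {}).get("user_id", "")
        self.client = MemoryClient()  # reads MEM0_API_KEY from the environment

    def save(self, value: str, metadata: Optional[Dict[str, Any]] = None) -> None:
        # Store the record against the configured user, tagged with its memory type
        self.client.add(
            value,
            user_id=self.user_id,
            metadata={"type": self.memory_type, **(metadata or {})},
        )

    def search(
        self, query: str, limit: int = 3, score_threshold: float = 0.35
    ) -> List[Dict[str, Any]]:
        results = self.client.search(query, user_id=self.user_id)
        # Hosted API results carry a "memory" field and usually a relevance score
        kept = [r for r in results if r.get("score", 1.0) >= score_threshold]
        return kept[:limit]
```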
11 changes: 9 additions & 2 deletions src/crewai/memory/memory.py
@@ -23,5 +23,12 @@ def save(

self.storage.save(value, metadata)

def search(self, query: str) -> List[Dict[str, Any]]:
return self.storage.search(query)
def search(
self,
query: str,
limit: int = 3,
score_threshold: float = 0.35,
) -> List[Any]:
return self.storage.search(
query=query, limit=limit, score_threshold=score_threshold
)
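
As a standalone illustration of the widened `search` signature, here is a runnable toy showing the delegation pattern; `ToyStorage` is a stand-in stub, not crewAI's real `RAGStorage` or `Mem0Storage`:

```python
from typing import Any, Dict, List


class ToyStorage:
    """Stand-in for a storage backend with the same search keyword arguments."""

    def __init__(self, records: List[Dict[str, Any]]):
        self.records = records

    def search(self, query: str, limit: int, score_threshold: float) -> List[Dict[str, Any]]:
        # Keep records that mention the query and clear the score threshold
        hits = [
            r
            for r in self.records
            if query.lower() in r["context"].lower() and r["score"] >= score_threshold
        ]
        return hits[:limit]


class ToyMemory:
    """Mirrors the delegation added to Memory.search in this diff."""

    def __init__(self, storage: ToyStorage):
        self.storage = storage

    def search(self, query: str, limit: int = 3, score_threshold: float = 0.35) -> List[Any]:
        return self.storage.search(query=query, limit=limit, score_threshold=score_threshold)


memory = ToyMemory(ToyStorage([{"context": "User prefers beach vacations", "score": 0.9}]))
print(memory.search("beach"))  # [{'context': 'User prefers beach vacations', 'score': 0.9}]
```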
39 changes: 31 additions & 8 deletions src/crewai/memory/short_term/short_term_memory.py
@@ -14,13 +14,27 @@ class ShortTermMemory(Memory):
"""

def __init__(self, crew=None, embedder_config=None, storage=None):
storage = (
storage
if storage
else RAGStorage(
type="short_term", embedder_config=embedder_config, crew=crew
if hasattr(crew, "memory_config") and crew.memory_config is not None:
Collaborator: Once again, why are we connecting short term with mem0? Shouldn't we only be checking user memory?

Contributor Author: Same here when user opts for Mem0 we use mem0 inside short_term memory as well.

So let me know if we only want to use user_memory when user opts for Mem0 and keep other memories as is.
self.memory_provider = crew.memory_config.get("provider")
else:
self.memory_provider = None

if self.memory_provider == "mem0":
try:
from crewai.memory.storage.mem0_storage import Mem0Storage
except ImportError:
raise ImportError(
"Mem0 is not installed. Please install it with `pip install mem0ai`."
)
storage = Mem0Storage(type="short_term", crew=crew)
else:
storage = (
storage
if storage
else RAGStorage(
type="short_term", embedder_config=embedder_config, crew=crew
)
)
)
super().__init__(storage)

def save(
agent: Optional[str] = None,
) -> None:
item = ShortTermMemoryItem(data=value, metadata=metadata, agent=agent)
if self.memory_provider == "mem0":
item.data = f"Remember the following insights from Agent run: {item.data}"

super().save(value=item.data, metadata=item.metadata, agent=item.agent)

def search(self, query: str, score_threshold: float = 0.35):
return self.storage.search(query=query, score_threshold=score_threshold) # type: ignore # BUG? The reference is to the parent class, but the parent class does not have this parameters
def search(
self,
query: str,
limit: int = 3,
score_threshold: float = 0.35,
):
return self.storage.search(
query=query, limit=limit, score_threshold=score_threshold
) # type: ignore # BUG? The reference is to the parent class, but the parent class does not have this parameters

def reset(self) -> None:
try: