add modeling agent memory content #12

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open
wants to merge 1 commit into
base: main
Choose a base branch
from
Open
Show file tree
Hide file tree
Changes from all commits
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
New image assets (2 additions, 0 deletions each):

* src/assets/images/memory-episodic-data-model.svg
* src/assets/images/memory-episodic-example.svg
* src/assets/images/memory-procedural-data-model.svg
* src/assets/images/memory-procedural-example.svg
* src/assets/images/memory-procedural-temporal-data-model.svg
* src/assets/images/memory-procedural-temporal-example.svg
* src/assets/images/memory-semantic-data-model.svg
* src/assets/images/memory-semantic-example.svg
* src/assets/images/memory-semantic-temporal-data-model.svg
* src/assets/images/memory-semantic-temporal-example.svg
@@ -0,0 +1,31 @@
---
title: Episodic Memory Graph
description: Structured, navigable, contains episodic details of events
---

## Episodic Memory Graph

Episodic memory stores remembered experiences. These memories contain details about events and past agent actions. A common use case for this type of memory is few-shot prompting: previous question-answer pairs are stored in the graph and retrieved to provide examples in the prompt. This is done by performing a similarity search between the user query and the questions stored in the database.

The image below shows a graph data model that stores user questions and the Cypher query used to retrieve the data. Using similarity search and a simple graph traversal, we can retrieve the top k most relevant examples from the database and inject them into the prompt to inform novel Cypher generation.

![Graph](../../../../assets/images/memory-episodic-data-model.svg)

Here’s an example of how this may look in practice. A possible retrieval method is to perform a similarity search against the question embeddings in the database and then traverse to the associated Cypher queries. The top k question texts and Cypher statements are returned and formatted as few-shot examples.

![Graph](../../../../assets/images/memory-episodic-example.svg)

*A Cypher query may have many questions. This is because some user questions may differ in explicit text but have the same semantic meaning.*
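
A minimal Cypher sketch of this retrieval, assuming a vector index named `question_embeddings` over `Question` nodes and a `HAS_CYPHER` relationship to `CypherQuery` nodes (labels, properties, and the index name are illustrative rather than taken from the diagram):

```cypher
// Find the k stored questions most similar to the embedded user query,
// then traverse to the Cypher statement that answered each of them.
CALL db.index.vector.queryNodes('question_embeddings', $k, $queryEmbedding)
YIELD node AS question, score
MATCH (question)-[:HAS_CYPHER]->(cq:CypherQuery)
RETURN question.text AS questionText, cq.statement AS cypherStatement, score
ORDER BY score DESC
```

The returned pairs can then be formatted into the few-shot block of the prompt.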

The process of updating these memories may look like this:

* The agent generates a Cypher query to retrieve information from a Neo4j database.
* The user rates the returned result from the agent as good or bad.
* Positive feedback kicks off a process that writes the Cypher query and question to the database for use as future examples, as sketched below.
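
A hedged sketch of that write step, reusing the illustrative `Question` and `CypherQuery` labels from above:

```cypher
// Store an approved question/Cypher pair as a new episodic memory.
// MERGE on the statement so several question phrasings can share one query,
// matching the note that a Cypher query may have many questions.
MERGE (cq:CypherQuery {statement: $cypherStatement})
CREATE (q:Question {text: $questionText, embedding: $questionEmbedding, createdAt: datetime()})
CREATE (q)-[:HAS_CYPHER]->(cq)
```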

Episodic memory is best written in the background, once user feedback has been received. This prevents bad or unhelpful memories from being included, which would ultimately diminish performance.

## Further reading

* [Modeling Agent Memory](https://medium.com/neo4j/modeling-agent-memory-d3b6bc3bb9c4)
* [LangGraph Memory](https://langchain-ai.github.io/langgraph/concepts/memory/)
@@ -0,0 +1,33 @@
---
title: Procedural Memory Graph
description: Structured, navigable, annotated and instructional content
---

## Procedural Memory Graph

Procedural memory stores how to do something. In real life, this type of memory helps us write with a pen or play guitar without thinking about the explicit actions. Through practice, these actions have become internalized in our brains, and we no longer have to think about how to perform them.

For AI systems, procedural memory is a combination of the model weights, code, and prompts. Since the prompts are most accessible to us, this type of memory is commonly used to store system prompts, tool descriptions, and instructions. These memories may then be updated in a feedback loop from either the end user or another system, such as an LLM-as-a-judge. Just as we learned to write through practice and feedback, an agent can learn a task by refining its internal set of instructions.

Here, the graph data model stores information about prompts. It contains both system and user prompts for a particular use case.

![Graph](../../../../assets/images/memory-procedural-data-model.svg)

This is how the data may look for a Text2Cypher agent. A possible retrieval strategy is to fetch these prompts whenever a new chat session starts. This ensures that any updates made since the previous session are applied.

![Graph](../../../../assets/images/memory-procedural-example.svg)
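
A minimal retrieval sketch for the start of a session, assuming a `Prompt` node per use case with `HAS_SYSTEM_PROMPT` and `HAS_USER_PROMPT` relationships to its prompt nodes (these names are assumptions rather than taken from the diagram):

```cypher
// Load the current system and user prompts for a use case
// when a new chat session starts.
MATCH (p:Prompt {name: $useCase})
OPTIONAL MATCH (p)-[:HAS_SYSTEM_PROMPT]->(sys)
OPTIONAL MATCH (p)-[:HAS_USER_PROMPT]->(usr)
RETURN sys.content AS systemPrompt, usr.content AS userPrompt
```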

Note the difference between episodic and procedural memory. In the Cypher generation example, episodic memory is recalling the explicit question and Cypher pairs, whereas procedural memory is recalling how the Cypher is generated.

The process of updating these memories may look like this:

* A series of prompt, answer, and feedback triples is collected as input.
* An LLM takes this series and generates a new prompt that adheres to the provided feedback.
* This new prompt is written into the database, as sketched below.
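
A sketch of that final write step, reusing the illustrative `Prompt` model from above and simply overwriting the prompt content (keeping version history is covered under temporal memory):

```cypher
// Persist the improved system prompt produced by the LLM.
MATCH (p:Prompt {name: $useCase})-[:HAS_SYSTEM_PROMPT]->(sys)
SET sys.content = $improvedPrompt,
    sys.updatedAt = datetime()
```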

Procedural memory also lends itself well to being written in the background once feedback is received. In the example of improving prompts, it is probably best to wait until many feedback responses are received so the LLM performing the improvement can align with a more representative sample of the user base.

## Further reading

* [Modeling Agent Memory](https://medium.com/neo4j/modeling-agent-memory-d3b6bc3bb9c4)
* [LangGraph Memory](https://langchain-ai.github.io/langgraph/concepts/memory/)
@@ -0,0 +1,32 @@
---
title: Semantic Memory Graph
description: Structured, navigable, contains entities extracted from conversation
---

## Semantic Memory Graph

Semantic memory contains facts about the world. For an agent, this can be information about the user, such as their name, age, or relationships to other people. It could also take the form of a collection of documents used in a RAG pipeline. This type of memory changes frequently and must be carefully maintained, which makes creating, updating, and deleting memories appropriately more complex.

Below is a possible graph data model that contains information about a user profile. In this data model, we can also track the relationships a user has with other users, as well as events they attended.

![Graph](../../../../assets/images/memory-semantic-data-model.svg)

Here’s an example of how this may look in practice. Information about the current user may be retrieved dynamically according to the input question. For example, if the question requires knowledge about what the user does for fun, a query may be used to grab information about events they’ve attended.

![Graph](../../../../assets/images/memory-semantic-example.svg)
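
For instance, a query along these lines could back that retrieval, assuming `User` and `Event` nodes joined by an `ATTENDED` relationship (names are illustrative):

```cypher
// Pull the most recent events the current user attended to answer
// questions about their hobbies and interests.
MATCH (u:User {id: $userId})-[:ATTENDED]->(e:Event)
RETURN e.name AS event, e.date AS date
ORDER BY e.date DESC
LIMIT 10
```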

The process of updating these memories may look like this:

* Prepare entities or unstructured text from the conversation to be written as a memory.
* Search for the top k memories in the database that are similar to the incoming prepared data.
* Identify if there is new or conflicting information in the user query.
* Update the existing memories with new nodes or values.
* Create or delete relationships as needed (a sketch follows this list).
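
A hedged sketch of the create and delete steps, reusing the illustrative `User` and `Event` labels and assuming a `HAS_FRIEND` relationship between users:

```cypher
// New fact extracted from the conversation: the user attended an event.
MERGE (u:User {id: $userId})
MERGE (e:Event {name: $eventName})
MERGE (u)-[:ATTENDED]->(e);

// Conflicting fact: the user states they are no longer friends with someone,
// so the existing relationship is removed.
MATCH (u:User {id: $userId})-[f:HAS_FRIEND]->(:User {id: $friendId})
DELETE f;
```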

Semantic memory lends itself well to being updated in the hot path. This prevents the agent from communicating out-of-date information to the user. Since this data typically feeds a RAG pipeline, delaying writes risks serving stale information in later requests.

## Further reading

* [Modeling Agent Memory](https://medium.com/neo4j/modeling-agent-memory-d3b6bc3bb9c4)
* [LangGraph Memory](https://langchain-ai.github.io/langgraph/concepts/memory/)
@@ -0,0 +1,30 @@
---
title: Temporal Memory Graph
description: Structured, navigable, maintained change history of data
---

## Temporal Memory Graph

Temporal memory stores how data changes over time. It can apply to any other memory type and allows the agent to be aware of how things have changed. In the example of storing the semantic memory of a user profile, we can implement temporal memory in two ways. One is to include timestamps on `HAS_FRIEND` relationships with other users to mark the beginning and end of personal relationships. The other is to add a `PREVIOUS` relationship between nodes we want to keep versions of. In the example below, we pull the user description out into its own `UserDescription` node, which can be tracked in this manner. The image below shows the updated user profile data model, incorporating both semantic and temporal memory.

![Graph](../../../../assets/images/memory-semantic-temporal-data-model.svg)

The example below shows this data model applied, with the user description updated over several months. Notice that the `User` node has a `HAS_CURRENT_DESC` relationship only with the most up-to-date `UserDesc` node, which makes retrieval easier. We can also see that Bob was friends with Alice until recently.

![Graph](../../../../assets/images/memory-semantic-temporal-example.svg)
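
A hedged sketch of rotating the current description, using the `HAS_CURRENT_DESC` and `PREVIOUS` relationships described above (the `UserDescription` label is assumed, and the query assumes a current description already exists):

```cypher
// Create a new description version, repoint HAS_CURRENT_DESC at it,
// and chain the old version behind it with PREVIOUS.
MATCH (u:User {id: $userId})-[curr:HAS_CURRENT_DESC]->(oldDesc)
CREATE (newDesc:UserDescription {text: $newDescription, createdAt: datetime()})
CREATE (u)-[:HAS_CURRENT_DESC]->(newDesc)
CREATE (newDesc)-[:PREVIOUS]->(oldDesc)
DELETE curr
```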

Another example is tracking prompt versions with the procedural memory type. Here, we have a primary prompt node that carries the prompt name. It’s connected to the most recent `UserPromptDetails` and `SysPromptDetails` nodes, with older versions chained behind them through `PREVIOUS` relationships. This keeps the current prompt easy to retrieve while leaving previous versions accessible for auditing or reverting changes.

![Graph](../../../../assets/images/memory-procedural-temporal-data-model.svg)

Below is the applied version of this data model for a Text2Cypher agent, where changes in prompt content and parameters are easily accessible.

![Graph](../../../../assets/images/memory-procedural-temporal-example.svg)
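
A minimal retrieval sketch for this model, assuming the primary prompt node is linked to the latest `SysPromptDetails` version by a `HAS_CURRENT_SYS_PROMPT` relationship (an assumed name) while older versions chain backwards through `PREVIOUS`:

```cypher
// Fetch the current system prompt along with its version history.
MATCH (p:Prompt {name: $promptName})-[:HAS_CURRENT_SYS_PROMPT]->(current:SysPromptDetails)
OPTIONAL MATCH (current)-[:PREVIOUS*]->(older:SysPromptDetails)
RETURN current.content AS currentPrompt,
       collect(older.content) AS previousVersions
```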

## Further reading

* [Modeling Agent Memory](https://medium.com/neo4j/modeling-agent-memory-d3b6bc3bb9c4)
* [LangGraph Memory](https://langchain-ai.github.io/langgraph/concepts/memory/)