Development of a Human-inspired Long-Term Memory for Interactive Conversational Agents
Galatolo, Federico A.; Cominelli, Lorenzo; Pardini, Marco; Cimino, Mario G. C. A.; Greco, Alberto; Scilingo, Enzo Pasquale
2025-01-01
Abstract
The ability to establish and maintain relationships with users is crucial for Interactive Conversational Agents (ICAs). This involves remembering information from previous interactions and utilizing it in future conversations, key behaviors that build rapport and demonstrate interest in the user, and that are essential for sustained engagement. Current conversational agents typically rely on context from recent conversational turns. To enhance interaction quality, agents should be equipped with a Long-Term Memory (LTM) that encompasses entire past conversations. The state-of-the-art approach to incorporating LTM in ICAs involves saving all previous turns in an external storage system (e.g., a plain database). Despite achieving considerable results, this method is not biologically inspired and diverges from how human memory works, where only essential information is stored and later reconstructed generatively using semantic memory. We propose a hierarchical memory system inspired by human memory stratification. It dynamically updates and retrieves contextually relevant episodic and declarative memories to support semantically enriched interactions. A significant innovation in our approach is the use of a Large Language Model (LLM) Agent to determine what structured information should be stored during conversations. Our preliminary results suggest that our approach maintains competitive performance compared to existing methods while introducing novel features such as graph-based persona representation and a reduced need for storing entire past dialogues.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
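The architecture outlined in the abstract — an episodic store of selected events, a declarative persona graph, and an LLM agent that decides what is worth remembering — can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors' implementation: the `extract_memory` function is a rule-based stand-in for the LLM agent's storage decision, the persona "graph" is a flat dictionary of (subject, relation) → object triples, and retrieval uses naive keyword overlap in place of semantic search.

```python
from dataclasses import dataclass, field

@dataclass
class EpisodicEvent:
    turn: int
    summary: str

def extract_memory(utterance: str):
    """Hypothetical rule-based stand-in for the LLM agent's storage decision.

    Returns (summary, triples) when the turn carries persona-relevant
    information, or None when the turn is judged not worth remembering.
    """
    if "my favourite" in utterance.lower():
        topic = utterance.split("is")[-1].strip(" .")
        return f"user favourite is {topic}", [("user", "favourite", topic)]
    return None

@dataclass
class HierarchicalMemory:
    """Toy two-level memory: episodic events plus a declarative persona graph."""
    episodic: list = field(default_factory=list)
    persona: dict = field(default_factory=dict)  # (subject, relation) -> object

    def store(self, turn: int, utterance: str) -> None:
        decision = extract_memory(utterance)
        if decision is None:
            return  # nothing essential in this turn; do not store it verbatim
        summary, triples = decision
        self.episodic.append(EpisodicEvent(turn, summary))
        for subj, rel, obj in triples:
            self.persona[(subj, rel)] = obj  # newer facts overwrite older ones

    def retrieve(self, query: str, k: int = 3) -> list:
        # Naive keyword-overlap scoring in place of semantic retrieval.
        words = set(query.lower().split())
        scored = [(len(words & set(e.summary.lower().split())), e)
                  for e in self.episodic]
        return [e for score, e in sorted(scored, key=lambda x: -x[0])[:k]
                if score > 0]
```

Note how only turns the agent deems informative reach storage, so the full dialogue history never needs to be kept, mirroring the paper's claimed reduction in stored dialogue.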


