
MineAI - 24 December 2025 • Update

Introducing the experimental evaluation of our dynamic memory systems, focusing on Long-Term Memory (LTM) and Short-Term Memory (STM).

Experimental Evaluation of Dynamic STM and LTM in MineAI

Author: Bilal Mehtab
Publish Date: 21 December 2025 (Original Release)
Update Date: 24 December 2025 (Latest Content)

Abstract

This document presents the design, testing, and evaluation of MineAI’s Short-Term Memory (STM) and Long-Term Memory (LTM) systems. It explains how user messages are captured, stored, and retrieved dynamically, describes memory behavior using a Redux-inspired pattern (state, actions, selectors), and provides descriptive guidelines for the regex-based extraction patterns used in LTM. LTM is currently in an experimental phase, with plans to refine it and launch it as a beta.

1. Introduction

MineAI supports two dynamic memory systems:

  • STM: Captures temporary, session-specific information. Resets after the session ends.
  • LTM: Stores persistent, cross-session information and evolves dynamically based on frequency and importance.

The system follows a state-management approach inspired by Redux, where actions update memory state, and selectors retrieve relevant information for AI responses.

2. Memory Architecture

2.1 Short-Term Memory (STM)

  • State: Stores session-specific messages and context.
  • Actions: Add temporary messages; clear STM on session end.
  • Selectors: Retrieve session-limited info for AI responses.
  • Behavior: STM handles ephemeral instructions like “respond using only emojis” or session-specific project mentions.
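The STM slice described above can be sketched in a Redux-style reducer. This is a minimal illustration, not MineAI’s actual implementation; the action types and the `selectSessionContext` selector name are assumptions for the example.

```typescript
// Minimal STM slice: session-scoped state, actions, and a selector.
type StmState = { messages: string[] };

const initialStm: StmState = { messages: [] };

type StmAction =
  | { type: "stm/addMessage"; payload: string } // add a temporary message
  | { type: "stm/clear" };                      // clear STM on session end

function stmReducer(state: StmState = initialStm, action: StmAction): StmState {
  switch (action.type) {
    case "stm/addMessage":
      // Append an ephemeral message for the current session only.
      return { messages: [...state.messages, action.payload] };
    case "stm/clear":
      // Session ended: drop all short-term context.
      return initialStm;
    default:
      return state;
  }
}

// Selector: session-limited context fed into the AI response.
const selectSessionContext = (state: StmState): string[] => state.messages;
```

Because the reducer is pure, clearing STM is just a state transition, which is what makes the session-reset behavior easy to reason about.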

2.2 Long-Term Memory (LTM)

  • State: Persistent memory across sessions.
  • Actions: Extract new memories; update existing ones; apply confidence decay; allow manual deletion.
  • Selectors: Retrieve relevant memories for AI responses.
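The LTM actions and selector can be sketched as a small store. The class name, the 0–1 confidence scale, and the specific increment/threshold values here are illustrative assumptions, not MineAI’s published schema.

```typescript
// Illustrative LTM store: upsert (extract/reinforce), decay, delete, select.
interface Memory {
  text: string;
  confidence: number; // assumed 0..1; reinforced on repeat, decayed when unused
}

class LtmStore {
  private memories = new Map<string, Memory>();

  // Action: extract a new memory, or reinforce an existing one.
  upsert(key: string, text: string): void {
    const existing = this.memories.get(key);
    const confidence = existing ? Math.min(1, existing.confidence + 0.2) : 0.5;
    this.memories.set(key, { text, confidence });
  }

  // Action: confidence decay, applied periodically to all memories.
  decay(rate = 0.1): void {
    for (const [key, m] of this.memories) {
      m.confidence -= rate;
      if (m.confidence <= 0) this.memories.delete(key); // fully decayed
    }
  }

  // Action: manual deletion (the Settings → History Management path).
  delete(key: string): boolean {
    return this.memories.delete(key);
  }

  // Selector: retrieve only memories above a confidence threshold.
  select(threshold = 0.3): Memory[] {
    return [...this.memories.values()].filter((m) => m.confidence >= threshold);
  }
}
```

For example, a project mentioned twice would be reinforced above the retrieval threshold, while a single low-confidence mention can fall below it after decay.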

3. Regex Extraction Patterns (Descriptive)

To help users understand how LTM identifies and stores information, here’s a descriptive explanation of regex patterns for each category:

Preferences

Detect phrases where the user expresses likes, dislikes, or favorites.

  • “I like …”
  • “I prefer …”
  • “My favorite tool is …”

Projects

Detect phrases describing projects the user is working on.

  • “I’m working on …”
  • “My project is …”
  • “Currently developing …”

Goals

Detect phrases describing ambitions or objectives.

  • “My goal is to …”
  • “I want to …”
  • “I hope to …”

Constraints

Detect rules or requirements for AI responses.

  • “I need all responses to …”
  • “Please avoid …”
  • “Never …”

Facts

Detect personal information.

  • “I am a …”
  • “I live in …”
  • “My name is …”
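The five categories above can be approximated with regexes like the following. MineAI’s exact patterns are not published, so these are rough illustrative approximations, and the `matchCategories` helper is hypothetical.

```typescript
// Illustrative regexes for the five LTM extraction categories.
const ltmPatterns: Record<string, RegExp> = {
  preference: /\bI (?:like|prefer)\b|\bmy favorite\b/i,
  project: /\bI[’']?m working on\b|\bmy project is\b|\bcurrently developing\b/i,
  goal: /\bmy goal is to\b|\bI (?:want|hope) to\b/i,
  constraint: /\bI need all responses to\b|\bplease avoid\b|\bnever\b/i,
  fact: /\bI am an?\b|\bI live in\b|\bmy name is\b/i,
};

// Return the categories a message would be stored under (possibly several).
function matchCategories(message: string): string[] {
  return Object.entries(ltmPatterns)
    .filter(([, re]) => re.test(message))
    .map(([name]) => name);
}
```

Note that a single message can match more than one category, and messages matching none are left to STM or discarded.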

How Users Can Use LTM:

  • Any message matching these patterns is automatically captured and stored in LTM.
  • Repeated mentions increase confidence; single mentions may decay if not reinforced.
  • Users can view, refresh, or delete memories in Settings → History Management → LTM Chats.

4. Testing Methodology

| Category | Session 1 Input | Session 2 Retrieval |
| --- | --- | --- |
| Preference | User prefers short and direct answers. | How should you reply? → Short and direct |
| Project | User is working on a hosting platform called SkyHost. | What project am I working on? → SkyHost |
| Goal | User’s goal is to become a leading AI hosting provider. | What is my long-term goal? → Become a leading AI hosting provider |
| Constraint | User needs all responses to be concise. | How should you respond? → Enforce concise responses |
| Fact | User lives in Pakistan and is named Ali. | What’s my name and where do I live? → Ali, Pakistan |

Observations:

  • Repeated topics (SkyHost) persisted across sessions.
  • Single-mention facts decayed if the original session was removed.
  • LTM retrieval works best for high-confidence and repeated topics.

5. Memory Dynamics

  • Frequency-based reinforcement: Repeated mentions strengthen memory confidence.
  • Confidence decay: Unused or low-priority memories gradually lose strength.
  • STM vs LTM: STM is session-limited; LTM persists for repeated or important facts.
  • Dynamic UI: Users can manage memories via History Management.
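One simple way to model the reinforcement/decay dynamics above is an exponential decay curve: confidence halves over a fixed period of disuse, and each repeated mention raises it again. The half-life value here is an assumption for illustration, not a documented MineAI parameter.

```typescript
// Hypothetical decay curve: confidence halves every `halfLifeDays` of disuse.
function decayedConfidence(
  initial: number,
  daysUnused: number,
  halfLifeDays = 7
): number {
  return initial * Math.pow(0.5, daysUnused / halfLifeDays);
}
```

Under this model, a memory reinforced often enough never approaches zero, while a single unreinforced mention steadily loses retrieval strength, which matches the observed behavior.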

6. Experimental Observations

  • SkyHost project persisted due to multiple mentions.
  • Single-mention facts disappeared if sessions were removed.
  • System behavior mirrors human memory: repetition increases retention.

7. Future Work

  • Fine-tune confidence decay parameters.
  • Ensure persistence for single-mention important facts.
  • Expand extraction patterns for broader coverage.
  • Improve cross-session retrieval accuracy and the History UI.

8. Conclusion

STM and LTM are clearly differentiated: STM handles ephemeral, session-limited context, while LTM dynamically captures, reinforces, and retrieves user information using regex-based patterns. A beta launch is planned after further refinement and user feedback.