Memory is the combination of:

- Short-term memory: verbose, full context
- Long-term memory: summaries of conversation topics
- Attentive/Archivist memory: information fed to the model by the Archivist

Short term memory

- Short-term memory: the verbose, full context of the recent conversation

For i being the current message index of the conversation and n being the length of short-term memory, short-term memory is the window of the most recent messages, S = {m_(i-n), …, m_i}.

This may be modified such that only the most recent message (or the most recent exchange) is included in the output, in order to prevent redundant information. This fits quite neatly when performing a language model chain, where earlier messages are already present in the prompt.
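The windowed short-term memory described above can be sketched as a fixed-length buffer; this is a minimal illustration (the class and parameter names are hypothetical, not from the original design):

```python
from collections import deque

class ShortTermMemory:
    """Sliding window over the most recent n messages (verbose, full context)."""

    def __init__(self, n: int):
        # n is the length of short-term memory
        self.window = deque(maxlen=n)

    def add(self, message: str) -> None:
        self.window.append(message)

    def context(self, last_only: bool = False) -> list[str]:
        # last_only returns just the newest message, avoiding redundant
        # information when chaining language model calls
        if last_only:
            return list(self.window)[-1:]
        return list(self.window)
```

Once the window is full, adding a new message silently evicts the oldest one, which is the behavior the fixed length n implies.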

Long term memory

- Long-term memory: a concise and relevant summary of a conversation. Note: the summarizer may be any arbitrary function, not just a language model call.

This is derived by applying some arbitrary summarization function to the prior conversation. The exact algorithm and function used to implement this may vary.
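Since the summarizer is an arbitrary function, it can be modeled as a pluggable callable. A minimal sketch, with a trivial truncation-based summarizer standing in for a language model call (all names here are hypothetical):

```python
from typing import Callable

# A summarizer is any function from a conversation corpus to a summary.
Summarizer = Callable[[list[str]], str]

def truncating_summarizer(messages: list[str]) -> str:
    # Trivial stand-in for an LLM call: keep only the first
    # sentence of each message.
    return " ".join(m.split(".")[0] + "." for m in messages)

def build_long_term_memory(messages: list[str], summarize: Summarizer) -> str:
    # Long-term memory is whatever the chosen summarizer produces.
    return summarize(messages)
```

Swapping in a language-model-backed summarizer only requires passing a different callable.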

For cases where the summarization function produces a larger corpus than its input, the operation is considered an expansion, and the summarization function f is in effect an expansion function. This is a meta-cognitive effect, similar to a person's ability to narrativize a few basic facts into a larger story that contains far more information: f expands upon its input rather than condensing it.

Where |f(C)| > |C|, for C the input conversation corpus.
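The expansion condition reduces to a simple size comparison between the summarizer's input and output. A minimal sketch (function name and the use of character length as the size measure are assumptions):

```python
def is_expansion(corpus: str, summary: str) -> bool:
    # An "expansion" occurs when the summarization function's output
    # is larger than its input: |f(C)| > |C|.
    # Character count is used as the size measure here; token count
    # would work equally well.
    return len(summary) > len(corpus)
```

A system could use this check to flag when its "summarizer" is actually narrativizing, and handle the result differently.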

Attentive Memory

- Attentive/Archivist memory: information fed to the model by the Archivist

This is the set of information not present in short-term or long-term memory that is considered relevant to the current conversation.

In order to perform these operations, a variety of implementation options are available.