Short Term Memory
Short Term Memory in LangFlux refers to ephemeral memory nodes that store past conversations in RAM only. Conversations are simply kept in an array, so when the LangFlux instance is restarted, everything is lost.
There are 3 short term memory nodes in LangFlux:
BufferMemory
BufferWindowMemory
ConversationSummaryMemory
BufferMemory is the simplest of the three. It stores conversations in an array and later passes them on to the LLM.
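The idea can be sketched as follows. This is an illustrative in-RAM buffer, not the actual LangFlux implementation; the class and method names are made up for the example.

```python
# Sketch of a BufferMemory-style store: the whole history lives in a
# plain Python list in RAM, so it disappears when the process restarts.
class BufferMemory:
    def __init__(self):
        self.messages = []  # entire conversation history

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})

    def load(self):
        # Everything stored so far is passed to the LLM as context.
        return list(self.messages)

memory = BufferMemory()
memory.add("user", "Hi, my name is Ada.")
memory.add("assistant", "Hello Ada!")
print(len(memory.load()))  # 2
```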
Sometimes, when conversations get too long, you might hit a token-limit error. This happens because there is simply too much text to fit into the LLM's limited context size.
BufferWindowMemory stores only the last K conversations instead of all of them. It uses a sliding window implementation to keep the most recent K interactions.
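The sliding-window idea can be sketched with a bounded deque. Again, this is illustrative rather than the real LangFlux code; the class name is hypothetical.

```python
from collections import deque

# Sketch of a BufferWindowMemory-style store: a deque with maxlen=K acts
# as the sliding window, silently discarding the oldest entries so only
# the most recent K interactions remain.
class BufferWindowMemory:
    def __init__(self, k):
        self.messages = deque(maxlen=k)

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})

    def load(self):
        return list(self.messages)

memory = BufferWindowMemory(k=2)
for i in range(5):
    memory.add("user", f"message {i}")
print([m["content"] for m in memory.load()])  # ['message 3', 'message 4']
```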
ConversationSummaryMemory uses an LLM to create a summary of the conversation. It is useful for condensing information from the conversation over time.
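Conceptually, each new turn is folded into a running summary instead of being stored verbatim. In this sketch, `summarize` is a hypothetical stand-in for the LLM call, so the example runs without an API key; the real node would prompt a model to condense the text.

```python
# Sketch of a ConversationSummaryMemory-style store (illustrative only).
def summarize(previous_summary, new_lines):
    # An LLM would merge these into fluent prose; we just concatenate
    # so the example is runnable and deterministic.
    return (previous_summary + " " + " ".join(new_lines)).strip()

class ConversationSummaryMemory:
    def __init__(self):
        self.summary = ""

    def add(self, role, content):
        # Fold each turn into the running summary, keeping the context
        # size bounded no matter how long the conversation gets.
        self.summary = summarize(self.summary, [f"{role}: {content}"])

    def load(self):
        return self.summary

mem = ConversationSummaryMemory()
mem.add("user", "I live in Paris.")
mem.add("assistant", "Noted.")
print(mem.load())
```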
By default, the UI and Embedded Chat automatically separate different users' conversations. This is done by providing a list of history to the API; that logic is handled under the hood by LangFlux.
You can separate the conversations for multiple users by providing a list of history:
In the POST body of the /api/v1/prediction/{your-chatflowid} request, specify the history array:
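A minimal sketch of such a request body is shown below. The field names in the history items follow common Flowise-style payloads and should be verified against your LangFlux version; the host is a placeholder, and the actual network call is commented out so the example runs offline.

```python
import json
import urllib.request

# Illustrative POST body: `history` carries the prior turns of this
# user's conversation alongside the new question.
payload = {
    "question": "What is my name?",
    "history": [
        {"role": "userMessage", "content": "Hi, my name is Ada."},
        {"role": "apiMessage", "content": "Hello Ada! How can I help?"},
    ],
}

body = json.dumps(payload).encode("utf-8")
req = urllib.request.Request(
    "http://localhost:3000/api/v1/prediction/{your-chatflowid}",  # placeholder host/id
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send the request; omitted here.
print(json.loads(body)["history"][0]["content"])  # Hi, my name is Ada.
```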
GET /api/v1/chatmessage/{your-chatflowid}
DELETE /api/v1/chatmessage/{your-chatflowid}
Query parameters:
sort (enum): ASC or DESC
startDate (string)
endDate (string)
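For example, a GET request URL with these query parameters can be built as follows. The host and the date format are assumptions; adjust them to your deployment.

```python
from urllib.parse import urlencode

# Build the chat-message URL with the documented query parameters.
base = "http://localhost:3000/api/v1/chatmessage/{your-chatflowid}"  # placeholder
params = {
    "sort": "DESC",             # ASC or DESC
    "startDate": "2024-01-01",  # string (ISO date assumed)
    "endDate": "2024-01-31",    # string
}
url = base + "?" + urlencode(params)
print(url)
```

The same URL with the DELETE method clears the matching messages instead of listing them.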
All conversations can also be visualized and managed from the UI: