Long Term Memory

Long Term Memory in LangFlux refers to memory nodes that persist past conversations, which can later be retrieved to resume a conversation. It also allows the conversations of different users to be kept isolated.

There are 5 long term memory nodes in LangFlux:

  • DynamoDB Chat Memory

  • Motorhead Memory

  • Redis Chat Memory

  • Upstash Chat Memory

  • Zep Memory

Separate conversations for multiple users

UI & Embedded Chat

By default, the UI and Embedded Chat automatically separate the conversations of different users. This is done by generating a unique chatId for each new interaction; that logic is handled under the hood by LangFlux.
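
Purely as an illustration of the idea (the real logic is internal to LangFlux), a minimal TypeScript sketch of how an embed could keep a stable per-browser chatId; the storage key and helper name are hypothetical:

// Illustrative sketch only: LangFlux handles this internally.
// Shows one plausible way an embedded chat could keep a stable,
// per-browser chatId so each visitor gets an isolated conversation.
function getOrCreateChatId(chatflowId: string): string {
  const key = `langflux_chatId_${chatflowId}`; // hypothetical storage key
  let chatId = localStorage.getItem(key);
  if (!chatId) {
    chatId = crypto.randomUUID(); // fresh unique id for this visitor
    localStorage.setItem(key, chatId);
  }
  return chatId;
}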

Prediction API

You can separate the conversations of multiple users by specifying a unique sessionId:

  1. Use one of the long term memory nodes in LangFlux. Make sure the node has the input parameter Session ID.

  2. In the /api/v1/prediction/{your-chatflowid} POST body request, specify the sessionId in overrideConfig:

{
    "question": "hello!",
    "overrideConfig": {
        "sessionId": "user1"
    }
}
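
For example, a minimal TypeScript sketch of calling the Prediction API with a per-user sessionId; the base URL and chatflow id are placeholders for your own deployment:

// Sketch: call the Prediction API with a per-user sessionId.
// BASE_URL and CHATFLOW_ID are assumed values; adjust to your deployment.
const BASE_URL = "http://localhost:3000";   // assumed LangFlux host
const CHATFLOW_ID = "your-chatflowid";      // placeholder

async function predict(question: string, sessionId: string) {
  const res = await fetch(`${BASE_URL}/api/v1/prediction/${CHATFLOW_ID}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      question,
      overrideConfig: { sessionId }, // isolates this user's conversation
    }),
  });
  return res.json();
}

// Each distinct sessionId resumes its own conversation history.
await predict("hello!", "user1");
await predict("hello!", "user2"); // kept separate from user1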

Message API

  • GET /api/v1/chatmessage/{your-chatflowid}

  • DELETE /api/v1/chatmessage/{your-chatflowid}

Query Param   Type     Value
sessionId     string
sort          enum     ASC or DESC
startDate     string
endDate       string
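
As an illustration, a short TypeScript sketch of listing and then clearing one session's messages; the base URL, chatflow id, and ISO date format are assumptions:

// Sketch: list, then delete, the messages of one session.
// BASE_URL and CHATFLOW_ID are placeholders; date format assumed ISO 8601.
const BASE_URL = "http://localhost:3000";
const CHATFLOW_ID = "your-chatflowid";

const params = new URLSearchParams({
  sessionId: "user1",
  sort: "ASC",
  startDate: "2024-01-01",
  endDate: "2024-12-31",
});

// GET: fetch the stored messages for this session
const messages = await fetch(
  `${BASE_URL}/api/v1/chatmessage/${CHATFLOW_ID}?${params}`
).then((r) => r.json());

// DELETE: clear this session's history
await fetch(`${BASE_URL}/api/v1/chatmessage/${CHATFLOW_ID}?${params}`, {
  method: "DELETE",
});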

All conversations can be visualized and managed from the UI as well:
