Threads

Threads are only used when an OpenAI Assistant is in use. A thread is a conversation session between an Assistant and a user. Threads store messages and automatically handle truncation to fit content into a model's context.

Separate conversations for multiple users

UI & Embedded Chat

By default, the UI and Embedded Chat automatically separate threads for conversations from multiple users. This is done by generating a unique chatId for each new interaction; that logic is handled under the hood by LangFlux.

Prediction API

When calling POST /api/v1/prediction/{your-chatflowid}, specify the chatId in the request body. The same thread will be reused for the same chatId. For example:

{
    "question": "hello!",
    "chatId": "user1"
}
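
For example, here is a minimal Python sketch of calling the Prediction API with an explicit chatId. The base URL and chatflow ID are placeholders; adjust them for your own LangFlux instance.

import requests

BASE_URL = "http://localhost:3000"      # assumed local LangFlux instance
CHATFLOW_ID = "your-chatflowid"         # placeholder chatflow ID

def ask(question: str, chat_id: str) -> dict:
    """Send a question; calls that share the same chat_id share one thread."""
    response = requests.post(
        f"{BASE_URL}/api/v1/prediction/{CHATFLOW_ID}",
        json={"question": question, "chatId": chat_id},
    )
    response.raise_for_status()
    return response.json()

# Two users get two separate threads because their chatIds differ.
print(ask("hello!", chat_id="user1"))
print(ask("hello!", chat_id="user2"))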

Message API

  • GET /api/v1/chatmessage/{your-chatflowid}

  • DELETE /api/v1/chatmessage/{your-chatflowid}

You can also filter by chatId: /api/v1/chatmessage/{your-chatflowid}?chatId={your-chatid}
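
A minimal sketch, assuming the same local instance and placeholder chatflow ID as above, of listing the messages for one chatId and then clearing the chatflow's stored messages:

import requests

BASE_URL = "http://localhost:3000"      # assumed local LangFlux instance
CHATFLOW_ID = "your-chatflowid"         # placeholder chatflow ID

# List messages belonging to a single conversation thread.
messages = requests.get(
    f"{BASE_URL}/api/v1/chatmessage/{CHATFLOW_ID}",
    params={"chatId": "user1"},
).json()
for message in messages:
    print(message)

# Delete the stored messages for this chatflow.
requests.delete(f"{BASE_URL}/api/v1/chatmessage/{CHATFLOW_ID}")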

All conversations can also be visualized and managed from the UI:
