API

Prediction API

  • POST /api/v1/prediction/{your-chatflowid}

Request Body

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| question | User's question | string | Yes |
| overrideConfig | Override existing flow configuration | object | No |
| history | Provide a list of history messages to the flow. Only works when using Short Term Memory | array | No |

You can use the chatflow as an API and connect it to frontend applications.

You also have the flexibility to override the input configuration with the overrideConfig property.

Python

import requests

API_URL = "http://localhost:3000/api/v1/prediction/<chatflowid>"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "returnSourceDocuments": True
    },
    "history": [
        {
            "message": "Hello, how can I assist you?",
            "type": "apiMessage"
        },
        {
            "type": "userMessage",
            "message": "Hello I am Bob"
        },
        {
            "type": "apiMessage",
            "message": "Hello Bob! how can I assist you?"
        }
    ]
})

Javascript

async function query(data) {
    const response = await fetch(
        "http://localhost:3000/api/v1/prediction/<chatflowid>",
        {
            method: "POST",
            headers: {
                "Content-Type": "application/json"
            },
            body: JSON.stringify(data)
        }
    );
    const result = await response.json();
    return result;
}

query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "returnSourceDocuments": true
    },
    "history": [
        {
            "message": "Hello, how can I assist you?",
            "type": "apiMessage"
        },
        {
            "type": "userMessage",
            "message": "Hello I am Bob"
        },
        {
            "type": "apiMessage",
            "message": "Hello Bob! how can I assist you?"
        }
    ]
}).then((response) => {
    console.log(response);
});
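Beyond returnSourceDocuments, overrideConfig can also override individual node inputs for a single request. The accepted keys depend on the nodes in your chatflow, so the temperature and sessionId keys in this Python sketch are illustrative assumptions rather than options available in every flow:

import requests

API_URL = "http://localhost:3000/api/v1/prediction/<chatflowid>"

# Override node inputs for this request only. Which keys are accepted depends
# on the nodes in your flow; "temperature" and "sessionId" are illustrative.
payload = {
    "question": "Give me a one-line summary of our conversation so far.",
    "overrideConfig": {
        "temperature": 0.2,
        "sessionId": "user-123"
    }
}

response = requests.post(API_URL, json=payload)
print(response.json())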

Vector Upsert API

  • POST /api/v1/vector/upsert/{your-chatflowid}

Request Body

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| overrideConfig | Override existing flow configuration | object | No |
| stopNodeId | Node ID of the vector store. When you have multiple vector stores in a flow, you might not want to upsert all of them. Specifying stopNodeId ensures that only that specific vector store node is upserted. | array | No |

Document Loaders with Upload

Some Document Loaders in LangFlux allow users to upload files.

If the flow contains Document Loaders with the Upload File functionality, the API looks slightly different: instead of passing the body as JSON, form data is used. This allows you to upload files to the API.

It is the user's responsibility to make sure the file type is compatible with the file type expected by the document loader. For example, if a Text File Loader is used, you should only upload files with the .txt extension.

Python

import requests

API_URL = "http://localhost:3000/api/v1/vector/upsert/<chatflowid>"

# use form data to upload files
form_data = {
    "files": ('state_of_the_union.txt', open('state_of_the_union.txt', 'rb'))
}

body_data = {
    "returnSourceDocuments": True
}

def query(form_data):
    response = requests.post(API_URL, files=form_data, data=body_data)
    print(response)
    return response.json()

output = query(form_data)
print(output)
Javascript

// use FormData to upload files
let formData = new FormData();
formData.append("files", input.files[0]);
formData.append("returnSourceDocuments", true);

async function query(formData) {
    const response = await fetch(
        "http://localhost:3000/api/v1/vector/upsert/<chatflowid>",
        {
            method: "POST",
            body: formData
        }
    );
    const result = await response.json();
    return result;
}

query(formData).then((response) => {
    console.log(response);
});

Document Loaders without Upload

For other Document Loaders nodes without the Upload File functionality, the API body is in JSON format, similar to the Prediction API.
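A minimal Python sketch of such a JSON request follows. The overrideConfig key and the stopNodeId value are illustrative assumptions; use the vector store node ID from your own flow:

import requests

API_URL = "http://localhost:3000/api/v1/vector/upsert/<chatflowid>"

# JSON body, similar to the Prediction API. "returnSourceDocuments" and the
# "chroma_0" node ID are illustrative; substitute values from your own flow.
payload = {
    "overrideConfig": {
        "returnSourceDocuments": True
    },
    "stopNodeId": ["chroma_0"]
}

response = requests.post(API_URL, json=payload)
print(response.json())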

Message API

  • GET /api/v1/chatmessage/{your-chatflowid}

  • DELETE /api/v1/chatmessage/{your-chatflowid}

Query Parameters

| Param | Type | Value |
| --- | --- | --- |
| sessionId | string | |
| sort | enum | ASC or DESC |
| startDate | string | |
| endDate | string | |
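Both endpoints accept the same query parameters. In this Python sketch the sessionId value and the ISO 8601 date format are assumptions; check what your deployment expects:

import requests

API_URL = "http://localhost:3000/api/v1/chatmessage/<chatflowid>"

# Filter parameters; "user-123" and the ISO 8601 dates are illustrative.
params = {
    "sessionId": "user-123",
    "sort": "DESC",
    "startDate": "2024-01-01",
    "endDate": "2024-01-31"
}

# Fetch the chat messages that match the filters
messages = requests.get(API_URL, params=params).json()
print(messages)

# Delete the chat messages that match the same filters
requests.delete(API_URL, params=params)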

Tutorials

  • How to use LangFlux API
  • How to use LangFlux API and connect to Bubble
