Streaming
LangFlux supports streaming responses back to your front-end application when the final node of the flow is a Chain or an OpenAI Function Agent.


Install socket.io-client in your front-end application:

```bash
yarn add socket.io-client
```

or using npm:

```bash
npm install socket.io-client
```

Refer to the official Socket.IO docs for more installation options.
Import it:

```javascript
import socketIOClient from 'socket.io-client'
```
Establish a connection:

```javascript
const socket = socketIOClient("http://localhost:3000") // LangFlux URL
```
Listen for the connection and store the socket ID:

```javascript
import { useState } from 'react'

const [socketIOClientId, setSocketIOClientId] = useState('');

socket.on('connect', () => {
  setSocketIOClientId(socket.id);
});
```
Send the query together with the `socketIOClientId`:

```javascript
async function query(data) {
  const response = await fetch(
    "http://localhost:3000/api/v1/prediction/<chatflow-id>",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(data)
    }
  );
  const result = await response.json();
  return result;
}

query({
  "question": "Hey, how are you?",
  "socketIOClientId": socketIOClientId
}).then((response) => {
  console.log(response);
});
```
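Note that `socket.id` is only populated once the `connect` event has fired; if the query is sent before then, the `socketIOClientId` will be empty and the response will not stream. One way to guard against this race is to wait for the connection before querying. A minimal sketch, where `queryWhenConnected` is a hypothetical helper (not part of LangFlux) and `socket` / `query` are the objects defined above:

```javascript
// Hypothetical helper (not part of LangFlux): resolve the query only
// once the socket is connected, so socket.id is a valid client ID.
function queryWhenConnected(socket, query, question) {
  return new Promise((resolve) => {
    const send = () =>
      resolve(query({ question, socketIOClientId: socket.id }));
    if (socket.connected) send();       // already connected: send now
    else socket.once('connect', send);  // otherwise wait for connect
  });
}
```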
Listen to the token stream:

```javascript
socket.on('start', () => {
  console.log('start');
});

socket.on('token', (token) => {
  console.log('token:', token);
});

socket.on('sourceDocuments', (sourceDocuments) => {
  console.log('sourceDocuments:', sourceDocuments);
});

socket.on('end', () => {
  console.log('end');
});
```
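Each `token` event carries only a small text fragment, so the front end typically accumulates them into the full reply as they arrive. A minimal sketch, where `makeTokenBuffer` is a hypothetical helper (not part of the LangFlux API):

```javascript
// Hypothetical helper (not part of LangFlux): collect streamed tokens
// into the complete response text.
function makeTokenBuffer() {
  let text = '';
  return {
    onToken(token) { text += token; },  // append each streamed fragment
    result() { return text; }           // full reply assembled so far
  };
}

// Wire it to the listeners above, e.g.:
// const buffer = makeTokenBuffer();
// socket.on('token', buffer.onToken);
```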
Disconnect the connection when you are done:

```javascript
socket.disconnect();
```