LocalAI Embeddings
LocalAI Setup
LocalAI is a drop-in replacement REST API that is compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs (and other models) locally or on-prem with consumer-grade hardware, supporting multiple model families compatible with the ggml format.
To use LocalAI Embeddings within LangFlux, follow the steps below:
git clone https://github.com/go-skynet/LocalAI
cd LocalAI
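With the repository cloned, start LocalAI. One common route is Docker Compose (this assumes Docker is installed and uses the compose file shipped in the repository; exact flags may vary by version):
docker compose up -d --pull always
Once the container is running, the API listens on http://localhost:8080 by default.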
LocalAI provides an API endpoint to download and install models. In this example, we are going to use the BERT Embeddings model:
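A request along the following lines applies the model from the LocalAI model gallery. The gallery URL and the registered name are illustrative (here the model is exposed under the OpenAI-compatible name text-embedding-ada-002); check the LocalAI docs for the exact entry for your version:
curl http://localhost:8080/models/apply -H "Content-Type: application/json" -d '{
"url": "github:go-skynet/model-gallery/bert-embeddings.yaml",
"name": "text-embedding-ada-002"
}'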

In the /models folder, you should see the downloaded model:
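For example (the file names shown are illustrative and depend on the model you applied):
ls models
# bert-MiniLM-L6-v2-q4_0.bin
# text-embedding-ada-002.yaml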

You can now test the embeddings:
curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
"input": "Test",
"model": "text-embedding-ada-002"
}'
The response should look like:
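A sketch of the OpenAI-compatible response shape (the embedding vector is truncated here, and the values are placeholders):
{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "index": 0,
      "embedding": [0.0059741, -0.0171814, ...]
    }
  ],
  "model": "text-embedding-ada-002",
  "usage": { "prompt_tokens": 0, "total_tokens": 0 }
}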

LangFlux Setup
Drag and drop a new LocalAIEmbeddings component onto the canvas:

Fill in the fields:
Base Path: The base URL of LocalAI, such as http://localhost:8080/v1
Model Name: The model you want to use. Note that it must be inside the /models folder of the LocalAI directory. For instance: text-embedding-ada-002
That's it! For more information, refer to the LocalAI docs.