LocalAI Embeddings


LocalAI Setup

LocalAI is a drop-in replacement REST API that's compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs (and not only LLMs) locally or on-prem with consumer-grade hardware, supporting multiple model families that are compatible with the ggml format.

To use LocalAI Embeddings within LangFlux, follow the steps below:

  1. git clone https://github.com/go-skynet/LocalAI
  2. cd LocalAI
  3. LocalAI provides an API endpoint to download/install the model. In this example, we are going to use the BERT Embeddings model, as shown in the sketch below:
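A request sketch for installing the model through LocalAI's model gallery endpoint; the id value is an assumption based on LocalAI's public model gallery and may differ across LocalAI releases:

curl http://localhost:8080/models/apply -H "Content-Type: application/json" -d '{
    "id": "model-gallery@bert-embeddings"
  }'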

  4. In the /models folder, you should be able to see the downloaded model:
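You can also verify the installation through LocalAI's OpenAI-compatible model listing endpoint; the installed model should appear in the response:

curl http://localhost:8080/v1/models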

  5. You can now test the embeddings:

curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
    "input": "Test",
    "model": "text-embedding-ada-002"
  }'
  6. The response should look like this:
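A representative OpenAI-style embeddings response is sketched below; the actual vector values and dimensionality depend on the model, and the embedding array is truncated here for brevity:

{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "index": 0,
      "embedding": [0.0062, -0.0133, 0.0281, ...]
    }
  ],
  "model": "text-embedding-ada-002",
  "usage": {
    "prompt_tokens": 0,
    "total_tokens": 0
  }
}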

LangFlux Setup

Drag and drop a new LocalAIEmbeddings component onto the canvas:

Fill in the fields:

  • Model Name: The model you want to use. Note that it must be inside the /models folder of the LocalAI directory. For instance: text-embedding-ada-002
  • Base Path: The base URL of LocalAI, such as http://localhost:8080/v1
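These two fields map directly onto the OpenAI-compatible request the component sends: Model Name becomes the model field of the payload, and Base Path is the URL prefix. With the example values above, the equivalent raw call would be (the input text is illustrative):

curl http://localhost:8080/v1/embeddings -H "Content-Type: application/json" -d '{
    "input": "Hello from LangFlux",
    "model": "text-embedding-ada-002"
  }'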

That's it! For more information, refer to the LocalAI docs.
