Embeddings

Overview

Embedding models convert text into numerical vectors (embeddings) that capture semantic meaning. These vectors enable applications such as semantic search, text clustering, and similarity analysis. The Xalora LLM Gateway exposes embeddings through the same interface as the OpenAI SDK, so existing OpenAI client code works with only a change of API key and base URL.

Embeddings are particularly useful for:

  • Finding similar text content

  • Document clustering and classification

  • Retrieval Augmented Generation (RAG)
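All of these applications reduce to comparing vectors, most commonly with cosine similarity. The sketch below ranks a few "documents" against a query; the three-dimensional vectors are hand-made stand-ins for illustration only — real embeddings from the gateway have many more dimensions:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings of three documents.
documents = {
    "doc_cats": [0.9, 0.1, 0.0],
    "doc_dogs": [0.8, 0.2, 0.1],
    "doc_stocks": [0.0, 0.1, 0.9],
}

# Stand-in for the embedded query text.
query = [0.85, 0.15, 0.05]

# Rank documents by similarity to the query, highest first.
ranked = sorted(documents,
                key=lambda d: cosine_similarity(query, documents[d]),
                reverse=True)
print(ranked[0])  # → doc_cats
```

In a real pipeline, each vector would come from the embeddings endpoint shown in the example below rather than being written by hand.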

Example Usage

Here’s a simple example showing how to generate embeddings using the Xalora API:

from openai import OpenAI

# Point the standard OpenAI client at the Xalora LLM Gateway.
client = OpenAI(
    api_key="your_user_id#your_api_key",  # user ID and API key joined by '#'
    base_url="https://llm-gateway.xalora.xyz"
)

embeddings = client.embeddings.create(
    model="BAAI/bge-large-en-v1.5",
    input="Hello, world!",
    encoding_format="float"
)

# The embedding vector for the input string.
print(embeddings.data[0].embedding)

print("Prompt tokens used:", embeddings.usage.prompt_tokens)
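Downstream code often compares vectors with a plain dot product, which equals cosine similarity only when the vectors have unit length. If you want that shortcut, L2-normalize the returned vectors first. A minimal helper over plain Python lists (the `[3.0, 4.0]` vector is just an illustration; in practice you would pass `embeddings.data[0].embedding`):

```python
import math

def l2_normalize(vec):
    # Scale vec to unit length so that dot product == cosine similarity.
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# Illustration with a small vector; |(3, 4)| = 5.
v = l2_normalize([3.0, 4.0])
print(v)  # → [0.6, 0.8]
```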