Xalora LLM Gateway

Access open-source LLMs in Xalora instantly using the familiar OpenAI SDK: simply replace the base URL in the OpenAI client libraries with https://llm-gateway.xalora.xyz.

Key Features

  • OpenAI Compatibility: Xalora LLM Gateway is designed to be fully compatible with the OpenAI SDK, allowing you to use the same familiar interface to interact with decentralized LLMs.

  • Decentralized AI at Low Cost: Leverage the benefits of decentralized computing to access powerful LLMs while significantly reducing operational costs.

  • Simple Integration: You can start using Xalora Gateway with just 3 lines of code. No complicated setup is required (see the minimal setup sketch after this list).

  • Streaming Responses: Get real-time results from LLMs with built-in support for streaming responses, enabling interactive experiences like chatbots and live content generation.

  • Open-Source Ecosystem: Xalora supports a wide range of open-source LLMs, allowing you to choose the best model for your use case, from conversational agents to code generation.
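
As a quick illustration of the OpenAI compatibility and the "3 lines of code" claim above, here is a minimal setup sketch in Python. It assumes your API key is stored in a XALORA_API_TOKEN environment variable, as in the full examples below.

import os

from openai import OpenAI

# The only change from a stock OpenAI client setup is the base URL.
client = OpenAI(
    api_key=os.environ["XALORA_API_TOKEN"],
    base_url="https://llm-gateway.xalora.xyz",
)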

Getting Started

To get started with the Xalora LLM Gateway, you’ll first need an API key from Xalora. Once you have your key, integrating it into your application takes just a few lines of code.

Here’s how you can use the Xalora LLM Gateway in different languages.

Node

import OpenAI from "openai";

// Point the OpenAI client at the Xalora LLM Gateway instead of api.openai.com.
const openai = new OpenAI({
  apiKey: process.env["XALORA_API_TOKEN"],
  baseURL: "https://llm-gateway.xalora.xyz",
});

// Request a streamed chat completion from an open-source model.
const completions = await openai.chat.completions.create({
  model: "mistralai/mixtral-8x7b-instruct",
  messages: [
    {
      role: "user",
      content: "Write rap lyrics about Solana",
    },
  ],
  max_tokens: 64,
  stream: true,
});

// Print tokens to stdout as they arrive.
for await (const part of completions) {
  process.stdout.write(part.choices[0]?.delta?.content || "");
}
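
The same request in Python uses the official OpenAI Python SDK. The snippet below is a minimal sketch, assuming the same XALORA_API_TOKEN environment variable and model as the Node example.

Python

import os

from openai import OpenAI

# Point the OpenAI client at the Xalora LLM Gateway instead of api.openai.com.
client = OpenAI(
    api_key=os.environ["XALORA_API_TOKEN"],
    base_url="https://llm-gateway.xalora.xyz",
)

# Request a streamed chat completion from an open-source model.
stream = client.chat.completions.create(
    model="mistralai/mixtral-8x7b-instruct",
    messages=[{"role": "user", "content": "Write rap lyrics about Solana"}],
    max_tokens=64,
    stream=True,
)

# Print tokens to stdout as they arrive.
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")

Drop stream=True to receive the full response in a single object and read it from .choices[0].message.content.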

Xalora LLM Gateway integrates seamlessly with the official OpenAI libraries. You can simply replace the baseURL in your client configuration to use Xalora’s decentralized models, gaining low-cost and uncensored AI capabilities. Refer to the language-specific OpenAI SDK docs for endpoint configuration.
