SDKs & Libraries

LLM Hub is compatible with all OpenAI SDKs. Simply change the base URL to start using our API with your preferred language.

No Special SDK Required

LLM Hub implements the OpenAI API specification, so you can use the official OpenAI SDK for your language. Just set the base URL to https://api.llmhub.one/v1 and use your LLM Hub API key.

Available SDKs

JavaScript / TypeScript

Official
Package: openai

Installation

Bash
npm install openai

Configuration

JavaScript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.llmhub.one/v1',
  apiKey: process.env.LLMHUB_API_KEY
});

Python

Official
Package: openai

Installation

Bash
pip install openai

Configuration

Python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.llmhub.one/v1",
    api_key="your-api-key"
)

Go

Official
Package: github.com/openai/openai-go

Installation

Bash
go get github.com/openai/openai-go

Configuration

Go
import (
    "os"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

client := openai.NewClient(
    option.WithBaseURL("https://api.llmhub.one/v1"),
    option.WithAPIKey(os.Getenv("LLMHUB_API_KEY")),
)

PHP

Community
Package: openai-php/client

Installation

Bash
composer require openai-php/client

Configuration

PHP
use OpenAI;

$client = OpenAI::factory()
    ->withBaseUri('https://api.llmhub.one/v1')
    ->withApiKey($_ENV['LLMHUB_API_KEY'])
    ->make();

Ruby

Community
Package: ruby-openai

Installation

Bash
gem install ruby-openai

Configuration

Ruby
require "openai"

client = OpenAI::Client.new(
  uri_base: "https://api.llmhub.one/v1",
  access_token: ENV["LLMHUB_API_KEY"]
)

C# / .NET

Official
Package: OpenAI

Installation

Bash
dotnet add package OpenAI

Configuration

C#
using System;
using System.ClientModel;
using OpenAI;

var client = new OpenAIClient(
    new ApiKeyCredential(Environment.GetEnvironmentVariable("LLMHUB_API_KEY")),
    new OpenAIClientOptions {
        Endpoint = new Uri("https://api.llmhub.one/v1")
    });

Java / Kotlin

Official
Package: com.openai:openai-java

Installation

Gradle
implementation("com.openai:openai-java:<version>")  // replace <version> with the latest release; Maven coordinates are the same

Configuration

Java
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;

OpenAIClient client = OpenAIOkHttpClient.builder()
    .baseUrl("https://api.llmhub.one/v1")
    .apiKey(System.getenv("LLMHUB_API_KEY"))
    .build();

Rust

Community
Package: async-openai

Installation

Bash
cargo add async-openai

Configuration

Rust
use async_openai::{Client, config::OpenAIConfig};

let config = OpenAIConfig::new()
    .with_api_base("https://api.llmhub.one/v1")
    .with_api_key(std::env::var("LLMHUB_API_KEY")?);

let client = Client::with_config(config);

Direct HTTP / REST

If there's no SDK for your language, you can make direct HTTP requests:

Bash
curl https://api.llmhub.one/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLMHUB_API_KEY" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

API Endpoints

POST /v1/chat/completions — Chat & completions
POST /v1/images/generations — Image generation
POST /v1/embeddings — Text embeddings
GET /v1/models — List available models
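Since the API is plain HTTP, the curl request above maps directly onto any language's HTTP stack. A sketch of the same chat request using only Python's standard library (the key is a placeholder; sending the request requires a real one):

```python
import json
import os
import urllib.request

API_BASE = "https://api.llmhub.one/v1"
API_KEY = os.environ.get("LLMHUB_API_KEY", "your-api-key")

# Build the same request body the curl example sends.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{API_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# Sending it (needs a valid key):
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])
```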

Framework Integrations

LLM Hub works with popular AI frameworks:

LangChain

Use our endpoint with LangChain's ChatOpenAI:

Python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.llmhub.one/v1",
    api_key="your-key",
    model="gpt-4o"
)

Vercel AI SDK

Works with the OpenAI provider:

TypeScript
import { createOpenAI } from '@ai-sdk/openai';

const llmhub = createOpenAI({
  baseURL: 'https://api.llmhub.one/v1',
  apiKey: process.env.LLMHUB_API_KEY
});

LlamaIndex

Configure as an OpenAI-compatible endpoint:

Python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    api_base="https://api.llmhub.one/v1",
    api_key="your-key",
    model="gpt-4o"
)

AutoGen

Use in your AutoGen config:

Python
config_list = [{
    "model": "gpt-4o",
    "base_url": "https://api.llmhub.one/v1",
    "api_key": "your-key"
}]

Next Steps