API Integration
OpenLedger enables seamless interaction with custom-trained large language models through secure API endpoints and a flexible chat interface. This guide outlines how developers can access, authenticate, and manage their AI agents via the OpenLedger proxy infrastructure.
Python Integration

This example demonstrates how to connect to an OpenLedger-hosted model using the OpenAI Python client.
Usage
Set the base_url to your OpenLedger proxy endpoint.
Provide your api_key for authorization.
Specify the full model path, including adapter and version.
This method is recommended for backend services or scripts using Python.
Curl

This is a raw HTTP example using curl for environments where SDKs are not preferred.
Usage
Define the POST request with headers for Content-Type and Authorization.
Include your model path and message payload directly in the request body.
This method is useful for testing, automation scripts, and CLI environments.
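A sketch of the raw request is shown below. The endpoint, API key, and model path are placeholder assumptions; the actual `curl` call is left commented out so you can fill in real values before sending anything.

```shell
# Hypothetical endpoint, key, and model path -- replace with your own values.
BASE_URL="https://proxy.openledger.example/v1"
API_KEY="YOUR_OPENLEDGER_API_KEY"
MODEL="org/base-model:adapter:v1"

# POST a chat message to the proxy and print the JSON response.
send_chat() {
  curl -s -X POST "$BASE_URL/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $API_KEY" \
    -d "{
          \"model\": \"$MODEL\",
          \"messages\": [{\"role\": \"user\", \"content\": \"$1\"}]
        }"
}

# send_chat "Hello!"   # uncomment to send a live request
```

The body mirrors the OpenAI chat completions payload: a `model` field carrying the full model path and a `messages` array with role/content pairs.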
JavaScript / Node.js Integration

Use the OpenAI client library for JavaScript to integrate OpenLedger-hosted models in frontend or Node.js environments.
Usage
Initialize the client with your API key and baseURL.
Call the chat.completions.create() method with the model path and user input.
Fully async/await compatible.
Ideal for web applications, bots, or services requiring browser-compatible interaction.
Completion
With OpenLedger, users can:
Build and contribute to Datanets
Train and deploy models
Interact and earn through tokenized chat
Guide ecosystem direction via governance
All actions are on-chain, ensuring verifiability, transparency, and community ownership across the AI-data lifecycle.