Getting Started as an Inference Provider

This guide walks you through setting up your organization as an Inference Provider in 15 minutes.

Prerequisites

Before starting, make sure you have:

  • A LangMart account (sign up at https://langmart.ai)
  • At least one provider API key (OpenAI, Anthropic, Groq, etc.)
  • An understanding of your billing model (org-pays vs. member-pays)

Quick Start Overview

| Step | Time | Description |
| --- | --- | --- |
| 1. Create Organization | 2 min | Set up your provider organization |
| 2. Add Connections | 5 min | Configure provider API keys |
| 3. Configure Billing | 3 min | Set up payment model |
| 4. Invite Members | 3 min | Add your first users |
| 5. Test Access | 2 min | Verify everything works |

Step 1: Create Your Organization

Via Web Interface

  1. Log in to your LangMart account
  2. Navigate to Organizations in the sidebar
  3. Click Create Organization
  4. Fill in the details:
| Field | Example | Notes |
| --- | --- | --- |
| Name | Acme AI Services | Visible to members |
| Description | AI model access for Acme teams | Optional |
| Billing Email | [email protected] | For invoices |
| Website | https://acme.com | Optional |
  5. Click Create

Via API

curl -X POST https://api.langmart.ai/api/organizations \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Acme AI Services",
    "description": "AI model access for Acme teams",
    "billing_email": "[email protected]"
  }'
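
Later steps reference the new organization's ID as <org_id> in API paths. Assuming the create call returns the organization as JSON with an "id" field (the response shape is not shown in this guide), you can capture it for reuse:

# Capture the new organization ID for later calls (assumes the response
# includes an "id" field and that jq is installed)
ORG_ID=$(curl -s -X POST https://api.langmart.ai/api/organizations \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{"name": "Acme AI Services", "billing_email": "[email protected]"}' \
  | jq -r '.id')
echo "Created organization: $ORG_ID"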

Step 2: Add Provider Connections

Connections define how your organization accesses AI model providers.

Adding Your First Connection

  1. Go to Connections page
  2. Click Add Connection
  3. Select provider (e.g., OpenAI)
  4. Configure:
| Field | Value | Description |
| --- | --- | --- |
| Name | Team OpenAI | Descriptive name |
| Provider | OpenAI | Model provider |
| API Key | sk-... | Your provider API key |
| Scope | Organization | Share with all members |
| Billing Mode | Org Pays | Organization covers costs |
  5. Click Test Connection to verify
  6. Click Save

For a full-featured provider setup, add connections for:

| Provider | Models | Use Case |
| --- | --- | --- |
| OpenAI | GPT-4o, GPT-4-turbo | General purpose, coding |
| Anthropic | Claude 3.5 Sonnet | Long context, analysis |
| Groq | Llama 3.3 70B | Fast inference, cost-effective |

Via API

curl -X POST https://api.langmart.ai/api/connections \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Team OpenAI",
    "provider_id": "openai",
    "endpoint_url": "https://api.openai.com/v1",
    "api_key": "sk-your-openai-key",
    "scope": "organization",
    "billing_mode": "org_pays"
  }'

Step 3: Configure Billing

Add Organization Credits

For org-pays connections, you need credits:

  1. Go to Organization > Billing
  2. Click Add Credits
  3. Enter amount (e.g., $100)
  4. Complete payment

Set Spending Limits

Prevent runaway costs:

# Set organization monthly limit
curl -X PUT https://api.langmart.ai/api/organizations/<org_id> \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "spending_limit_monthly": 1000.00
  }'

Configure Default Member Limits

New members automatically get these limits (see the API sketch after the table):

| Limit Type | Recommended Start | Adjust Based On |
| --- | --- | --- |
| Daily | $5 | Member's typical usage |
| Monthly | $50 | Department budget |
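
If the organization update endpoint shown above also accepts default member limit fields, these values can be set via the API. The field names below are assumptions, not confirmed, so check your API reference for the exact keys:

# Sketch: set default member limits (field names are assumed)
curl -X PUT https://api.langmart.ai/api/organizations/<org_id> \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "default_member_limit_daily": 5.00,
    "default_member_limit_monthly": 50.00
  }'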

Step 4: Invite Members

Enable Join Requests

Allow users to request access (see the API sketch after these steps):

  1. Go to Organization > Settings
  2. Enable Allow Join Requests
  3. Optionally enable Auto-Approve for trusted domains
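
These settings can likely also be toggled through the organization update endpoint used earlier; the field names in this sketch are assumptions and may differ in your deployment:

# Sketch: enable join requests via the API (field names are assumed)
curl -X PUT https://api.langmart.ai/api/organizations/<org_id> \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "allow_join_requests": true,
    "auto_approve_domains": ["acme.com"]
  }'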

Direct Invitations

Invite specific users:

  1. Go to Organization > Members
  2. Click Invite Member
  3. Enter email address
  4. Select role:
| Role | Permissions |
| --- | --- |
| Member | Use models, view own usage |
| Admin | Manage members, view all usage |
| Owner | Full control, billing access |
  5. Click Send Invitation

Via API

curl -X POST https://api.langmart.ai/api/organizations/<org_id>/members/invite \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "email": "[email protected]",
    "role": "member"
  }'

Step 5: Test Member Access

Test as Admin

Verify models are accessible:

# Using your admin API key
curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Hello, testing!"}]
  }'

Test as Member

  1. Have a member log in
  2. Go to Chat interface
  3. Select an organization model
  4. Send a test message
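
Members can also verify access from the command line. Assuming each member has their own LangMart API key, the same chat completions request from the admin test works with the member's key:

# Run the same test with a member's own API key
curl -X POST https://api.langmart.ai/v1/chat/completions \
  -H "Authorization: Bearer <member-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Hello from a member account!"}]
  }'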

Verify Usage Tracking

After tests, check:

  1. Organization > Usage - See aggregate usage
  2. Analytics > Request Logs - View individual requests
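
If your deployment also exposes usage data over the API, this check can be scripted. The endpoint path below is an assumption modeled on the other organization routes in this guide, so confirm it against your API reference:

# Sketch: fetch aggregate organization usage (endpoint path is assumed)
curl -X GET https://api.langmart.ai/api/organizations/<org_id>/usage \
  -H "Authorization: Bearer <your-api-key>"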

Configuration Summary

After completing setup, you should have:

| Component | Status | Notes |
| --- | --- | --- |
| Organization | Created | With name and billing email |
| Connections | 1+ added | At least one org-scoped connection |
| Credits | Funded | For org-pays connections |
| Limits | Set | Organization and member limits |
| Members | Invited | At least one test member |

Next Steps

Expand Your Service

  • Add more connections: Support additional providers
  • Create connection pools: High availability setup
  • Configure PII detection: Enable data protection

Manage at Scale

  • Bulk invitations: Import member lists
  • Usage reports: Export for billing/compliance
  • Alerts: Set up spending notifications

Troubleshooting

Connection Test Failed

| Issue | Solution |
| --- | --- |
| Invalid API key | Verify key is correct and has permissions |
| Rate limited | Wait and retry, or check provider limits |
| Network error | Check endpoint URL is correct |

Member Can't Access Models

| Issue | Solution |
| --- | --- |
| No organization models | Verify connection scope is "organization" |
| Exceeded limit | Check and increase member limits |
| Not approved | Approve the member's join request |

Credits Not Deducted

| Issue | Solution |
| --- | --- |
| Using personal credits | Check connection billing mode is "org_pays" |
| Member billing mode | Some connections may be member-pays |

Support

Need help? Contact us: