Helicone Integration

Integrate TealTiger with Helicone to combine governance decisions with LLM proxy monitoring and cost tracking.

Why integrate TealTiger with Helicone?

Helicone provides an LLM proxy with observability and cost tracking. Adding TealTiger gives you:
  • Governance + monitoring - Policy decisions alongside LLM metrics
  • Cost attribution - Separate governance costs from LLM costs
  • Compliance evidence - Audit trails in Helicone
  • Unified dashboard - All AI operations in one place

Quick start

Install both packages (npm for Node.js, or pip for Python):
npm install tealtiger helicone
# or
pip install tealtiger helicone
Configure TealTiger to export to Helicone:
import { TealTiger } from 'tealtiger';
import { HeliconeLogger } from 'helicone';

// Initialize Helicone
const helicone = new HeliconeLogger({
  apiKey: process.env.HELICONE_API_KEY
});

// Configure TealTiger to export to Helicone
const teal = new TealTiger({
  policies: { /* ... */ },
  telemetry: {
    helicone: {
      enabled: true,
      client: helicone,
      exportDecisions: true,
      exportCosts: true
    }
  }
});

// Use TealTiger - decisions automatically exported
const decision = await teal.evaluate(request);
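Once a decision comes back, your application decides how to act on it. The sketch below shows that branching logic as a plain function; the decision shape (`action`, `riskScore`) is an assumption based on the properties this guide shows being exported, not a confirmed SDK type:

```typescript
// Illustrative decision shape -- field names are assumptions based on the
// Helicone properties shown in this guide, not a confirmed TealTiger type.
interface GovernanceDecision {
  action: 'ALLOW' | 'DENY';
  riskScore: number;
  reasonCodes: string[];
}

// Decide whether to forward the request to the LLM.
function shouldProceed(decision: GovernanceDecision): boolean {
  // Hard denials always block the call.
  if (decision.action === 'DENY') return false;
  // Optionally apply a stricter local threshold on top of the policy.
  return decision.riskScore <= 80;
}
```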

What gets exported?

TealTiger exports governance data as Helicone custom properties:

Decision properties

{
  "Helicone-Property-TealTiger-Decision": "DENY",
  "Helicone-Property-TealTiger-ReasonCodes": "TOOL_NOT_ALLOWED",
  "Helicone-Property-TealTiger-RiskScore": "85",
  "Helicone-Property-TealTiger-PolicyId": "security.tool_access.v1",
  "Helicone-Property-TealTiger-Mode": "ENFORCE"
}
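Helicone custom properties are carried as `Helicone-Property-*` request headers, so the mapping above can be sketched as a plain function. The `Decision` field names here are assumptions derived from the property names shown, not TealTiger's actual internal type:

```typescript
// Assumed decision fields, mirroring the exported property names above.
interface Decision {
  action: string;
  reasonCodes: string[];
  riskScore: number;
  policyId: string;
  mode: string;
}

// Map a governance decision to Helicone custom-property headers.
// Helicone treats any "Helicone-Property-<Name>" header as a custom property.
function toHeliconeProperties(d: Decision): Record<string, string> {
  return {
    'Helicone-Property-TealTiger-Decision': d.action,
    'Helicone-Property-TealTiger-ReasonCodes': d.reasonCodes.join(','),
    'Helicone-Property-TealTiger-RiskScore': String(d.riskScore),
    'Helicone-Property-TealTiger-PolicyId': d.policyId,
    'Helicone-Property-TealTiger-Mode': d.mode,
  };
}
```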

Cost properties

{
  "Helicone-Property-TealTiger-Cost": "0.001",
  "Helicone-Property-TealTiger-TokensInput": "1000",
  "Helicone-Property-TealTiger-TokensOutput": "500"
}
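Because governance cost is exported separately from LLM cost, you can compute the governance overhead ratio yourself. A minimal sketch, assuming the LLM cost comes from Helicone's own tracking and property values arrive serialized as strings, as shown above:

```typescript
// Compute governance overhead as a fraction of total spend.
// llmCost comes from Helicone's cost tracking; governanceCost from the
// TealTiger-Cost custom property (both serialized as strings).
function governanceOverhead(llmCost: string, governanceCost: string): number {
  const llm = parseFloat(llmCost);
  const gov = parseFloat(governanceCost);
  const total = llm + gov;
  return total === 0 ? 0 : gov / total;
}
```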

Complete example with OpenAI

import { TealTiger } from 'tealtiger';
import { Configuration, OpenAIApi } from 'openai';

// Initialize TealTiger with Helicone export
const teal = new TealTiger({
  policies: {
    budget: {
      maxCostPerRequest: 0.50,
      maxCostPerDay: 100.00
    },
    security: {
      detectPII: true,
      redactPII: true
    }
  },
  telemetry: {
    helicone: {
      enabled: true,
      apiKey: process.env.HELICONE_API_KEY,
      exportDecisions: true,
      exportCosts: true
    }
  }
});

// Configure OpenAI with Helicone proxy
const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
  basePath: 'https://oai.hconeai.com/v1',
  baseOptions: {
    headers: {
      'Helicone-Auth': `Bearer ${process.env.HELICONE_API_KEY}`
    }
  }
});

const openai = new OpenAIApi(configuration);

// Wrap OpenAI with TealTiger
const governedOpenAI = teal.wrap(openai);

// Make request (governance + Helicone monitoring)
const response = await governedOpenAI.createChatCompletion({
  model: 'gpt-4',
  messages: [
    { role: 'user', content: 'Hello!' }
  ]
});
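The `teal.wrap(openai)` step above can be illustrated with a generic higher-order function. This is a sketch of the wrapping pattern under stated assumptions, not TealTiger's actual implementation; `evaluate` stands in for `teal.evaluate`:

```typescript
// Assumed shape of an evaluation function (stand-in for teal.evaluate).
type Evaluate<Req> = (req: Req) => Promise<{
  action: 'ALLOW' | 'DENY';
  reasonCodes: string[];
}>;

// Wrap any request-making function so every call is evaluated first.
// Denied requests never reach the underlying client.
function withGovernance<Req, Res>(
  evaluate: Evaluate<Req>,
  call: (req: Req) => Promise<Res>
): (req: Req) => Promise<Res> {
  return async (req: Req) => {
    const decision = await evaluate(req);
    if (decision.action === 'DENY') {
      throw new Error(`Blocked by policy: ${decision.reasonCodes.join(', ')}`);
    }
    return call(req);
  };
}
```

With this pattern, a denied request surfaces as a thrown error before any tokens are spent, while allowed requests pass through unchanged.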

Viewing in Helicone

TealTiger decisions appear in Helicone:
  1. Request view - See governance decisions as custom properties
  2. Filtering - Filter by TealTiger decision action
  3. Cost tracking - Separate governance costs from LLM costs
  4. Dashboards - Create charts for policy decisions
  5. Alerts - Get notified of policy violations

Creating Helicone dashboards

Dashboard 1: Policy decisions

Filter requests by:
  • TealTiger-Decision = DENY - Blocked requests
  • TealTiger-Decision = ALLOW - Allowed requests
  • TealTiger-RiskScore > 80 - High-risk requests
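The same filters can be expressed in code if you export request logs with the TealTiger properties attached. A sketch over plain records (the record shape is an assumption, not a Helicone API type):

```typescript
// Assumed record shape: one logged request with its custom properties.
interface RequestRecord {
  properties: Record<string, string>;
}

// Select requests matching a custom-property filter,
// mirroring the dashboard filters above.
function filterByProperty(
  records: RequestRecord[],
  key: string,
  predicate: (value: string) => boolean
): RequestRecord[] {
  return records.filter((r) => key in r.properties && predicate(r.properties[key]));
}

// High-risk requests: TealTiger-RiskScore > 80
const highRisk = (records: RequestRecord[]) =>
  filterByProperty(records, 'Helicone-Property-TealTiger-RiskScore', (v) => parseFloat(v) > 80);
```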

Dashboard 2: Cost breakdown

Track costs:
  • Total LLM cost (from Helicone)
  • Total governance cost (from TealTiger)
  • Cost per decision action

Dashboard 3: Compliance

Monitor compliance:
  • Requests with PII detected
  • Requests with PII redacted
  • Policy violations by type

Best practices

  1. Use Helicone proxy - Route all LLM calls through Helicone
  2. Export costs - Track governance overhead separately
  3. Create dashboards - Visualize policy decisions
  4. Set up alerts - Get notified of violations
  5. Filter by properties - Use TealTiger properties for analysis

Next steps