

1. Create an account

Sign up at cognisafe.uk. The free tier includes 10,000 requests per month — no credit card required.
2. Install the SDK

pip install cognisafe
For Anthropic or other providers, install the extras:
pip install "cognisafe[anthropic]"   # + Anthropic
pip install "cognisafe[all]"         # all providers
3. Get your API key

In the dashboard: API Keys → Create key. Keys look like:
csk_prod_a1b2c3d4e5f6...
Store it as an environment variable — never hard-code it in source:
export COGNISAFE_API_KEY="csk_prod_a1b2c3d4e5f6..."
export COGNISAFE_PROJECT_ID="my-app"
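If you want your app to fail fast when these variables are missing, a small startup check works well. This is an illustrative helper, not part of the SDK; the variable names match the export lines above, and the setdefault calls exist only so the snippet runs standalone:

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or raise with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# For illustration only: in a real deployment these come from your shell,
# CI secrets, or a secrets manager -- never from source code.
os.environ.setdefault("COGNISAFE_API_KEY", "csk_prod_example")
os.environ.setdefault("COGNISAFE_PROJECT_ID", "my-app")

api_key = require_env("COGNISAFE_API_KEY")
project_id = require_env("COGNISAFE_PROJECT_ID")
```

Catching a missing key at import time beats debugging an authentication error deep inside a request handler.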
4. Configure and patch

Call cognisafe.configure() once at startup, then patch your provider:
import cognisafe

cognisafe.configure(
    api_key="csk_your_key_here",   # or reads COGNISAFE_API_KEY
    project_id="my-app",           # or reads COGNISAFE_PROJECT_ID
)

cognisafe.patch_openai()  # rewrites openai.base_url to the Cognisafe proxy
For TypeScript / Node.js:
import cognisafe from 'cognisafe';

cognisafe.configure({ apiKey: 'csk_your_key_here', projectId: 'my-app' });
cognisafe.patchOpenAI();
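To make the patching step less magical, here is the general mechanism in miniature. This is a conceptual sketch, not the SDK's actual source: the stand-in client class and the proxy URL are invented for illustration. The idea is simply that patching rewrites the client's base URL so every request flows through the proxy, which logs it and forwards it to the provider:

```python
# Conceptual sketch only -- not the SDK's implementation.
class FakeOpenAIClient:
    """Stand-in for a provider client, to illustrate the mechanism."""
    def __init__(self) -> None:
        self.base_url = "https://api.openai.com/v1"

PROXY_URL = "https://proxy.example.invalid/v1"  # hypothetical proxy endpoint

def patch(client: FakeOpenAIClient) -> None:
    # After patching, requests go to the proxy, which logs and forwards them.
    client.base_url = PROXY_URL

client = FakeOpenAIClient()
patch(client)
# client.base_url now points at the proxy; the rest of your code is untouched.
```

Because only the base URL changes, your request/response types, retries, and streaming behavior stay exactly as the provider SDK defines them.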
5. Make an LLM call

Your existing code is unchanged — Cognisafe captures everything automatically:
from openai import OpenAI

client = OpenAI()  # base_url is already pointing to Cognisafe proxy

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
Every call is logged, priced, and queued for safety scoring.
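Pricing works from the token counts the provider returns. The sketch below shows the arithmetic only; the per-million-token rates are placeholders, not Cognisafe's or any provider's actual prices:

```python
# Illustrative only: deriving a per-request cost from token counts.
# The rates below are hypothetical placeholders.
PRICES_PER_1M = {
    "gpt-4o": {"input": 2.50, "output": 10.00},  # USD per 1M tokens (made up)
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost in USD: tokens times the per-token rate for each direction."""
    p = PRICES_PER_1M[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

cost = estimate_cost("gpt-4o", prompt_tokens=12, completion_tokens=9)
```

The `prompt_tokens` and `completion_tokens` values correspond to the `usage` fields on the chat completion response.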
6. View in the dashboard

Open cognisafe.uk/dashboard → Overview. Your request appears within a second, including model, token counts, cost, and latency. Safety scores appear within a few seconds once the async worker processes the job.
Proxy mode (OpenAI, Mistral) adds zero latency overhead — the proxy forwards the response to your app at the same time as it logs the payload. Direct mode (Anthropic, Cohere) wraps the provider client’s create method, which adds one lightweight HTTP call to the Cognisafe API in the background after the response is returned to you.
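The direct-mode pattern described above can be sketched generically. This is not the SDK's code, just the shape of the technique: wrap the provider's `create` so the response is handed back immediately while the logging call runs on a background thread:

```python
# Conceptual sketch of "direct mode": return the response at once,
# fire the log call in the background so the caller never waits on it.
import threading
from typing import Any, Callable

def wrap_create(
    create: Callable[..., Any],
    log: Callable[[Any], None],
) -> Callable[..., Any]:
    def wrapped(*args: Any, **kwargs: Any) -> Any:
        response = create(*args, **kwargs)  # provider call, unchanged
        threading.Thread(target=log, args=(response,), daemon=True).start()
        return response  # returned before logging completes
    return wrapped

# Demo with a fake provider and an in-memory "log":
logged: list[Any] = []
fake_create = lambda **kwargs: {"ok": True, "model": kwargs.get("model")}
patched_create = wrap_create(fake_create, logged.append)
response = patched_create(model="claude-sonnet")
```

The daemon thread means logging failures or slowness can never block or crash the calling request path, which is why direct mode's overhead stays off the critical path.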