Interactive Demo

Try IdentArk in your browser

No signup required. See how agents execute LLM calls without ever touching your API keys.

agent.py (Python)
```python
from identark import ControlPlaneGateway, Message, Role

# Agent holds ZERO credentials --
# only a session token that expires.
gateway = ControlPlaneGateway()

async def ask_agent(question: str):
    response = await gateway.invoke_llm(
        new_messages=[
            Message(role=Role.USER, content=question)
        ]
    )
    return {
        "answer": response.message.content,
        "cost": response.cost_usd,
        "model": response.model,
    }

# The credential (OpenAI API key) is:
# - stored encrypted in the vault
# - fetched per-request by the gateway
# - NEVER exposed to the agent code
```

Zero Secrets in Agent

The agent code above holds no API keys. Credentials are fetched per-request by the gateway and never exposed.
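A minimal sketch of this per-request flow, with a hypothetical in-memory `Vault` and `Gateway` standing in for IdentArk's real components (class and method names here are illustrative, not the actual API):

```python
# Sketch of the per-request credential flow. Vault and Gateway are
# hypothetical stand-ins; IdentArk's real gateway runs server-side.

class Vault:
    """Holds provider credentials; only the gateway may read them."""
    def __init__(self):
        # Stored encrypted in practice; plaintext here only for the sketch.
        self._secrets = {"openai": "sk-live-example"}

    def fetch(self, provider: str) -> str:
        return self._secrets[provider]

class Gateway:
    """Fetches the credential per request; never returns it to the caller."""
    def __init__(self, vault: Vault):
        self._vault = vault

    def invoke(self, provider: str, prompt: str) -> dict:
        api_key = self._vault.fetch(provider)  # exists only in this scope
        # ... call the provider with api_key here ...
        return {"answer": f"echo: {prompt}"}   # response carries no secret

gateway = Gateway(Vault())
result = gateway.invoke("openai", "hello")
assert "sk-" not in str(result)  # agent-visible output contains no key
```

The point of the structure: the key's lifetime is bounded by a single `invoke` call, so agent code never has an object it could log or exfiltrate.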

Encrypted at Rest

Your LLM API keys are stored in an encrypted vault (Infisical), using AES-256-GCM encryption with automatic key rotation.
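For illustration, the AES-256-GCM primitive behaves roughly as below (using the `cryptography` package; the secret ID and rotation note are simplified assumptions, not Infisical's implementation):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit key. Rotation would periodically replace this key
# and re-encrypt stored secrets under the new one.
key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

api_key = b"sk-live-example"    # placeholder secret, not a real key
nonce = os.urandom(12)          # GCM requires a unique 96-bit nonce per encryption
aad = b"secret-id:openai-prod"  # authenticated but unencrypted context

ciphertext = aead.encrypt(nonce, api_key, aad)    # ciphertext + 16-byte auth tag
recovered = aead.decrypt(nonce, ciphertext, aad)  # raises if tampered with
assert recovered == api_key
```

Because GCM is authenticated, any modification of the ciphertext or the associated data causes decryption to fail rather than return garbage.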

Full Audit Trail

Every LLM call is logged with timestamp, tokens, cost, and latency, producing an immutable audit log for compliance.
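The shape of such a log entry might look like the following sketch (field names and the append-only wrapper are illustrative, not IdentArk's actual schema):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: an entry cannot be mutated after logging
class AuditEntry:
    timestamp: str
    model: str
    prompt_tokens: int
    completion_tokens: int
    cost_usd: float
    latency_ms: float

class AuditLog:
    """Append-only log: entries can be added and read, never edited."""
    def __init__(self):
        self._entries: list[AuditEntry] = []

    def record(self, **fields) -> AuditEntry:
        entry = AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(), **fields
        )
        self._entries.append(entry)
        return entry

    def entries(self) -> tuple[AuditEntry, ...]:
        return tuple(self._entries)  # read-only view

log = AuditLog()
log.record(model="gpt-4o", prompt_tokens=12, completion_tokens=40,
           cost_usd=0.0008, latency_ms=420.0)
```

Frozen dataclasses plus a read-only accessor approximate immutability in-process; a production audit trail would additionally persist entries to append-only storage.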

Ready to secure your agents?

Get started with 500 free executions per month. No credit card required.