A drop-in spec that gives any AI coding agent cryptographic proof of every pipeline run. No private keys. No infrastructure. Just Sigstore.
The Problem
Your agent produces files, runs tests, reports success. But there is no audit trail. No way to know if the output was tampered with. No proof for your team, your auditors, or anyone else that the work actually happened the way the agent claims.
How It Works
Every pipeline run produces a tamper-evident attestation bundle. Signed with your identity, logged to a public transparency ledger. The whole chain, from intent to proof, is automated.
When the user requests work, the raw text is SHA-256 hashed (never stored), and the hash is appended to a tamper-evident JSONL chain. Privacy by design: the chain proves a request was made without revealing what was said.
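A minimal sketch of that step. The function name, the chain file layout, and the `prev_entry_hash` linking field are illustrative assumptions, not the spec itself; the idea is that each JSONL entry commits to the hash of the previous line, so editing any earlier entry breaks every hash after it.

```python
import hashlib
import json
import time
from pathlib import Path

def append_intent(chain_path: Path, raw_input: str, spec_hash: str) -> dict:
    """Hash the user's words (never stored) and append a link to the JSONL chain.

    NOTE: illustrative sketch; field names beyond the spec's (prev_entry_hash)
    are assumptions.
    """
    lines = chain_path.read_text().splitlines() if chain_path.exists() else []
    # Chain each entry to the previous line's hash; a zero hash seeds the chain.
    prev_hash = hashlib.sha256(lines[-1].encode()).hexdigest() if lines else "0" * 64
    entry = {
        "rev": len(lines) + 1,
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "raw_input_hash": hashlib.sha256(raw_input.encode()).hexdigest(),
        "spec_hash_after": spec_hash,
        "prev_entry_hash": prev_hash,
    }
    with chain_path.open("a") as f:
        f.write(json.dumps(entry, sort_keys=True) + "\n")
    return entry
```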
After tasks complete, every output file is SHA-256 hashed. The hashes are collected into a manifest, a fingerprint of what the agent produced.
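The manifest step can be sketched in a few lines; the function name and return shape are assumptions chosen to match the artifact entries shown in the bundle below.

```python
import hashlib
from pathlib import Path

def build_manifest(output_dir: Path) -> list[dict]:
    """SHA-256 every produced file into a sorted artifact manifest.

    Illustrative sketch: sorted order makes the manifest deterministic
    for the same set of files.
    """
    artifacts = []
    for path in sorted(output_dir.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            artifacts.append(
                {"path": path.relative_to(output_dir).as_posix(), "sha256": digest}
            )
    return artifacts
```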
A JSON attestation bundle is created with all hashes, timestamps, task metadata, and the intent chain. This is the claim: "these artifacts were produced from this request at this time."
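Assembling the claim is plain JSON construction. A sketch, assuming the field layout shown in "What's in a Bundle" below; the canonical serialization (sorted keys, no whitespace) is an assumption that makes the signed bytes reproducible from the same inputs.

```python
import json
from datetime import datetime, timezone

def make_bundle(spec_hash: str, tasks_hash: str, intent_chain: list[dict],
                artifacts: list[dict], start: str) -> bytes:
    """Assemble the unsigned attestation claim as canonical JSON bytes."""
    bundle = {
        "predicateType": "natural-language-session/v1",
        "predicate": {
            "invocation": {
                "configSource": spec_hash,
                "parameters": tasks_hash,
                "intent_chain": intent_chain,
            },
            "output": {"artifacts": artifacts},
            "timestamp": {
                "start": start,
                "end": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
            },
        },
    }
    # Canonical serialization so identical inputs always sign identical bytes.
    return json.dumps(bundle, sort_keys=True, separators=(",", ":")).encode()
```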
The bundle is signed using Sigstore's OIDC flow. Your GitHub or Google identity, verified in real time. No private keys to manage. No PKI. The signature is ephemeral and tied to your identity at the moment of signing.
The signed attestation is automatically logged to Rekor, a public immutable ledger. Anyone can verify the entry. The timestamp is canonical. You cannot backdate or forge it.
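Verification needs nothing but the log index. A sketch of the public lookup against Rekor's REST API (`/api/v1/log/entries?logIndex=N` is the real endpoint; the function names are ours):

```python
import json
import urllib.request

REKOR_API = "https://rekor.sigstore.dev/api/v1/log/entries"

def rekor_lookup_url(log_index: int) -> str:
    """URL for fetching a logged entry from the public Rekor API."""
    return f"{REKOR_API}?logIndex={log_index}"

def fetch_entry(log_index: int) -> dict:
    """Retrieve the entry; anyone can do this, no credentials needed."""
    with urllib.request.urlopen(rekor_lookup_url(log_index)) as resp:
        return json.load(resp)
```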
What's in a Bundle
Every field traces back to a real cryptographic operation. The intent chain proves what was requested. The artifacts prove what was produced. Rekor proves when it happened.
{
  "predicateType": "natural-language-session/v1",
  "predicate": {
    "invocation": {
      "configSource": "sha256-of-spec-file",          ← what was requested
      "parameters": "sha256-of-task-decomposition",
      "intent_chain": [                               ← tamper-evident request log
        {
          "rev": 1,
          "timestamp": "2026-04-28T12:00:00Z",
          "raw_input_hash": "abc123...",              ← SHA-256 of user's words
          "spec_hash_after": "def456..."
        }
      ]
    },
    "output": {
      "artifacts": [                                  ← what was produced
        {"path": "src/login.py", "sha256": "abc123..."},
        {"path": "tests/test_login.py", "sha256": "def456..."}
      ]
    },
    "timestamp": {
      "start": "2026-04-28T12:00:00Z",                ← when it happened
      "end": "2026-04-28T12:05:00Z"
    },
    "rekor": {
      "logIndex": "1387966928",                       ← public, immutable proof
      "entryUrl": "https://search.sigstore.dev/?logIndex=1387966928"
    }
  }
}
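Anyone holding a bundle like this can re-derive the artifact hashes locally and compare. A sketch of that consumer-side check, assuming the bundle layout above (the function name is ours):

```python
import hashlib
from pathlib import Path

def verify_artifacts(bundle: dict, workdir: Path) -> list[str]:
    """Re-hash each file named in the bundle; return paths that don't match.

    An empty list means every artifact on disk matches what was attested.
    """
    mismatches = []
    for artifact in bundle["predicate"]["output"]["artifacts"]:
        path = workdir / artifact["path"]
        actual = hashlib.sha256(path.read_bytes()).hexdigest() if path.exists() else None
        if actual != artifact["sha256"]:
            mismatches.append(artifact["path"])
    return mismatches
```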
Drop-in Setup
pip install sigstore
cp -r lib/ your-project/lib/
mkdir -p your-project/.state/{attestations,intents}
# Self-test (no signing)
python lib/attest.py --dry-run
# Real attestation
python lib/attest.py path/to/spec.md path/to/tasks/
Tech Stack
Python, the sigstore client library (keyless OIDC signing), Rekor (public transparency log), SHA-256 hashing, and append-only JSONL for the intent chain.
Results
See all runs on the live attestation feed.
Who This Is For
Developers using AI agents who want accountability. Know what your agent actually did, not just what it claimed.
Teams adopting coding agents who need audit trails. Every pipeline run produces verifiable evidence anyone can inspect.
Compliance teams that need to be SOC2, HIPAA, and audit-ready from day one. Cryptographic proof that your AI agent operated within governance. No retroactive paperwork.