diff --git a/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md b/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md
new file mode 100644
index 0000000..26db10d
--- /dev/null
+++ b/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md
@@ -0,0 +1,196 @@
+# IF.EMOTION TRACE PROTOCOL v3.2 — AUDITABLE DEBUGGING (WITHOUT WISHFUL THINKING)
+
+**Alternate title:** Debugging Emotion — the Immutable Flight Recorder
+
+**Subject:** End-to-end traceability, bounded completeness witnessing, and PQ-anchored evidence binding
+
+**Protocol:** IF.TTT (Traceable, Transparent, Trustworthy)
+
+**Version:** 3.2 (Methodology hardening: key separation + monotonic timing + correlation-only client trace)
+
+**Date (UTC):** 2025-12-21
+
+**Status:** AUDIT REQUIRED
+
+**Citation:** `if://whitepaper/emotion/trace-protocol/v3.2`
+
+---
+
+## What this is (and why it matters)
+
+If you run an LLM system in a high-liability environment, you eventually hit the moment where “the logs say” isn’t enough. You need evidence you can hand to someone who does not trust you.
+
+*This is not an observability feature. It’s chain-of-custody.*
+
+This protocol is a practical answer to one question:
+
+Can an external reviewer independently verify what happened from request → retrieval → output, and detect tampering after the fact?
+
+It intentionally separates what we can prove from what we cannot.
+
+---
+
+## Guarantees (and boundaries)
+
+This system provides **integrity** guarantees (tamper-evidence) and **bounded completeness** guarantees (witnessing) within explicit boundaries.
+
+- **Integrity:** the trace timeline is hash-chained; the signed summary binds the final output to a trace head.
+- **Completeness (bounded):** a REQ_SEEN ledger witnesses each request that crosses the backend witness boundary, with a signed per-hour Merkle head.
+- **PQ anchoring (bounded):** Post-quantum signatures apply at registry anchoring time (IF.TTT), not necessarily on every hot-path artifact.
+
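The integrity bullet can be made concrete with a minimal hash-chain sketch. This is illustrative only: the field names `prev_hash` and `event_hash` follow the bundle's conventions, but the exact canonicalization is defined by the shipped verifier, not by this sketch.

```python
import hashlib
import json

def canonical(obj) -> bytes:
    # Stable bytes: sorted keys, no whitespace (canonical-JSON style).
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def chain(events: list[dict]) -> list[dict]:
    # Each event commits to the previous event's hash; the final
    # event_hash is the "trace head" that the signed summary binds.
    prev = "0" * 64
    out = []
    for ev in events:
        body = {**ev, "prev_hash": prev}
        body["event_hash"] = hashlib.sha256(canonical(body)).hexdigest()
        prev = body["event_hash"]
        out.append(body)
    return out

def verify_chain(events: list[dict]) -> bool:
    # Recompute every hash; any edited, dropped, or reordered event
    # changes the head and is detected.
    prev = "0" * 64
    for ev in events:
        body = {k: v for k, v in ev.items() if k != "event_hash"}
        if body.get("prev_hash") != prev:
            return False
        if hashlib.sha256(canonical(body)).hexdigest() != ev["event_hash"]:
            return False
        prev = ev["event_hash"]
    return True
```

This mirrors the shape of `trace_events.jsonl`; the authoritative recomputation lives in `iftrace.py`.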
+The one-sentence boundary statement (non-negotiable):
+
+Integrity starts at the backend witness boundary; completeness is only meaningful at and after that boundary until edge witnessing is cryptographically enforced.
+
+---
+
+## Layered evidence stack (where guarantees live)
+
+```mermaid
+flowchart TB
+ U[User] -->|HTTPS| E[Edge]
+ E --> B[Backend Witness Boundary]
+
+ B --> R[Retrieval]
+ B --> P[Prompt]
+ B --> M[Model]
+ B --> X[Postprocess]
+
+ B --> T1["REQ_SEEN ledger<br/>(hourly JSONL)"]
+ B --> T2["Trace events<br/>(hash chain JSONL)"]
+ B --> T3["Signed summary<br/>(output hash + head attestation)"]
+
+ T1 --> H["Signed Merkle head<br/>(per hour)"]
+ T2 --> S["Trace head<br/>(event_hash)"]
+
+ H --> BUNDLE["Evidence bundle<br/>(tar.gz + manifest)"]
+ S --> BUNDLE
+ T3 --> BUNDLE
+
+ BUNDLE --> REG["Registry anchor<br/>(PQ-hybrid)"]
+ BUNDLE --> MIRROR["Static mirror<br/>(public download)"]
+```
+
+Interpretation:
+
+Integrity starts at the witness boundary; completeness is only meaningful at and after that boundary until edge witnessing is cryptographically enforced.
+
+---
+
+## Evidence inventory (what ships)
+
+| Artifact | File | Claim it supports | Verification |
+|---|---|---|---|
+| Evidence bundle | `emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz` | Portable reproduction | `sha256sum` + verifier |
+| Manifest + checksums | `payload/manifest.json`, `payload/sha256s.txt` | “One check implies contents” | verifier validates per-file SHA256 |
+| Trace event chain | `payload/trace_events.jsonl` | Tamper-evident event ordering | verifier recomputes event hashes |
+| Signed summary | `payload/ttt_signed_record.json` | Binds response hash → trace head | verifier recomputes HMAC signature |
+| REQ_SEEN ledger | `payload/req_seen_<hour>.jsonl` | Bounded completeness | verifier recomputes leaf hashes + Merkle root |
+| REQ_SEEN head | `payload/req_seen_head_<hour>.json` | Signed Merkle head | verifier checks Ed25519 signature |
+| Inclusion proof | `payload/req_seen_inclusion_proof.json` | Proves this trace is in the hour ledger | verifier checks Merkle path |
+| IF.story annex | `payload/if_story.md` and external annex | Human-readable timeline | anchors must reference real `event_hash` |
+| Registry corroboration | `*.ttt_chain_record.json` | PQ-anchored record (when available) | compare content hashes |
+
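The "one check implies contents" row works because the tarball hash covers the manifest, and the manifest covers every payload file. A minimal re-check, sketched here for illustration (the shipped `iftrace.py` verifier remains authoritative):

```python
import hashlib
import json
import tarfile

def verify_bundle(tar_path: str, expected_sha256: str) -> bool:
    # 1) Outer check: one SHA256 covers the whole bundle.
    h = hashlib.sha256()
    with open(tar_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    if h.hexdigest() != expected_sha256:
        return False
    # 2) Inner checks: manifest.json must match every listed payload file.
    with tarfile.open(tar_path, "r:gz") as tf:
        manifest = json.load(tf.extractfile("payload/manifest.json"))
        for entry in manifest["files"]:
            if entry["path"] == "manifest.json":
                continue
            data = tf.extractfile(f"payload/{entry['path']}").read()
            if hashlib.sha256(data).hexdigest() != entry["sha256"]:
                return False
    return True
```

One outer hash plus per-file manifest hashes is what lets an external reviewer verify the full bundle from a single published SHA256.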
+---
+
+## Methodology hardening in v3.2 (the changes that close real audit gaps)
+
+### HMAC key separation for REQ_SEEN (no mixed keys)
+
+REQ_SEEN uses HMAC commitments only if `IF_REQ_SEEN_HMAC_KEY` is configured. It never reuses the signing secret used for the signed summary.
+
+If `IF_REQ_SEEN_HMAC_KEY` is missing, REQ_SEEN downgrades to SHA256 commitments and the system must not claim “privacy-preserving HMAC commitments”.
+
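The downgrade rule can be sketched as follows. `IF_REQ_SEEN_HMAC_KEY` is the real variable name; the helper function itself is hypothetical:

```python
import hashlib
import hmac
import os

def req_seen_commitment(request_body: bytes) -> tuple[str, str]:
    """Return (mode, hex_commitment) for a REQ_SEEN leaf."""
    key = os.environ.get("IF_REQ_SEEN_HMAC_KEY", "")
    if key:
        # Keyed commitment: unlinkable without the key, so the ledger
        # can be published without leaking request contents.
        digest = hmac.new(key.encode(), request_body, hashlib.sha256).hexdigest()
        return "hmac-sha256", digest
    # Downgrade: plain SHA256. Still tamper-evident, but anyone holding a
    # candidate request body can confirm its presence in the ledger, so
    # "privacy-preserving" must not be claimed in this mode.
    return "sha256", hashlib.sha256(request_body).hexdigest()
```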
+### Correlation-only client trace IDs (collision discipline)
+
+If a client provides `X-IF-Client-Trace`, it is treated as a correlation-only identifier.
+
+The canonical trace ID is always server-generated and returned in `X-IF-Emotion-Trace`.
+
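The ID discipline can be sketched as below. The header names are the real ones; the handler shape is illustrative:

```python
import uuid

def assign_trace_ids(headers: dict) -> dict:
    # The canonical ID is always minted server-side; a client-supplied
    # X-IF-Client-Trace is recorded for correlation but never trusted as
    # the trace key (prevents collision or forgery of trace identities).
    trace_id = str(uuid.uuid4())
    client = headers.get("X-IF-Client-Trace")
    return {
        "trace_id": trace_id,               # returned in X-IF-Emotion-Trace
        "client_trace_id": client or None,  # correlation only
    }
```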
+### Monotonic timing fields (clock realism)
+
+Each trace event includes:
+
+- `ts_utc`: wall-clock timestamp (not trusted for crypto time)
+- `mono_ns` / `mono_ms`: monotonic timing since trace start (stable ordering and performance attribution)
+
+This does not solve time attestation, but it removes “clock drift” as an excuse for missing latency evidence.
+
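How the two clocks combine can be sketched as follows. The field names `ts_utc`, `mono_ns`, and `mono_ms` are the real ones; the recorder class is illustrative:

```python
import time
from datetime import datetime, timezone

class TraceTimer:
    def __init__(self) -> None:
        # Monotonic base: immune to NTP steps and wall-clock edits.
        self.t0 = time.monotonic_ns()

    def stamp(self) -> dict:
        mono_ns = time.monotonic_ns() - self.t0
        return {
            "ts_utc": datetime.now(timezone.utc).isoformat(),  # human context only
            "mono_ns": mono_ns,
            "mono_ms": mono_ns // 1_000_000,  # stable ordering + latency attribution
        }
```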
+### Inclusion proof is a first-class prior
+
+The inclusion proof file is registered as a child artifact in IF.TTT. It is not optional.
+
+---
+
+## Reference proof run (v3.2)
+
+Trace ID:
+
+- `96700e8e-6a83-445e-86f7-06905c500146`
+
+Evidence bundle:
+
+- Static mirror (preferred): `https://infrafabric.io/static/hosted/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+- Forgejo raw (alternate): `https://git.infrafabric.io/danny/hosted/raw/branch/main/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+
+Tarball SHA256:
+
+- `85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87`
+
+IF.TTT tarball handle:
+
+- `if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1`
+
+---
+
+## Verification (external reviewer path)
+
+Download + hash:
+
+```bash
+curl -fsSL -o emo.tar.gz 'https://infrafabric.io/static/hosted/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz'
+sha256sum emo.tar.gz
+# expected: 85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87
+```
+
+Run verifier:
+
+```bash
+python3 -m venv venv
+./venv/bin/pip install canonicaljson pynacl
+curl -fsSL -o iftrace.py 'https://infrafabric.io/static/hosted/iftrace.py'
+./venv/bin/python iftrace.py verify emo.tar.gz --expected-sha256 85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87
+```
+
+REQ_SEEN inclusion proof check:
+
+```bash
+tar -xzf emo.tar.gz
+./venv/bin/python iftrace.py verify-inclusion payload/req_seen_inclusion_proof.json
+# expected: OK
+```
+
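What `verify-inclusion` checks can be sketched directly against the proof-file format produced by `emo_trace_pack.py` (`leaf_hash`, `path` with `sibling`/`side`, `root`); `iftrace.py` remains the authoritative implementation:

```python
import hashlib

def verify_inclusion(proof: dict) -> bool:
    # Fold the leaf up the Merkle path; "side" names where the sibling sits.
    node = bytes.fromhex(proof["leaf_hash"])
    for step in proof["path"]:
        sib = bytes.fromhex(step["sibling"])
        pair = sib + node if step["side"] == "left" else node + sib
        node = hashlib.sha256(pair).digest()
    return node.hex() == proof["root"]
```

If the fold reproduces the signed hour head's `merkle_root`, the trace was witnessed in that hour's ledger.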
+---
+
+## Narrative annex
+
+IF.story is not evidence; it is a deterministic projection keyed by `event_hash`.
+
+- Static mirror: `https://infrafabric.io/static/hosted/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md`
+- Forgejo raw: `https://git.infrafabric.io/danny/hosted/raw/branch/main/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md`
+
+---
+
+## Limitations (still true)
+
+- This proves what the system did and what bytes were served. It does not prove factual truth in the world.
+- Completeness is bounded by the witness boundary; requests dropped before the backend boundary are out of scope until edge witnessing is cryptographically bound.
+- Key management and time attestation remain the practical certification blockers (HSM/TPM, rotation ceremony, external timestamping).
+
+---
+
+## What to do next (tomorrow’s work, not wishful thinking)
+
+- Move REQ_SEEN witnessing to the true front door (edge) and sign the head there.
+- Publish a deploy attestation record (code hash + image digest) into IF.TTT for every release.
+- Add a clear anchoring SLO (maximum time from trace finalization → registry anchor) and enforce it.
diff --git a/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md.sha256 b/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md.sha256
new file mode 100644
index 0000000..963d34d
--- /dev/null
+++ b/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md.sha256
@@ -0,0 +1 @@
+8e61cfd0353da980439d9e18aeb6d572d71eb58960ccf26dfdf279c453095835 /root/tmp/hosted_repo_update/IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md
diff --git a/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md b/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md
new file mode 100644
index 0000000..db8651b
--- /dev/null
+++ b/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md
@@ -0,0 +1,23 @@
+# IF.story — contextual narrative log (reference)
+
+Trace: `96700e8e-6a83-445e-86f7-06905c500146`
+
+Evidence bundle tarball:
+- `https://infrafabric.io/static/hosted/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+- `https://git.infrafabric.io/danny/hosted/raw/branch/main/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+
+---
+
+Deterministic narrative projection of `trace_events.jsonl`. Each line includes the `event_hash` anchor.
+
+- 2025-12-21T10:20:04Z (+0ms) | `request_commit` | Request body commitment; commit_ok=True client_trace_id=22222222-2222-4222-8222-222222222222 | event_hash=f924cb8cba0a6db4580009da023bd4eaeb376daaffa119619799f26f584358aa
+- 2025-12-21T10:20:04Z (+1ms) | `req_seen` | REQ_SEEN witnessed; hour=20251221T10 count=4 merkle_root=fc96fce3d19583cbb4e11e4e0c4e717c4ce7d426697a5633286a9a446a146455 | event_hash=8f0c3568e59243519994ff76dad25def95e1014180fb8c5db7b3f86efb92f9f9
+- 2025-12-21T10:20:04Z (+2ms) | `request_received` | Auth+quota succeeded; provider=codex model=gpt-5.2 stream=False user_len=47 auth_ms=3 | event_hash=f50db27625228b5293e1a2c14018bfd95377a12d211233f46fc4c85739f8f27d
+- 2025-12-21T10:20:04Z (+3ms) | `guard_short_circuit` | IF.GUARD short-circuit; reasons=['self_harm_signal'] | event_hash=2c9eb30ff9fb12e19faecc9cd403c86d033bb76d7923d534ef08c37eb1bc217f
+- 2025-12-21T10:20:04Z (+3ms) | `trace_finalizing` | Trace finalizing; ok=True provider=guard | event_hash=0022e0ce2050bc544bc38ff518aa465f505aad4c231bba4d7aabff19fcf459d9
+
+Notes:
+- Ground truth remains `trace_events.jsonl` + `ttt_signed_record.json`.
+- REQ_SEEN ledger+head are included; public key is `trace_ed25519.pub`.
diff --git a/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md.sha256 b/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md.sha256
new file mode 100644
index 0000000..b537f13
--- /dev/null
+++ b/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md.sha256
@@ -0,0 +1 @@
+4f86ad4c1ebf415b6ed1ee700748584af1380ec0d04f8e0350e1fe51f458720e /root/tmp/hosted_repo_update/IF_EMOTION_TRACE_REFERENCE_96700e8e-6a83-445e-86f7-06905c500146_IF_STORY.md
diff --git a/README.md b/README.md
index 01d25e7..746e8b0 100644
--- a/README.md
+++ b/README.md
@@ -29,6 +29,21 @@ Static hosted artifacts used in InfraFabric reviews.
- IF.TTT citation (PQ hybrid signed): `if://citation/c24fe95e-226c-4efc-ba22-5ddcc37ff7d2/v1`
- Notes: includes `payload/trace_ed25519.pub` + `payload/req_seen_inclusion_proof.json` + nested priors (`payload/ttt_children*.json`).
+## emo-social trace payload (v3.2, methodology hardening demo)
+
+- File: `emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+- SHA256: `85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87`
+- IF.TTT citation (PQ hybrid signed): `if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1`
+- Notes: includes monotonic timing fields (`mono_ns`/`mono_ms`), correlation-only `X-IF-Client-Trace`, and registers `payload/req_seen_inclusion_proof.json` as a first-class prior.
+
+## IF.emotion trace whitepaper (styled v3.2)
+
+- File: `IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v3.2_STYLED.md`
+
+## Trace bundler (operator tool)
+
+- File: `emo_trace_pack.py` (builds an evidence bundle tarball from a trace id by pulling artifacts from `pct 220` + `pct 240`)
+
## IF.emotion trace whitepaper (styled v2.1)
- File: `IF_EMOTION_DEBUGGING_TRACE_WHITEPAPER_v2.1_STYLED.md`
diff --git a/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md b/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md
new file mode 100644
index 0000000..072b89e
--- /dev/null
+++ b/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md
@@ -0,0 +1,35 @@
+# Verify emo-social trace bundle (external)
+
+Artifacts:
+- Tarball: `emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz`
+- SHA256: `85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87`
+- IF.TTT handle (PQ hybrid signed in registry): `if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1`
+
+Download:
+```bash
+curl -fsSL -o emo.tar.gz 'https://infrafabric.io/static/hosted/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz'
+# Alternate (Forgejo raw):
+curl -fsSL -o emo.tar.gz 'https://git.infrafabric.io/danny/hosted/raw/branch/main/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz'
+
+sha256sum emo.tar.gz
+```
+
+Run verifier:
+```bash
+python3 -m venv venv
+./venv/bin/pip install canonicaljson pynacl
+curl -fsSL -o iftrace.py 'https://infrafabric.io/static/hosted/iftrace.py'
+./venv/bin/python iftrace.py verify emo.tar.gz --expected-sha256 85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87
+```
+
+Merkle inclusion proof demo (REQ_SEEN completeness):
+```bash
+mkdir -p payload && tar -xzf emo.tar.gz -C .
+./venv/bin/python iftrace.py verify-inclusion payload/req_seen_inclusion_proof.json
+```
+
+IF.TTT corroboration note:
+- The `if://citation/...` handle is an internal registry identifier.
+- For external review without registry access, use the published chain record:
+ - `emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json`
+ - `emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json`
diff --git a/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md.sha256 b/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md.sha256
new file mode 100644
index 0000000..20dec8e
--- /dev/null
+++ b/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md.sha256
@@ -0,0 +1 @@
+48a58a3729409dcc66da57cbd428b611ca50c29f03dc3b8b411a96acefdba76d /root/tmp/hosted_repo_update/VERIFY_EMO_TRACE_96700e8e-6a83-445e-86f7-06905c500146.md
diff --git a/emo_trace_pack.py b/emo_trace_pack.py
new file mode 100644
index 0000000..8bb15f8
--- /dev/null
+++ b/emo_trace_pack.py
@@ -0,0 +1,547 @@
+#!/usr/bin/env python3
+"""
+Build an IF.emotion "evidence bundle" tarball for a trace ID.
+
+This is an operator tool that runs on the Proxmox host and pulls artifacts from:
+ - pct 220 (emo-social / if.emotion backend)
+ - pct 240 (IF.TTT registry)
+
+Outputs:
+ /root/tmp/emo-trace-package-<trace_id>/
+ payload/...
+ emo_trace_payload_<trace_id>.tar.gz
+ payload_tar_sha256.txt
+ ttt_tarball_audit_entry.json
+ ttt_tarball_chain_record.json
+ ttt_tarball_chain_ref.json
+"""
+
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+import os
+import subprocess
+import tarfile
+import textwrap
+import uuid
+from datetime import datetime, timezone
+from pathlib import Path
+from typing import Any
+
+
+def utc_now_iso() -> str:
+ return datetime.now(timezone.utc).isoformat()
+
+
+def sha256_bytes(data: bytes) -> str:
+ return hashlib.sha256(data or b"").hexdigest()
+
+
+def sha256_file(path: Path) -> str:
+ h = hashlib.sha256()
+ with path.open("rb") as f:
+ for chunk in iter(lambda: f.read(1024 * 1024), b""):
+ h.update(chunk)
+ return h.hexdigest()
+
+
+def canonical_json_bytes(obj: Any) -> bytes:
+ return json.dumps(obj, ensure_ascii=False, sort_keys=True, separators=(",", ":")).encode("utf-8")
+
+
+def merkle_root_hex(leaves_hex: list[str]) -> str:
+ if not leaves_hex:
+ return sha256_bytes(b"")
+ level: list[bytes] = [bytes.fromhex(h) for h in leaves_hex if isinstance(h, str) and len(h) == 64]
+ if not level:
+ return sha256_bytes(b"")
+ while len(level) > 1:
+ if len(level) % 2 == 1:
+ level.append(level[-1])
+ nxt: list[bytes] = []
+ for i in range(0, len(level), 2):
+ nxt.append(hashlib.sha256(level[i] + level[i + 1]).digest())
+ level = nxt
+ return level[0].hex()
+
+
+def merkle_inclusion_proof(leaves_hex: list[str], index: int) -> dict:
+ if index < 0 or index >= len(leaves_hex):
+ raise ValueError("index out of range")
+ level: list[bytes] = [bytes.fromhex(h) for h in leaves_hex]
+ proof: list[dict] = []
+ idx = index
+ while len(level) > 1:
+ if len(level) % 2 == 1:
+ level.append(level[-1])
+ sibling_idx = idx ^ 1
+ sibling = level[sibling_idx]
+ side = "left" if sibling_idx < idx else "right"
+ proof.append({"sibling": sibling.hex(), "side": side})
+ nxt: list[bytes] = []
+ for i in range(0, len(level), 2):
+ nxt.append(hashlib.sha256(level[i] + level[i + 1]).digest())
+ level = nxt
+ idx //= 2
+ root = level[0].hex()
+ return {"index": index, "root": root, "path": proof}
+
+
+def run(cmd: list[str], *, stdin: bytes | None = None, timeout_s: int = 120) -> bytes:
+ p = subprocess.run(
+ cmd,
+ input=stdin,
+ stdout=subprocess.PIPE,
+ stderr=subprocess.PIPE,
+ timeout=timeout_s,
+ check=False,
+ )
+ if p.returncode != 0:
+ raise RuntimeError(f"cmd failed ({p.returncode}): {' '.join(cmd)}\n{p.stderr.decode('utf-8',errors='ignore')}")
+ return p.stdout
+
+
+def pct_exec(pct: int, bash_cmd: str, *, stdin: bytes | None = None, timeout_s: int = 120) -> bytes:
+ return run(["pct", "exec", str(pct), "--", "bash", "-lc", bash_cmd], stdin=stdin, timeout_s=timeout_s)
+
+
+def pct_cat(pct: int, path: str, *, timeout_s: int = 120) -> bytes:
+ return pct_exec(pct, f"cat {shlex_quote(path)}", timeout_s=timeout_s)
+
+
+def shlex_quote(s: str) -> str:
+ # Minimal, safe shell quoting for paths.
+ return "'" + (s or "").replace("'", "'\"'\"'") + "'"
+
+
+def write_json(path: Path, obj: Any) -> None:
+ path.write_text(json.dumps(obj, ensure_ascii=False, sort_keys=True, indent=2) + "\n", encoding="utf-8")
+
+
+def write_text(path: Path, text: str) -> None:
+ path.write_text(text, encoding="utf-8")
+
+
+def fetch_api_json(*, trace_id: str, endpoint: str, email: str) -> Any:
+ raw = pct_exec(
+ 220,
+ f"curl -fsSL -H {shlex_quote('X-Auth-Request-Email: ' + email)} http://127.0.0.1:5000{endpoint}",
+ timeout_s=60,
+ )
+ return json.loads(raw.decode("utf-8", errors="ignore") or "{}")
+
+
+def extract_ttt_signed_record(*, trace_id: str) -> dict:
+ script = textwrap.dedent(
+ f"""
+ python3 - <<'PY'
+ import json
+ tid = {trace_id!r}
+ path = "/opt/if-emotion/data/ttt_signed_log.jsonl"
+ out = None
+ try:
+ with open(path, "r", encoding="utf-8", errors="ignore") as f:
+ for line in f:
+ line = line.strip()
+ if not line:
+ continue
+ try:
+ rec = json.loads(line)
+ except Exception:
+ continue
+ ev = rec.get("event") or {{}}
+ if isinstance(ev, dict) and str(ev.get("trace_id") or "").strip() == tid:
+ out = rec
+ except Exception:
+ out = None
+ print(json.dumps(out or {{}}, ensure_ascii=False, sort_keys=True))
+ PY
+ """
+ ).strip()
+ raw = pct_exec(220, script, timeout_s=60)
+ return json.loads(raw.decode("utf-8", errors="ignore") or "{}")
+
+
+def resolve_ttt_records_by_id(record_ids: list[str]) -> list[dict]:
+ payload = json.dumps({"ids": record_ids}, ensure_ascii=False).encode("utf-8")
+ py = """
+import json
+import sys
+import importlib.util
+import contextlib
+
+req = json.loads(sys.stdin.read() or "{}")
+ids = req.get("ids") or []
+
+spec = importlib.util.spec_from_file_location("ttt_registry_mod", "/opt/ttt-registry/ttt_registry.py")
+mod = importlib.util.module_from_spec(spec)
+
+with contextlib.redirect_stdout(sys.stderr):
+ spec.loader.exec_module(mod) # type: ignore
+ reg = mod.TTTRegistry()
+
+out = []
+for rid in ids:
+ rid = str(rid or "").strip()
+ if not rid:
+ continue
+ h = reg.redis.get(f"ttt:index:id:{rid}")
+ if not h:
+ continue
+ try:
+ out.append(reg.get(h))
+ except Exception:
+ continue
+
+print(json.dumps(out, ensure_ascii=False, sort_keys=True))
+""".strip()
+ raw = pct_exec(
+ 240,
+ f"OQS_INSTALL_PATH=/opt/ttt-registry/_oqs /opt/ttt-registry/venv/bin/python -c {shlex_quote(py)}",
+ stdin=payload,
+ timeout_s=180,
+ )
+ try:
+ data = json.loads(raw.decode("utf-8", errors="ignore") or "[]")
+ except Exception:
+ return []
+ return data if isinstance(data, list) else [data]
+
+
+def write_audit_entries(entries: list[dict]) -> None:
+ payload = json.dumps({"entries": entries}, ensure_ascii=False).encode("utf-8")
+ py = """
+import json
+import sys
+import re
+import uuid
+import redis
+from pathlib import Path
+
+req = json.loads(sys.stdin.read() or "{}")
+entries = req.get("entries") or []
+
+cfg = Path("/etc/redis/ttt.conf").read_text(encoding="utf-8", errors="ignore")
+m = re.search(r"^requirepass\\s+(\\S+)", cfg, flags=re.M)
+password = m.group(1) if m else None
+
+r = redis.Redis(host="localhost", port=6380, password=password, decode_responses=True)
+written = 0
+for e in entries:
+ cid = str(e.get("citation_id") or "").strip()
+ parts = cid.split("/")
+ uid = (parts[3] if len(parts) > 3 else "").strip()
+ try:
+ uuid.UUID(uid)
+ except Exception:
+ continue
+ r.set(f"audit:entry:{uid}", json.dumps(e, ensure_ascii=False, sort_keys=True))
+ written += 1
+
+print(json.dumps({"ok": True, "written": written}, ensure_ascii=False))
+""".strip()
+ _ = pct_exec(240, f"/opt/ttt-registry/venv/bin/python -c {shlex_quote(py)}", stdin=payload, timeout_s=60)
+
+
+def ttt_import_audit() -> dict:
+ raw = pct_exec(
+ 240,
+ "OQS_INSTALL_PATH=/opt/ttt-registry/_oqs /opt/ttt-registry/venv/bin/python /opt/ttt-registry/ttt_registry.py import-audit",
+ timeout_s=300,
+ )
+ txt = raw.decode("utf-8", errors="ignore") or ""
+ # The registry prints capability banners before JSON; best-effort parse the last JSON object.
+ for chunk in reversed([c.strip() for c in txt.splitlines() if c.strip()]):
+ if chunk.startswith("{") and chunk.endswith("}"):
+ try:
+ return json.loads(chunk)
+ except Exception:
+ break
+ return {"raw": txt.strip()}
+
+
+def build_story(trace_id: str, events: list[dict]) -> str:
+ lines = [
+ "# IF.story — contextual narrative log",
+ "",
+ f"Trace: `{trace_id}`",
+ "",
+ "Deterministic narrative projection of `trace_events.jsonl`. Each line includes the `event_hash` anchor.",
+ "",
+ ]
+ for ev in sorted(events, key=lambda e: int(e.get("idx") or 0)):
+ ts = str(ev.get("ts_utc") or "")
+ et = str(ev.get("type") or "")
+ mono_ms = int(ev.get("mono_ms") or 0)
+ data = ev.get("data") if isinstance(ev.get("data"), dict) else {}
+ h = str(ev.get("event_hash") or "")
+ summary = ""
+ if et == "request_commit":
+ summary = f"Request body commitment; commit_ok={bool(data.get('commit_ok'))} client_trace_id={data.get('client_trace_id') or ''}".strip()
+ elif et == "req_seen":
+ summary = f"REQ_SEEN witnessed; hour={data.get('hour_utc')} count={data.get('count')} merkle_root={data.get('merkle_root')}"
+ elif et == "request_received":
+ summary = f"Auth+quota succeeded; provider={data.get('provider')} model={data.get('requested_model')} stream={data.get('stream')} user_len={data.get('user_len')} auth_ms={data.get('auth_ms')}"
+ elif et == "guard_short_circuit":
+ summary = f"IF.GUARD short-circuit; reasons={data.get('reasons')}"
+ elif et == "trace_finalizing":
+ summary = f"Trace finalizing; ok={data.get('ok')} provider={data.get('provider')}"
+ else:
+ # generic
+ keys = list(data.keys())[:6] if isinstance(data, dict) else []
+ summary = f"Event data keys={keys}"
+ lines.append(f"- {ts} (+{mono_ms}ms) | `{et}` | {summary} | event_hash={h}")
+ lines += [
+ "",
+ "Notes:",
+ "- Ground truth remains `trace_events.jsonl` + `ttt_signed_record.json`.",
+ "- REQ_SEEN ledger+head are included; public key is `trace_ed25519.pub`.",
+ "",
+ ]
+ return "\n".join(lines)
+
+
+def build_manifest(payload_dir: Path) -> tuple[dict, dict[str, str]]:
+ sha_map: dict[str, str] = {}
+ files = []
+ for p in sorted(payload_dir.iterdir(), key=lambda x: x.name):
+ if not p.is_file():
+ continue
+ data = p.read_bytes()
+ sha = sha256_bytes(data)
+ sha_map[p.name] = sha
+ files.append({"path": p.name, "bytes": len(data), "sha256": sha})
+ manifest = {"files": files}
+ return manifest, sha_map
+
+
+def write_sha256s(payload_dir: Path, sha_map: dict[str, str]) -> None:
+ lines = []
+ for name in sorted(sha_map.keys()):
+ lines.append(f"{sha_map[name]} {name}")
+ (payload_dir / "sha256s.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")
+
+
+def tar_payload(workdir: Path, trace_id: str) -> Path:
+ tar_path = workdir / f"emo_trace_payload_{trace_id}.tar.gz"
+ with tarfile.open(tar_path, "w:gz") as tf:
+ tf.add(workdir / "payload", arcname="payload")
+ return tar_path
+
+
+def main() -> int:
+ ap = argparse.ArgumentParser()
+ ap.add_argument("trace_id", help="Trace ID to package")
+ ap.add_argument("--email", default="ds@infrafabric.io", help="Trusted email for owner-gated endpoints")
+ ap.add_argument("--headers", default="", help="Path to captured HTTP response headers (optional)")
+ ap.add_argument("--response", default="", help="Path to captured HTTP response body (optional)")
+ ap.add_argument("--api-payload", default="", help="Path to captured request JSON (optional)")
+ ap.add_argument("--out-dir", default="", help="Output directory (default: /root/tmp/emo-trace-package-<trace_id>)")
+ args = ap.parse_args()
+
+ trace_id = str(args.trace_id).strip()
+ if not trace_id:
+ raise SystemExit("trace_id required")
+
+ out_dir = Path(args.out_dir or f"/root/tmp/emo-trace-package-{trace_id}").resolve()
+ payload_dir = out_dir / "payload"
+ payload_dir.mkdir(parents=True, exist_ok=True)
+
+ # Captured request/response artifacts (optional).
+ if args.headers:
+ write_text(payload_dir / "headers.txt", Path(args.headers).read_text(encoding="utf-8", errors="ignore"))
+ if args.response:
+ # Ensure JSON is stable.
+ raw = Path(args.response).read_text(encoding="utf-8", errors="ignore")
+ try:
+ obj = json.loads(raw)
+ write_json(payload_dir / "response.json", obj)
+ except Exception:
+ write_text(payload_dir / "response.json", raw)
+ if args.api_payload:
+ raw = Path(args.api_payload).read_text(encoding="utf-8", errors="ignore")
+ try:
+ obj = json.loads(raw)
+ write_json(payload_dir / "api_payload.json", obj)
+ except Exception:
+ write_text(payload_dir / "api_payload.json", raw)
+
+ # API snapshots (owner-gated).
+ api_trace = fetch_api_json(trace_id=trace_id, endpoint=f"/api/trace/{trace_id}", email=args.email)
+ write_json(payload_dir / "api_trace.json", api_trace)
+ api_events = fetch_api_json(trace_id=trace_id, endpoint=f"/api/trace/events/{trace_id}?limit=10000", email=args.email)
+ write_json(payload_dir / "api_events.json", api_events)
+
+ # Signed record from append-only log (ground truth).
+ ttt_rec = extract_ttt_signed_record(trace_id=trace_id)
+ if not ttt_rec:
+ raise SystemExit("ttt_signed_record not found for trace_id")
+ write_json(payload_dir / "ttt_signed_record.json", ttt_rec)
+
+ # Raw trace payload (ground truth).
+ payload_path = f"/opt/if-emotion/data/trace_payloads/{trace_id}.json"
+ trace_payload_raw = pct_exec(220, f"cat {shlex_quote(payload_path)}", timeout_s=60)
+ payload_dir.joinpath("trace_payload.json").write_bytes(trace_payload_raw)
+
+ # Trace events (canonical JSONL form).
+ events = api_events.get("events") if isinstance(api_events, dict) else None
+ if not isinstance(events, list):
+ raise SystemExit("api_events missing events[]")
+ trace_events_lines = []
+ for ev in sorted((e for e in events if isinstance(e, dict)), key=lambda e: int(e.get("idx") or 0)):
+ trace_events_lines.append(json.dumps({"event": ev}, ensure_ascii=False, sort_keys=True))
+ write_text(payload_dir / "trace_events.jsonl", "\n".join(trace_events_lines) + "\n")
+
+ # Story projection.
+ write_text(payload_dir / "if_story.md", build_story(trace_id, [e for e in events if isinstance(e, dict)]))
+
+ # Trace public key (Ed25519).
+ pub = pct_exec(220, "cat /opt/if-emotion/data/trace_ed25519.pub", timeout_s=30)
+ payload_dir.joinpath("trace_ed25519.pub").write_bytes(pub)
+
+ # REQ_SEEN hour ledger+head, derived from the trace events.
+ req_seen_ev = next((e for e in events if isinstance(e, dict) and e.get("type") == "req_seen"), None)
+ if not isinstance(req_seen_ev, dict):
+ raise SystemExit("trace has no req_seen event; cannot build completeness proof")
+ hour = str((req_seen_ev.get("data") or {}).get("hour_utc") or "").strip()
+ if not hour:
+ raise SystemExit("req_seen event missing hour_utc")
+ ledger_path = f"/opt/if-emotion/data/req_seen/{hour}.jsonl"
+ head_path = f"/opt/if-emotion/data/req_seen/heads/{hour}.json"
+ ledger_bytes = pct_exec(220, f"cat {shlex_quote(ledger_path)}", timeout_s=30)
+ head_bytes = pct_exec(220, f"cat {shlex_quote(head_path)}", timeout_s=30)
+ payload_dir.joinpath(f"req_seen_{hour}.jsonl").write_bytes(ledger_bytes)
+ payload_dir.joinpath(f"req_seen_head_{hour}.json").write_bytes(head_bytes)
+
+ # Inclusion proof for this trace_id in the hour ledger.
+ leaves: list[str] = []
+ idx_for_trace: int | None = None
+ leaf_for_trace: str = ""
+ for raw_line in ledger_bytes.splitlines():
+ if not raw_line.strip():
+ continue
+ try:
+ entry = json.loads(raw_line.decode("utf-8", errors="ignore"))
+ except Exception:
+ continue
+ lh = str(entry.get("leaf_hash") or "").strip()
+ if len(lh) != 64:
+ continue
+ leaves.append(lh)
+ if idx_for_trace is None and str(entry.get("trace_id") or "").strip() == trace_id:
+ idx_for_trace = len(leaves) - 1
+ leaf_for_trace = lh
+ if idx_for_trace is None:
+ raise SystemExit("trace_id not found in REQ_SEEN hour ledger")
+ proof = merkle_inclusion_proof(leaves, idx_for_trace)
+ proof["leaf_hash"] = leaf_for_trace
+ proof["hour_utc"] = hour
+ # Sanity: root must match head's merkle_root.
+ head_obj = json.loads(head_bytes.decode("utf-8", errors="ignore") or "{}")
+    if str(head_obj.get("merkle_root") or "") and proof["root"] != str(head_obj.get("merkle_root") or ""):
+        raise SystemExit("Merkle root mismatch (ledger != head)")
+    write_json(payload_dir / "req_seen_inclusion_proof.json", proof)
+
+    # Manifest + sha list.
+    manifest, sha_map = build_manifest(payload_dir)
+    write_json(payload_dir / "manifest.json", manifest)
+    write_sha256s(payload_dir, sha_map)
+
+    # Register child artifacts in IF.TTT (audit:entry -> import-audit -> signed records).
+    child_paths = [
+        "headers.txt",
+        "response.json",
+        "trace_payload.json",
+        "trace_events.jsonl",
+        "ttt_signed_record.json",
+        "api_trace.json",
+        "api_events.json",
+        "api_payload.json",
+        "if_story.md",
+        "trace_ed25519.pub",
+        f"req_seen_{hour}.jsonl",
+        f"req_seen_head_{hour}.json",
+        "req_seen_inclusion_proof.json",
+    ]
+    children_pre = []
+    audit_entries = []
+    created_utc = utc_now_iso()
+    for name in child_paths:
+        p = payload_dir / name
+        if not p.exists():
+            continue
+        cid_uuid = str(uuid.uuid4())
+        citation_id = f"if://citation/{cid_uuid}/v1"
+        sha = sha256_file(p)
+        rel_path = f"payload/{name}"
+        children_pre.append({"citation_id": citation_id, "rel_path": rel_path, "sha256": sha})
+        audit_entries.append(
+            {
+                "citation_id": citation_id,
+                "claim": f"emo-social trace artifact {name} for trace_id={trace_id}",
+                "source_filename": rel_path,
+                "source_sha256": sha,
+                "verification_status": "source-sha256",
+                "ingested_at": created_utc,
+            }
+        )
+    write_json(payload_dir / "ttt_children_pre.json", {"trace_id": trace_id, "created_utc": created_utc, "children": children_pre})
+    write_audit_entries(audit_entries)
+    _ = ttt_import_audit()
+
+    # Resolve signed IF.TTT records for the children.
+    child_ids = [c["citation_id"] for c in children_pre]
+    chain_records = resolve_ttt_records_by_id(child_ids)
+    write_json(payload_dir / "ttt_children_chain_records.json", chain_records)
+    # Minimal index for the bundle.
+    rec_by_id = {r.get("id"): r for r in chain_records if isinstance(r, dict) and r.get("id")}
+    children = []
+    for c in children_pre:
+        rid = c["citation_id"]
+        rec = rec_by_id.get(rid) or {}
+        children.append(
+            {
+                "citation_id": rid,
+                "rel_path": c["rel_path"],
+                "sha256": c["sha256"],
+                "content_hash": rec.get("content_hash"),
+                "pq_status": rec.get("pq_status"),
+            }
+        )
+    write_json(payload_dir / "ttt_children.json", {"trace_id": trace_id, "children": children})
+
+    # Build tarball and register it in IF.TTT.
+    tar_path = tar_payload(out_dir, trace_id)
+    tar_sha = sha256_file(tar_path)
+    write_text(out_dir / "payload_tar_sha256.txt", f"{tar_sha} {tar_path}\n")
+
+    tar_uuid = str(uuid.uuid4())
+    tar_citation_id = f"if://citation/{tar_uuid}/v1"
+    tar_audit_entry = {
+        "citation_id": tar_citation_id,
+        "claim": f"emo-social trace payload tarball (bundle) for trace_id={trace_id}",
+        "source_filename": tar_path.name,
+        "source_sha256": tar_sha,
+        "verification_status": "source-sha256",
+        "ingested_at": utc_now_iso(),
+        "source_path": str(tar_path),
+    }
+    write_json(out_dir / "ttt_tarball_audit_entry.json", tar_audit_entry)
+    write_audit_entries([tar_audit_entry])
+    _ = ttt_import_audit()
+
+    tar_chain = resolve_ttt_records_by_id([tar_citation_id])
+    if not tar_chain:
+        raise SystemExit("Failed to resolve tarball chain record from IF.TTT")
+    tar_rec = tar_chain[0]
+    write_json(out_dir / "ttt_tarball_chain_record.json", tar_rec)
+    write_json(out_dir / "ttt_tarball_chain_ref.json", {"citation_id": tar_citation_id, "content_hash": tar_rec.get("content_hash")})
+
+    print(str(out_dir))
+    return 0
+
+
+if __name__ == "__main__":
+    raise SystemExit(main())
diff --git a/emo_trace_pack.py.sha256 b/emo_trace_pack.py.sha256
new file mode 100644
index 0000000..e00fca4
--- /dev/null
+++ b/emo_trace_pack.py.sha256
@@ -0,0 +1 @@
+635671faa2b056253e8e26469d04d87f4f597c7bca0815eff038fb2b1986b548 /root/tmp/hosted_repo_update/emo_trace_pack.py
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz
new file mode 100644
index 0000000..ebcf27c
Binary files /dev/null and b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz differ
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz.sha256 b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz.sha256
new file mode 100644
index 0000000..ba2d310
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz.sha256
@@ -0,0 +1 @@
+85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87 /root/tmp/hosted_repo_update/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json
new file mode 100644
index 0000000..2dccb8f
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json
@@ -0,0 +1,9 @@
+{
+ "citation_id": "if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1",
+ "claim": "emo-social trace payload tarball (bundle) for trace_id=96700e8e-6a83-445e-86f7-06905c500146",
+ "ingested_at": "2025-12-21T10:39:38.222913+00:00",
+ "source_filename": "emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz",
+ "source_path": "/root/tmp/emo-trace-package-96700e8e-6a83-445e-86f7-06905c500146/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz",
+ "source_sha256": "85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87",
+ "verification_status": "source-sha256"
+}
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json.sha256 b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json.sha256
new file mode 100644
index 0000000..107227e
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json.sha256
@@ -0,0 +1 @@
+7174cfba22651a19ad948a0df06d7dc3dc802acd2a6f2fc8364dbf311f178332 /root/tmp/hosted_repo_update/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_audit_entry.json
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json
new file mode 100644
index 0000000..6053426
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json
@@ -0,0 +1,21 @@
+{
+ "claim": "emo-social trace payload tarball (bundle) for trace_id=96700e8e-6a83-445e-86f7-06905c500146",
+ "content_hash": "2a19ce17e4f936ed3ed5b140d77410bf99d41ba22489920e1e4cdec5b7b3a76a",
+ "evidence": {
+ "audit_key": "audit:entry:2ec551ec-0a08-487d-a41d-4d068aa8ee2f",
+ "ingested_at": "2025-12-21T10:39:38.222913+00:00",
+ "source_filename": "emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.tar.gz",
+ "source_sha256": "85eb323c8e5f11cf4dd18e612e8cde8dcdb355b3fbd6380bbc8d480a5bf97e87",
+ "verification_status": "source-sha256"
+ },
+ "id": "if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1",
+ "pq_algo": "ML-DSA-87",
+ "pq_status": "hybrid-fips204",
+ "prev_hash": "6f17d24ae67615ca089685b5b3292c0182d22cf319c77a6efe732acf144d2d34",
+ "session_id": null,
+ "signature": "9aae6314dad9b1ab31a5d5f3593859d6ca7d919a11baa7f59bdc9b80903e74d0da1399707589daf868d5c2541464df1f392d71a3f1fc4f9f43398963768eeb08",
+ "signature_ed25519": "9aae6314dad9b1ab31a5d5f3593859d6ca7d919a11baa7f59bdc9b80903e74d0da1399707589daf868d5c2541464df1f392d71a3f1fc4f9f43398963768eeb08",
+ "signature_pq": "eacc72c68d07b8a723516ac243e2f46ea17965a1f6e09601eed1e278f2ee849c22e34c8d5e1674d3d5fbb512b0a8e66c46da4dc871be72577afa8fa75bb496c46f472ed41981744466e4da93fa05caa4fe28ae2fdf94eecec0e86510bc86728f6c426dcf8402dcccc453a81c65f00cb1ceea4c1d73fd4680508554e8b11858de621fb873f391196788895abce9575fcc5171c46d72efc12c51f24980745b8590e3c2b60eafb03ff93db2f313295e3b3badaf658169c1fc3d474cd974b0086011710e7c2bb35dec1ea0c07af16eef9e781186505aafd17a0d87b62f21e6c2c5afcb6e7e20b723c529e4d41ddf7f84db5583f73d2d2b8b2260d379d9c535b278812d5ba2f2d2d03ea32d7333226ec9b4c46f73a63d6553a2a927d44e75fc74dfa70284f5e489129b80f5fa946b07738f1b0b836cbfa371124aa2b7dde6db9a4504bd9fb8018f2140e00151ece1d262182b1128773e6e7ee95e57fe037cdce7ecbc51b1d2af3fa77e9262d4dcd307e653310a8163a15ff6e1de0e16274bf36e81b39a522404b3770efb9b962fea71cfa182daee1f0d68c8725d3755c05b759c30737113b79ba0c9a3271ea904222e31b93f4d0957a16dcffcd293816c7473923b159432a9f602297e2759e0d87e7ba41294625ba8e3cacb6071945cbbfede5d2e7fff583d0ae22266b20aeb1c960ad2ace554365df4a33d264b8f702085c80a474f45fe96b4ffe09e62f3c49a79ba51d7265a3544022f75344da42b93ac916b165c38a57865a6310573517f267a35ffe778d1f3ccbc1cefd711516c98c236f5045485c54982f353b8228c572892801b09989e5cc0d6ef6120a818b70243edbf3913518d204117709cac3da3513ef030ba11609ff33732911623a6ae0e38952955c51c3fafc976d70fbf7a993eb6e347a4a51b88edc90c6e0e54a4a78543738a68ff49c21067540ffb23bda13ed567a52bd7e364aa7b37ce520a432f085d53e9f41fce3c0605869b1de293ed0e3c3fb2521f9aa4a6f2f9292e9f21ac552038adf62deb31659aad0b849f151a5e70426c90d430068d9e5c72e04a5b93b203e31d77eacb8d2f2b70ea6e984a9f41f7c2f67ccf86df51d0ba89ae813ce84d6cfe6a6513c611bbb50e006690573a67f5b6e5f36142997413c5d929f8dbcde524649369587e9ca43756b8c8b855eac1d7602e091f4dacfa7cba4dc781c14c4ab7042721ce6e4a86e2b0485ce1ff31a785f118e16be6d99dbca36de888abe372c30e602a0fad2447c9a54b6acf322983a9ff090a02dae514ec03b50538eeefe904a64e0902ed0426b6dfdecb5893864a8035296bfe5bc8beae9c2278c34de8f7cdd9659fb0e677253a69e3ba023965c8e38a8d71be2d230af6f0acfae629bab3a73380a
bc062a25f896298b80014bdcd2ec62d65be13ff709ab154d8fb652d553703a6808f2d0023de08b29f6cf1bdb900862bb4e5c2d131cae3d9c6feb2f559fbe98892688597136d2d7a986bd14ee1a3ac2d782fe3ac677437595ce0c21e7a364be30ed5b102961cd9f6a5164cf6d9a2f81a39e33eff42ad8da442099d77fd542ac0849f1699759782dadbb9c4d5e8e4a2073414d40fcf51c12352e57d428ce0f0a32e6ee142f6f2851e057f018de117a50c5dc2a6d33e448da44f5ec663fa8cb553773bbd4dcd1ee74e285f9bfd475404613c51905fcc0f21f60c672c00d83b9dd6ec4beaa4ff5513e0411cfb804ad5ede564f0364ac79ace09a157b18e2ea1733071f3a9ac081d352697dd6fefe5e05a9cfbb4fbf7281393c41dcd714043396fc4377df30484dd1b796ae317eab646c98795ee82f2c4196f307479015b1317fb632198ebe39370eda8bfc92ca84c6f4a7a8b6af1bcaf6bf5ba4d7a9279047d43c9f1d82ac1e882e2297f8aa874f01a50a93313c01f9d73a2602c70263c952d6bd35ed1bef340adb36699d8eff1e1ad347956c0b1434fe944bc1bb904288fd1773ce05e060dc6ccb109d983f8258e19f45b56841ea1321a2640d3e051d78d8c7eb9566aca8409822b7e561d80cf0c511553c7a7e4102bf098dd1c649d9a5edb1fbeb7409a498c1bda0c153e2cca0ce7bd942b7cb71e6af28b5823defe9e83a0fbe168b9f365219d34bc652007943ed508314984062715ab8c2661a0d703699406d4fa6a91b6a7d3b2b38daab18e6127dfc371ade341b60f4a4544e10b051373f565779814a5d27c7ac86b47ecca7e5462a01192bcd846f9dab83b91fdc116f8a6289d7dfe3a2abfad35b3b83943b2e5d3740e7dff73556d0e3fc55f821f9064eb18b125b86a260f672226cdc155ac81debd2dc96c4bce7ec723ad8e786ad8990cd8b92afc3e83b759f9ba3d900d572ae7cdb6fe6b7b2d125c5904d698797d0c56b9cf65682af620542093c5699de5ae8b3f04a65412cf1230b3481cc34638a60348b2773b71d75adb474e5f5d7099baf30c10b7a726442c76efb923b7e0f0b5ba958a4345cc99a903d627bf731031698c9f087c391b397977f15d8c0dedbf634a6af9683450c05c7395fbf6f42aa8b7ba0dac02c3cb719361f8fd14f0bdf5f8d8c90c8e045a7a6e6c7868eb5766844f00a226465f397128c6015db4ba036c697f220489e0dcb3a55d5e67ba4fd7e846c30f51de61ca35b3908d2525f541e24f5337010d9dfca3ffbf10704def6e1c973365e423050f9e41306c6afc35fd86752e68e88361160f6e62b04a373ac77562eec136959f76a1ec8a7a92003167c098b4cb59d36c49676e8818062c535af60e05843b1295905b30af4edcc00ac7a64f39f68886f06ccc978b645
a12074fa660c36c8694e552f7173adef1c5cfc4c78f064e4a96d407c9a5c4fa5bc1f5bfdf27c17f8399024a9e6d10812c11339742ed1464a1d0d45accdc10bb6faf88bf54be82167d54fdedff92dec4a53b63c9c298e72658534ec3406e01cc2e5cf7a3f251b5bd18dcea0a7c08ae0b166b586276e5a30e911e65e156f548bbe55f32942cee59e1154922563568b56a8e008927fe44385149992d4121585f202b6663ed661018479b2e1f750b9c5e874aeb277f1d78c9bc84f2f0b9acffa5f2388c9613a72f74022b4006eb3f8468891716efbd67906a915bf6156a4be2e3ec44ce2820c7632e4ebb975bf185654a32afb62a0a2323fd8f05e7f9cd8e77f9bb52b459a8e2649e6a6fce8bb042c8d53c72f3ee34571c4cd2efdfadc2dd98864f2a76253e6a2a8d47806f37ab849e16d493ef8626f18c10e7bd41f434b261bad733985066cab55721a5a60b277bbed43f819e37052d698affbe82373b92f14af08f331048b22d6d7394113d80bbb61cbb190a817efa746b666f1848a557529163b96c1777e9dc60a8213600c28e1da64062652e3b9708d19b24816b702da7d4d0790f8f149ae1d66a8a6adcb22096ad29ff1f75c57f75ddb0044b066ae41adfbace000052db4d5ffa1dc4c8d1aaa5d8d88979974692e0bbaf9c8566325a3e691ca0c89ac144a24c3e550030e2c78556603b68b4f1aacf790aa6dd2462042ec45e424ea7b34952c29d6a6d034a87277c0890e5f99df16deb04b212da554975e79ccce8a93686c7326e9bb526148ccca5609c5a9845dd90e8111c412af29744586848b73523e0fd568f8059900a0538896f07c9e530be97072c0865f8501e465bc14dd87a0327b585edea2451b781dde731649bb8cdfa0c419aecaea36b9c81ad46e9b0582bd1eda8da53507d858b4e737aa0978cc87b19e2480c48d64946264eebe52371d378cb4fec03fa7c32674bdae4c21f0a3865b94121af62cafa508b1ff4b3d3e9c59654c43c7a04c0573a2d13f9dcc1721d61c03880c7f2b411cceb7e96c4eb47f73ccc2037791a47f38f8e76b3adefa983e0d39ed7ae34ea88b18365859d529485479b564f730c41618d89b7a69cdfd2dc317e8c29c73e4f24f24d0c39420e7d2238d545012e2cc782f6a910125b169a6ac141593e7b2a137cd80633636aa99d9211be222230e05a776383640e89125fbeb9ac5ea02753f6f2807781b27a2cb1829ab2d8d1f76ffee469a22e93a51e8a37433216199e6824129de2316fd703e4b9ee1db9f3c00f12113f62870d06b79dee99ba827479ddb835dbe91815307f846cf9febf804a812bda8267d21354ec6e58f8f64ad3bd1a8315588bdd366084ef4eae7d73d8d81a1a2de9b502819bbe89138d6870faf76ccf1a8acd964ea952ffdd162396804
ec83ea294a9972f7bb1f90e13e878bec4bad3b8ea88a89824e75f826d9bbed7a66d8c06f8861a1b867420a4b4fc0f0e2aeea1aa683c3f0950f63cb6e71e59daae81d45c3f83eee010651bcc73af53166de3e91eccf7fcb8147dca45deefa54610a6e48fef5640ea6da58986927927e56fa721bda2c54f0aaa0afb83515fa44933d7e4a0ea199eaf2aca48bd02f6393c3edc900a73983428d80da61b853e50f2365954302e8a980abbffbcf954649883a7f95298dc2904d7605b104321a56ddfd56e8ac3b2562d2f5ef8337c3587357a1840f82f953ef313acef1376a7586efa9a1b48c01c15e9e46db68feb4d14540b7be3be97e1c7707f849927fd6a77b7b0ae81995015399644bdb57c6a3d45b980a00e89565a386d17b75d3d5bbd6a013f94899df454da3930de7eec518e1d4c5bd6f482ef2bb4cda97bffaf56fe63685bb440791b4e730ece8749ebe417c85ae8fd4215a435de6ef1882e40d8591b1081a391bb659d16903450d32a4961dfab7236cb17494b7cba40852282e0d07c9e3b45f1755162cdd5589c4e1ae6f284157234a5f4388d0b41628d645338306f2791f7356cfa421793df6db096a394380b2ee8af35ea877e601833ec48e416f3cd6ce6b42dadd00b0ce9a38dcb13f992efa55a80e9a82a2607bea27e7ca7f13393a750034cbf8f1414376cfb61b5baaec98b8e44e329802016fe790705e6135461e92434f3ccfc1c0113607f6df59796080569c6af1dca996440c4b7a943f021dc7bd4f674f2cbff39b1dce1208c11227bb7f475782a1577a02e7c9d231605a37c69798dbde6199411bc478e6ad8ebedf8526a7c937e0ecdc0025bd0e72f5d0cb1c3638efa38512e08750f70c6e7d6e5f4d4ecdd9a15afc93016351d2a11ecdb918495a54b05c0f91b3dcc69f09b0d518b019a006f21190f30d4a4e241256ca68fac15a0d68066036c4f88749ed5fbcc02842e48f7bcbe0213583d66df3e2cfe68fa17e69579953a03371bf7963d83385a4a677f81f99084c1c58abb7c85410281bc38e8a1a7e8c55612c841b5f1156d59a9cace8a761fb83b4f81b011efc1141b19866def4634707b267e637b7a49d8dde11ee0f5462d5895542da0bccf91c3e08e959065198ce706f4eee111ef76b35d1ec9dd45f64c47a1ac0d3eb1f3d16709e2a3d18773a76cc62d0d1829fc8420f925c76fa27c1dc562e1e2431fc613ad2774335c880d6534a532476faf8436f93d66dd0f90722dd7aad31c72d4aaca0e4ce006c9d49de9f5813a441d675cf12ebdcb5054f9ac3b04428529ff8939c7a9fba500beda2776e2a2cffa30cf880b56f39a546e240e4354688bb1c56cbaf5d2daceabc61ac383adde772a323c9f8ddf43bba0329b9fc6f3a33055d5f642ae987e75c47020cbd4db07c6f
dcc0cc7c5b9892b9c902d6428f7f6e7498d5a0f22157533d7b5b63bd80c73f94e1740b8fb6ea7d6ed52c2fc0609ed3a38760f8ec55b5b89d395e7694c3e4c22863000e09e411252a567423c8e3a10ad74cf15f8845424986a08878236ce564aced466e715e96e5da4f12555f693eea8b4334d8bf2dbaed679e61d53f681ba1a2b033d26499d0f2f520266a275140a190ca16f351e6aa8fb6b558fea0453d8d57301fea2cd8331896d044c196a006ed2bdd8bcd355c384c18e1a7e42ace72cdc807c5ad5493f3e87946a348a725dd3255b2bacf428d33aab4fb4db84db534e3ead062612ce67b8f2930358e4ff184c1b569651cd9bac791c4d3f96110653fa81b72927002ada53b20b206ba08eefa2ac27b72753ad3de595ae555f383f3462d835f27e3dcd601176f740b45d701a4830fd0caf5b0162ce05888650103d7e9936cf81eb29c1a3e2fa0c6b2a9707eb35d4ed463136eebce40d0d3fb388e168dd014c27f00bb4c3bc6588e1d9f1b9bc03e91d855616549e6b9a77877fec498abaeb083e3c06ab1f0de31b02b68119bf6a6712c183067ed397d1de9a128c8cd5688397a209114707eecc5bd53c0e52f69cee22e7b2f474556b15a7589f9513f91153a9758eef13f50271688aa3cf23f684c2977dac4410d64a3f221f49e1be6fc09cac70033cdc4a0001aaea9abadb8bdae8c6e0466eb943669366ac68a2b1712c05605cb21831c6a02875aa6474641d3ae085f6f2edc0379b34eabf39b780a01fec97c21da956e79040fcac0708116091c0dbec09183b5f8283ced4d7e456607374b8ceec1325283141435167b1b5e1f730b30617809aa3a8e1eb0c12192abd1c709de80000000000000000000000000000000000000008121925272f3438",
+ "signer": "4f7a0233329a15700e4b177f6ca147fd1679904cd4bb1886f146fd7532d14a8f",
+ "timestamp": "2025-12-21T10:39:38.222913+00:00"
+}
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json.sha256 b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json.sha256
new file mode 100644
index 0000000..981fff0
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json.sha256
@@ -0,0 +1 @@
+232722db9fe133aea3923565695324807cd7a74e81a6db2db360a18e9aa4181f /root/tmp/hosted_repo_update/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_record.json
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json
new file mode 100644
index 0000000..8e00f3d
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json
@@ -0,0 +1,4 @@
+{
+ "citation_id": "if://citation/2ec551ec-0a08-487d-a41d-4d068aa8ee2f/v1",
+ "content_hash": "2a19ce17e4f936ed3ed5b140d77410bf99d41ba22489920e1e4cdec5b7b3a76a"
+}
diff --git a/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json.sha256 b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json.sha256
new file mode 100644
index 0000000..ef63300
--- /dev/null
+++ b/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json.sha256
@@ -0,0 +1 @@
+86efb752980e12b64a2c36c82d84bf3116cb3a48ef621e288691fa29a6902c63 /root/tmp/hosted_repo_update/emo_trace_payload_96700e8e-6a83-445e-86f7-06905c500146.ttt_chain_ref.json