
Detecting contradictions in legal documents
Duration
October 2025 – February 2026
Product and team
Atticus
Verification stream
Role
Experience shaping and discovery, AI workflows, evaluation heuristics, experimentation, vibe-prototyping.
Statements are negotiable, but integrity is not.
Corporate reporting is under growing scrutiny. As economic volatility rises, regulators demand more detailed disclosures — and even a single number can carry legal liability. Yet verification still relies on fragmented manual checks across teams and document versions, an approach that struggles as reports grow longer and more complex. Atticus was designed to modernise this process, making verification collaborative and secure by default, but one critical gap remained: maintaining consistency as documents evolve.
As LLMs began proving their potential in the legal domain, an opportunity emerged to build a 0→1 generative AI capability with no existing playbook.
On this project, I drove alignment and shaped product direction on a three-year challenge in under five months, through continuous discovery, rapid experimentation, and strategic framing.
Key decisions I drove:
Establishing consistent content frameworks for non-deterministic AI outputs
Building a human-centered evaluation model to scale product quality
Stress-testing AI workflows under real-world constraints
Further details, including process and outcomes, are available on request, given the confidential nature of this work.