In this episode, Anna Rose welcomes back Daniel Kang, professor at UIUC and founding technical advisor at VAIL, for an update on ZKML and how the space has evolved since early 2023. Daniel covers the 2023-2024 cohort of ZKML tools, including zkCNN, zkLLM, EZKL, and his original ZKML project, while introducing his new project ZKTorch, which offers a flexible hybrid of specialized and general-purpose approaches.
The discussion explores practical applications like verified FaceID, proof of prompt, and proof of training, along with the technical challenges of adding ZK proofs to machine learning models. Daniel shares insights on the performance trade-offs between specialized cryptographic systems and generic circuits, and how ZKTorch aims to offer both flexibility and speed for proving ML inference.
Related links:
- ZKTorch: Open-Sourcing the First Universal ZKML Compiler for Real-World AI
- ZKTorch: Compiling ML Inference to Zero-Knowledge Proofs via Parallel Proof Accumulation by Bing-Jyue Chen, Lilia Tang, Daniel Kang
- ZKTorch GitHub
- Episode 369: Ligero for Memory-Efficient ZK with Muthu
- Episode 356: ZK Benchmarks with Conner Swann
- Episode 364: AI and ZK Auditing with David Wong
- Episode 265: Where ZK and ML intersect with Yi Sun and Daniel Kang
- Bonus Episode: zkpod.ai & Attested Audio Experiment with Daniel Kang
- ZK13: ZKTorch: Efficiently Compiling ML Models to Zero-Knowledge Proof Protocols – Daniel Kang
- AI Agent Benchmarks are Broken
- VAIL
- zkCNN: Zero Knowledge Proofs for Convolutional Neural Network Predictions and Accuracy
- zkLLM: Zero Knowledge Proofs for Large Language Models
- MLPerf Inference: Datacenter
Check out the latest jobs in ZK at the ZK Podcast Jobs Board.
**If you like what we do:**
* Find all our links here! @ZeroKnowledge | Linktree
* Subscribe to our podcast newsletter
* Follow us on Twitter @zeroknowledgefm
* Join us on Telegram
* Catch us on YouTube
**Support the show:**
* Patreon
* ETH – Donation address
* BTC – Donation address
* SOL – Donation address