Randomness has always been essential not only for cryptography, but also for applications such as differential privacy and generative AI. Recent work on generative image and video synthesis has shown that knowing or recovering the seed of the pseudorandom sequence can leak sensitive information such as user prompts. At the same time, major ML libraries (e.g., TensorFlow, JAX) have been shown to rely on GPU-oriented PRNGs built from cryptographically insecure PRFs, originally designed for fast parallel generation rather than adversarial robustness. This creates an urgent need for cryptographically sound, GPU-efficient alternatives.
More broadly, the growing reliance on GPUs in AI and high-performance computing motivates the development of GPU-optimised symmetric primitives. Yet this design space remains largely unexplored: many operations are extremely cheap on GPUs but have not yet received the rigorous cryptanalysis needed to build on them with confidence. The proposed workshop aims to bring together experts in machine learning, cryptography, cryptanalysis, and hardware to exchange knowledge, clarify requirements, and discuss strategies and challenges for designing secure, GPU-friendly PRFs, ciphers, and PRNGs, with a particular focus on their use in modern ML stacks.
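The GPU-oriented generators mentioned above follow a counter-based design: each output block is simply PRF(key, counter), so any block can be computed independently, e.g. one counter per GPU thread. The sketch below illustrates this structure only; it uses SHA-256 as a stand-in PRF rather than the actual round functions of the generators shipped in these libraries.

```python
import hashlib
import struct

def prf_block(key: bytes, counter: int) -> bytes:
    """Stand-in PRF: one 32-byte output block per (key, counter) pair.
    Real GPU PRNGs use much cheaper (and weaker) round functions here."""
    return hashlib.sha256(key + struct.pack("<Q", counter)).digest()

def random_bytes(key: bytes, start: int, n_blocks: int) -> bytes:
    """Counter-based generation: blocks are independent of one another,
    so any sub-range of the stream can be produced in parallel."""
    return b"".join(prf_block(key, c) for c in range(start, start + n_blocks))

key = b"\x00" * 16
# Block 3 computed in isolation matches block 3 of a sequential run:
assert random_bytes(key, 0, 4)[96:128] == prf_block(key, 3)
```

The security of the whole stream reduces to the PRF security of the block function, which is exactly the property the fast parallel designs were not built to provide.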
TBA
| Time | Session |
| --- | --- |
| 10:20 – 11:30 | Introduction |
| 11:30 – 12:15 | Talk by TBD: TBD |
| 12:15 – 13:00 | Invited Talk by TBD: TBD |
| 13:00 – 14:30 | Lunch |
| 14:30 – 15:20 | Invited Talk by François-Xavier Standaert (shared session with SPRING): Prime Field Masking and Hard Physical Learning Problems: Design and (Crypt)Analysis Challenges |
| 15:20 – 16:00 | Coffee Break |
| 16:00 – 16:45 | Invited Talk by Cihangir Tezcan: TBD |
| 16:45 – 16:50 | Closing remarks |
Photo by Wiki user Nicholas Hartmann.