Hubert Strauss
hs6702[at]princeton.edu
Hi! I am a Research Engineer at Princeton University, within the Princeton Language & Intelligence Initiative.
My current work focuses on reinforcement learning for language models, from studying how reward signals shape learning and optimization to building efficient systems and infrastructure for RL training. Since generation lies at the core of these systems, I am also interested in efficient generation methods.
I previously spent time at o9 Solutions (ML team) and at Dematic (Operations Research and Applied AI team).
I obtained my engineering degree (BSc./MSc.) at Arts et Metiers ParisTech and studied optimization (MSc.) at Georgia Tech.
news
| 2026-Jan | FutureFill has been accepted to ICLR 2026. See you in Rio! |
|---|---|
| 2025-Aug | Our work providing an optimization perspective on what makes a good reward model for RLHF has been accepted to NeurIPS 2025. See you in San Jose! |
| 2025-May | Our work on Hardware-Efficient Attention for Fast Decoding has been accepted to COLM 2025. See you in Montreal! |
| 2024-Jun | I joined Princeton Language & Intelligence (Princeton University) as a Research Engineer. |
publications
- FutureFill: Fast Generation from Convolutional Sequence Models. In International Conference on Learning Representations (ICLR 2026).