Xi Wang

CS PhD student @ NYU, advised by Shengjie Wang.

I build generative models for language, biological sequences, chemistry, and robotics. I believe a good generative model should generate under formal and physical constraints, follow instructions faithfully rather than improvise, and produce outputs that are grounded in reality — valid molecules, foldable proteins, feasible robotic actions. My research explores how to make this possible across domains.

Generative Models · Language · Biological Sequences · Robotic Learning
New York University (Courant) · New York, NY
esche.wang@outlook.com

Research

My work incorporates formal, physical, and scientific constraints into generative models across multiple domains, making generation both expressive and grounded.

Selected Publications

News

2026
New preprint on arXiv: Rooted Absorbed Prefix Trajectory Balance with Submodular Replay for GFlowNet Training.
ICLR 2026: 3DCS accepted (Poster).
2025
RxnBench released on arXiv.
CLC-DB published in Journal of Cheminformatics.
AtropDiff won Outstanding Paper Award at ICLR 2025 DelTa.
Chiralcat published in Artificial Intelligence Chemistry.
SubA descriptor paper published in JCIM.
2023
Uni-RNA posted on bioRxiv.