Biography
I am a final-year Ph.D. student at Peking University, advised by Prof. Bin Cui and Prof. Luxia Zhang, and an incoming postdoctoral research fellow at Princeton University, fortunately working with Prof. Mengdi Wang. My research focuses on developing advanced algorithms and frameworks for LLMs and diffusion models. I previously worked with Yang Song, Guohao Li, Shuicheng Yan, Ming-Hsuan Yang, Bernard Ghanem, Stefano Ermon, and Jure Leskovec. I serve as a program committee member or reviewer for international conferences and journals including SIGGRAPH, TPAMI, ICML, ICLR, NeurIPS, CVPR, KDD, and AAAI. Feel free to contact me for potential collaborations or discussions.
Email | WeChat | Github | Google Scholar | Twitter
We have open positions for Ph.D. students, Master's students, and research interns (not limited to PKU and Princeton University; remote work is possible). I also lead a research team and have directed a series of works on diffusion models and LLMs, including RPG-DiffusionMaster, Buffer of Thoughts, SuperCorrect, ReasonFlux, VideoTetris, Consistency Flow Matching, and IterComp. If you are interested, please contact me directly!
Research Summary
My goal is to build powerful AI models capable of understanding, generating and reasoning with high-dimensional data across diverse modalities. I currently focus on developing advanced generative models, including their training methodologies, architecture design, alignment, inference efficiency and applications. I am also interested in generative modeling as a tool for scientific discovery.
Generative Model Foundations
- Diffusion Theory and Framework: RPG, ContextDiff, Consistency Flow Matching, Diffusion-Sharpening, Rectified Diffusion, ConPreDiff, SADM
- LLM Reasoning: Buffer of Thoughts, SuperCorrect, ReasonFlux
- Multimodal LLM: HermesFlow
- Agent Framework: ScoreFlow, Multi-Agent Collaborative Data Selection
Generative Applications
- Multimodal Generation (Image/3D/4D): IterComp, VideoTetris, EditWorld, SemanticSDS, Trans4D, IPDreamer
- AI for Science: IPDiff, IRDiff, BindDM
What's New
- I release ReasonFlux, beating OpenAI o1-preview and DeepSeek-V3 with hierarchical reinforcement learning on 8 GPUs.
- 6 papers about LLMs and Diffusion Models are accepted by ICLR 2025.
- I propose SuperCorrect, achieving new SOTA LLM reasoning performance among all 7B models.
- I propose IterComp, leveraging iterative RLHF to achieve fast and realistic text-to-image generation.
- 5 papers about Diffusion Models and LLMs are accepted by NeurIPS 2024.
- I propose Consistency Flow Matching, converging 4.4x faster than Consistency Models and 1.7x faster than Rectified Flow while achieving a better FID.
- I propose a new RAG-based LLM reasoning framework, Buffer of Thoughts (NeurIPS 2024 Spotlight).
- I release VideoTetris, the first framework for compositional text-to-video generation.
- 2 papers about Diffusion Models and AI for Science are accepted by ICML 2024.
- One paper about general/molecular graph diffusion is accepted by TKDE 2024.
- One paper about an improved training algorithm for Diffusion Transformers (DiT), DDPMs, and Score SDEs is accepted by CVPR 2024.
- I release our SOTA LLM-controlled diffusion model, RPG-DiffusionMaster.
- 3 papers about Diffusion Models, GNNs, and AI for Science are accepted by ICLR 2024.
- Our paper about protein-aware 3D molecular diffusion models is accepted by AAAI 2024.
- Our survey about Diffusion Models is accepted by ACM Computing Surveys 2023, in collaboration with OpenAI.
- One paper about text-to-image diffusion is accepted by NeurIPS 2023.
- I publish a book about Diffusion Models.
- One paper is accepted by TNNLS 2023.
- One paper is accepted by TKDE 2023.
- 2 papers are accepted as ICML 2022 Spotlight.
- One paper is accepted by CVPR 2020.
Selected Papers [Full List]
- Buffer of Thoughts: Thought-Augmented Reasoning with Large Language Models
Ling Yang, Zhaochen Yu, Tianjun Zhang, Shiyi Cao, Minkai Xu, Wentao Zhang, Joseph E Gonzalez, Bin Cui
NeurIPS 2024 spotlight paper | repo | tweet
- ReasonFlux: Hierarchical LLM Reasoning via Scaling Thought Templates
Ling Yang, Zhaochen Yu, Bin Cui, Mengdi Wang
paper | repo | tweet
- Mastering Text-to-Image Diffusion: Recaptioning, Planning, and Generating with Multimodal LLMs
Ling Yang, Zhaochen Yu, Chenlin Meng, Minkai Xu, Stefano Ermon, Bin Cui
ICML 2024 paper | repo | tweet
- IterComp: Iterative Composition-Aware Feedback Learning from Model Gallery for Text-to-Image Generation
Xinchen Zhang*, Ling Yang*, Guohao Li, Yaqi Cai, Jiake Xie, Yong Tang, Yujiu Yang, Mengdi Wang, Bin Cui
paper | repo | tweet
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency
Ling Yang, Zixiang Zhang, Zhilong Zhang, Xingchao Liu, Minkai Xu, Wentao Zhang, Chenlin Meng, Stefano Ermon, Bin Cui
paper | repo | tweet
- VideoTetris: Towards Compositional Text-to-Video Generation
Ye Tian*, Ling Yang*, Haotian Yang, Yuan Gao, Yufan Deng, Jingmin Chen, Xintao Wang, Zhaochen Yu, Xin Tao, Pengfei Wan, Di Zhang, Bin Cui
NeurIPS 2024 paper | repo | tweet
- Dpgn: Distribution propagation graph network for few-shot learning
Ling Yang, Liangliang Li, Zilun Zhang, Xinyu Zhou, Erjin Zhou, Yu Liu
CVPR 2020 paper | repo
Advising Experience
Zhaochen Yu (Master's student at National University of Singapore)
Xinchen Zhang (Master's student at Tsinghua University)
Ye Tian (incoming Ph.D. student at PKU)
Bohan Zeng (incoming Ph.D. student at PKU)
Zhilin Huang (Ph.D. student at Tsinghua University)
Zhilong Zhang (incoming Ph.D. student at Tsinghua University)
Yinjie Wang (Ph.D. student at The University of Chicago)
Awards
- Selected to present a talk at the KAUST Rising Stars in AI Symposium (24 people in the world), 2025
- Selected for AI Elite Forum of WAIC (20 people in the world), 2024.
- Selected for the distinguished student forum of VALSE (8 people in China), 2024.
- Selected for Annual Outstanding Author of Electronics Industry Press, 2023.
- Selected for the TechBeat Influencers List for two consecutive years (2023 list and 2024 list; 20 people in China).
- Baidu Scholarship Nominee (20 people in the world), 2023.
- National Scholarship for Ph.D. students (Top 1% in PKU), 2022.
- Exceptional Award for Academic Innovation for Ph.D. students (Top 1% in PKU), 2022.
- First-class Academic Scholarship, 2018, 2019, 2020.