Biography

I am Ling Yang, a final-year Ph.D. student at Peking University, advised by Bin Cui and Luxia Zhang. My research interests are Generative Modeling (Diffusion Models, LLMs) and AI for Science. I previously worked with Yang Song, Guohao Li, Shuicheng Yan, Ming-Hsuan Yang, Bernard Ghanem, Stefano Ermon, Mengdi Wang, and Jure Leskovec. I serve as a program committee member or reviewer for international conferences and journals including SIGGRAPH, TPAMI, ICML, ICLR, NeurIPS, CVPR, KDD, and AAAI. Feel free to contact me for potential collaborations or discussions.
Email | WeChat | GitHub | Google Scholar | Twitter

Research Summary

Diffusion Models

Large Language Models

Representation Learning

What's New

  • I propose SuperCorrect, achieving new SOTA performance among all 7B models.
  • I propose IterComp, leveraging iterative RLHF to achieve fast and realistic T2I generation.
  • I propose SemanticSDS and Trans4D to enhance compositional text-to-3D/4D generation.
  • Five papers about Diffusion Models and LLMs (including Buffer of Thoughts, Spotlight) are accepted by NeurIPS 2024.
  • One paper about diffusion-based video frame interpolation is accepted by ACM Multimedia 2024.
  • I propose Consistency Flow Matching, converging 4.4x faster than the Consistency Model and 1.7x faster than Rectified Flow while achieving better FID.
  • I propose a new RAG-based LLM reasoning framework, Buffer of Thoughts, outperforming Tree of Thoughts.
  • I release VideoTetris, the first framework for compositional text-to-video generation.
  • Two papers about Diffusion Models and AI for Science are accepted by ICML 2024.
  • One paper about general/molecular graph diffusion is accepted by TKDE 2024.
  • One paper about an improved training algorithm for Diffusion Transformers (DiT), DDPMs, and Score SDEs is accepted by CVPR 2024.
  • I release our SOTA LLM-controlled diffusion model, RPG-DiffusionMaster.
  • Three papers about Diffusion Models, GNNs, and AI for Science are accepted by ICLR 2024.
  • Our paper about protein-aware 3D molecular diffusion models is accepted by AAAI 2024.
  • Our survey on Diffusion Models, written in collaboration with OpenAI, is accepted by ACM Computing Surveys 2023.
  • One paper about text-to-image diffusion is accepted by NeurIPS 2023.
  • I publish a book about Diffusion Models.
  • One paper is accepted by TNNLS 2023.
  • One paper is accepted by TKDE 2023.
  • Two papers are accepted as ICML 2022 Spotlight.
  • One paper is accepted by CVPR 2020.

Selected Papers [Full List]

  • Mastering Text-to-Image Diffusion: Recaptioning, Planning, and Generating with Multimodal LLMs.
    Ling Yang, Zhaochen Yu, Chenlin Meng, Minkai Xu, Stefano Ermon, Bin Cui
    ICML 2024 paper | repo | tweet

  • Buffer of Thoughts: Thought-Augmented Reasoning with Large Language Models
    Ling Yang, Zhaochen Yu, Tianjun Zhang, Shiyi Cao, Minkai Xu, Wentao Zhang, Joseph E Gonzalez, Bin Cui
    NeurIPS 2024 spotlight paper | repo | tweet

  • IterComp: Iterative Composition-Aware Feedback Learning from Model Gallery for Text-to-Image Generation
    Xinchen Zhang*, Ling Yang*, Guohao Li, Yaqi Cai, Jiake Xie, Yong Tang, Yujiu Yang, Mengdi Wang, Bin Cui
    paper | repo | tweet

  • Consistency Flow Matching: Defining Straight Flows with Velocity Consistency
    Ling Yang, Zixiang Zhang, Zhilong Zhang, Xingchao Liu, Minkai Xu, Wentao Zhang, Chenlin Meng, Stefano Ermon, Bin Cui
    paper | repo | tweet

  • VideoTetris: Towards Compositional Text-to-Video Generation
    Ye Tian*, Ling Yang*, Haotian Yang, Yuan Gao, Yufan Deng, Jingmin Chen, Xintao Wang, Zhaochen Yu, Xin Tao, Pengfei Wan, Di Zhang, Bin Cui
    NeurIPS 2024 paper | repo | tweet

  • DPGN: Distribution Propagation Graph Network for Few-Shot Learning
    Ling Yang, Liangliang Li, Zilun Zhang, Xinyu Zhou, Erjin Zhou, Yu Liu
    CVPR 2020 paper | repo

Awards

  • Selected for the Distinguished Student Forum of VALSE 2024 (one of 8 students in China).
  • National Scholarship for Ph.D. students (Top 1% at PKU), 2022.
  • Exceptional Award for Academic Innovation for Ph.D. students (Top 1% at PKU), 2022.
  • First-class Academic Scholarship, 2018, 2019, 2020.