Lingpeng Kong

This page summarizes our work on diffusion reasoning models (DREAMs).
Foundation.
  • [COLM 2024] A Reparameterized Discrete Diffusion Model for Text Generation. A framework that reparameterizes discrete diffusion models, improving text generation through better training and adaptive sampling (see the sampling sketch after this list).
  • [ICLR 2023] DiffuSeq: Sequence to Sequence Text Generation with Diffusion Models. DiffuSeq established a paradigm for using text diffusion models to handle sequence-to-sequence tasks.
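For a flavor of how these models decode, here is a minimal, hedged sketch of mask-based ("absorbing") discrete diffusion sampling: start from an all-mask sequence and iteratively commit the most confident predictions, in the spirit of adaptive unmasking. Everything here (ToyDenoiser, the schedule, the constants) is an illustrative stand-in, not the papers' actual implementation.

```python
# A minimal sketch of mask-based ("absorbing") discrete diffusion sampling.
# All names and constants are illustrative stand-ins, not the papers' code.
import torch

VOCAB, MASK_ID, SEQ_LEN, STEPS = 100, 0, 16, 8

class ToyDenoiser(torch.nn.Module):
    """Stand-in for a trained bidirectional transformer denoiser."""
    def __init__(self):
        super().__init__()
        self.emb = torch.nn.Embedding(VOCAB, 32)
        self.out = torch.nn.Linear(32, VOCAB)

    def forward(self, x):             # x: (batch, seq) token ids
        return self.out(self.emb(x))  # (batch, seq, vocab) logits

@torch.no_grad()
def sample(model, steps=STEPS):
    x = torch.full((1, SEQ_LEN), MASK_ID)            # start fully masked
    for t in range(steps):
        still_masked = x.eq(MASK_ID)
        if not still_masked.any():
            break
        conf, pred = model(x).softmax(-1).max(-1)    # per-position best token
        # commit the most confident fraction of the remaining masked positions
        k = max(1, int(still_masked.sum()) // (steps - t))
        conf = conf.masked_fill(~still_masked, -1.0)
        idx = conf.topk(k, dim=-1).indices[0]
        x[0, idx] = pred[0, idx]
    return x

print(sample(ToyDenoiser()))
```

With a trained denoiser in place of ToyDenoiser, this loop is a simplified version of the iterative-refinement decoding these papers study; RDM's contribution is, roughly, a principled reweighting of the training objective plus this kind of adaptive unmasking schedule.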
Reasoning.
  • [ICLR 2025] Beyond Autoregression: Discrete Diffusion for Complex Reasoning and Planning. Through the lens of subgoal learning, we show that discrete diffusion models significantly outperform autoregressive models on reasoning and planning tasks by better handling difficult subgoals.
  • [ICLR 2025] Implicit Search via Discrete Diffusion: A Study on Chess. DiffuSearch uses discrete diffusion to give language models implicit search capabilities, outperforming both searchless and MCTS-enhanced baselines in chess.
  • [NeurIPS 2024] Diffusion of Thoughts: Chain-of-Thought Reasoning in Diffusion Language Models. Chain-of-thought reasoning, but in the text diffusion world (a decoding sketch follows this list).
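To make the Diffusion of Thoughts bullet concrete: in a diffusion LM the chain of thought is not produced left to right; the question is clamped and all rationale and answer tokens are denoised jointly. Below is a hedged sketch of that conditioning, using the same toy setup as the Foundation sketch; none of these names come from the paper.

```python
# A hedged sketch of chain-of-thought decoding in a diffusion LM: clamp the
# question, then denoise every rationale/answer token jointly rather than
# left to right. Names and constants are illustrative stand-ins.
import torch

VOCAB, MASK_ID = 100, 0
toy = torch.nn.Sequential(torch.nn.Embedding(VOCAB, 32),
                          torch.nn.Linear(32, VOCAB))   # stand-in denoiser

@torch.no_grad()
def diffuse_cot(model, prompt, cot_len, steps=8):
    x = torch.cat([prompt, torch.full((1, cot_len), MASK_ID)], dim=1)
    clamp = torch.zeros_like(x, dtype=torch.bool)
    clamp[:, :prompt.size(1)] = True                    # never touch the question
    for t in range(steps):
        masked = x.eq(MASK_ID) & ~clamp
        if not masked.any():
            break
        conf, pred = model(x).softmax(-1).max(-1)
        k = max(1, int(masked.sum()) // (steps - t))    # unmask a fraction per step
        conf = conf.masked_fill(~masked, -1.0)
        idx = conf.topk(k, dim=-1).indices[0]
        x[0, idx] = pred[0, idx]                        # commit confident tokens
    return x

print(diffuse_cot(toy, torch.randint(1, VOCAB, (1, 5)), cot_len=12))
```

Roughly, the property the Reasoning papers exploit is that easy tokens anywhere in the rationale can be committed first, so a hard subgoal no longer has to be produced before everything that follows it.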
Scaling.
  • [ICLR 2025] Scaling Diffusion Language Models via Adaptation from Autoregressive Models. A method to convert pre-trained autoregressive models such as GPT-2 and LLaMA into competitive diffusion models through continual pre-training, making large-scale diffusion language models more accessible (a sketch of the training objective follows).
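As a rough picture of what the continual pre-training changes, here is a sketch of a standard masked-diffusion training objective on top of a backbone whose attention has been made bidirectional: sample a masking ratio, corrupt the batch, and reweight the masked-token cross-entropy by that ratio. The 1/t weighting follows common masked-diffusion practice, not necessarily the paper's exact recipe (which, e.g., also handles how the attention mask transitions from causal to full), and all names here are placeholders.

```python
# A sketch of a masked-diffusion continual pre-training loss, assuming the
# AR checkpoint has already been switched to bidirectional attention. The
# 1/t weighting is common masked-diffusion practice, not necessarily the
# paper's exact objective; all names are placeholders.
import torch
import torch.nn.functional as F

def masked_diffusion_loss(model, tokens, mask_id=0):
    """tokens: (batch, seq) ids; model maps ids -> (batch, seq, vocab) logits."""
    b, n = tokens.shape
    t = torch.rand(b, 1).clamp(min=1e-3)      # per-example masking ratio in (0, 1]
    masked = torch.rand(b, n) < t             # mask each token with probability t
    noisy = tokens.masked_fill(masked, mask_id)
    logits = model(noisy)
    ce = F.cross_entropy(logits.transpose(1, 2), tokens, reduction="none")
    # diffusion-style reweighting: masked-token cross-entropy scaled by 1/t
    return (ce * masked / t).sum() / masked.sum().clamp(min=1)

# smoke test with a toy bidirectional stand-in for the adapted backbone
toy = torch.nn.Sequential(torch.nn.Embedding(100, 32), torch.nn.Linear(32, 100))
print(masked_diffusion_loss(toy, torch.randint(1, 100, (2, 16))))
```

The appeal of adaptation over training from scratch is that the AR checkpoint already carries the language knowledge; only the attention pattern and the denoising objective need to change.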