šŸ”„ Awesome Controllable Generative Models

I’m maintaining a curated collection of recent research papers on controllable generative models!

About This Collection

A continuously updated collection of recent (2023–2025) research papers on controllable generative models, with a special focus on both UNet-based and Transformer-based diffusion architectures.

This list emphasizes core advances in:

  • 🧭 Control mechanisms – including condition injection, adapters, and multi-modal control
  • šŸ‘ļø Attention interpretation – revealing what diffusion models focus on
  • šŸŽ›ļø Frequency-based control – using spectral domain knowledge to guide generation
  • šŸ” Alignment & knowledge transfer – enabling more coherent, faithful, and data-efficient synthesis
  • šŸ§‘ā€šŸŽØ Image-to-image (I2I) editing – flexible, structure-preserving transformation across domains
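
To make the frequency-based control idea above concrete: one common trick in this line of work is to transplant the low-frequency band of a reference image's spectrum into a generated image, preserving global structure and color layout while leaving high-frequency detail untouched. Below is a minimal NumPy sketch of that operation; the function name, the `cutoff` parameter, and the hard band replacement are illustrative choices of mine, not the method of any specific paper in the list.

```python
import numpy as np

def low_freq_swap(generated, reference, cutoff=0.1):
    """Replace the low-frequency band of `generated` with that of `reference`.

    Both inputs are 2D float arrays (e.g. grayscale images or latent maps);
    `cutoff` is the fraction of the spectrum around DC treated as "low".
    This is an illustrative sketch, not an implementation from a paper.
    """
    # Centered 2D spectra: DC ends up in the middle after fftshift.
    Fg = np.fft.fftshift(np.fft.fft2(generated))
    Fr = np.fft.fftshift(np.fft.fft2(reference))

    h, w = generated.shape
    cy, cx = h // 2, w // 2
    ry, rx = int(h * cutoff / 2), int(w * cutoff / 2)

    # Copy the centered low-frequency block from the reference spectrum.
    Fg[cy - ry:cy + ry + 1, cx - rx:cx + rx + 1] = \
        Fr[cy - ry:cy + ry + 1, cx - rx:cx + rx + 1]

    # Back to the spatial domain; imaginary residue is numerical noise.
    return np.real(np.fft.ifft2(np.fft.ifftshift(Fg)))
```

In practice, papers in this category often blend the two bands with a soft mask or a per-frequency weight rather than hard-replacing a rectangular block, and apply the operation in latent space at selected diffusion timesteps.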

Check it out!

šŸ‘‰ GitHub Repository: Awesome-Controllable-Generative-Models-Papers

⭐ 36 stars and growing!

The list covers major conferences (CVPR, ICCV, ECCV, ICML, NeurIPS, ICLR, etc.) and recent arXiv preprints. Perfect for researchers and developers exploring controllable synthesis.

šŸ’” Contributions are welcome! If you know a paper we missed or are working on a new controllable generation method, feel free to submit a pull request or open an issue.