
Generative Quanta Color Imaging

Published 28 Mar 2024 in cs.CV and cs.AI | arXiv:2403.19066v1

Abstract: The astonishing development of single-photon cameras has created an unprecedented opportunity for scientific and industrial imaging. However, the high data throughput generated by these 1-bit sensors creates a significant bottleneck for low-power applications. In this paper, we explore the possibility of generating a color image from a single binary frame of a single-photon camera. We find this problem particularly difficult for standard colorization approaches due to the substantial degree of exposure variation. The core innovation of our paper is an exposure synthesis model framed as a neural ordinary differential equation (Neural ODE) that allows us to generate a continuum of exposures from a single observation. This ensures consistent exposure in the binary images that colorizers take as input, resulting in notably enhanced colorization. We demonstrate applications of the method in single-image and burst colorization and show superior generative performance over baselines. The project website can be found at https://vishal-s-p.github.io/projects/2023/generative_quanta_color.html.


Summary

  • The paper introduces a Neural ODE framework that synthesizes a continuum of exposures from binary single-photon data to enhance color imaging quality.
  • It leverages convolutional filter decomposition to control exposure transitions, significantly improving colorization in low-light conditions.
  • Empirical evaluations show superior performance over traditional methods, validated with real-world CMOS and QIS camera data.

Generative Quanta Color Imaging: A New Paradigm for Single-Photon Imaging

Introduction

The field of single-photon imaging has advanced rapidly in recent years, fueled primarily by the development of single-photon cameras. These sensors, including single-photon avalanche diodes (SPADs) and quanta image sensors (QIS), are known for their exceptional photon-counting capabilities. This progress opens new avenues for imaging in environments with minimal light, enabling high-speed, high-dynamic-range, low-bit imaging solutions. Despite these advancements, the high data throughput generated by 1-bit sensors poses a considerable challenge, especially for low-power applications. Addressing this bottleneck, we explore an approach that recovers a color image from a single binary frame captured by these cameras. The cornerstone of our methodology is an exposure synthesis model built on a neural ordinary differential equation (Neural ODE), which generates a continuum of exposures from a single observation.

Core Contributions

Our paper brings forth two primary contributions:

  • The development of a neural ODE-based framework to synthesize a continuum of exposures not initially available in the measurement. By adjusting the integration interval, we can derive convolutional filters tailored for generating image representations at desired exposure levels.
  • A theoretical and empirical demonstration showing how controlled variations in filter atoms can steer exposure changes in the output image, resulting in enhanced colorization quality.
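The first contribution can be sketched in miniature: the target exposure level sets the end time of an ODE integration over filter atoms. Everything below is a stand-in (a fixed linear dynamics function and a plain Euler solver) for the learned network and adaptive solver a real implementation would use:

```python
import numpy as np

def atom_dynamics(atoms, t):
    """Stand-in for the learned dynamics f(z, t) of the filter atoms."""
    A = np.array([[0.0, -1.0], [1.0, 0.0]])  # placeholder linear field
    return atoms @ A.T

def integrate_atoms(atoms0, t_end, steps=100):
    """Euler-integrate dz/dt = f(z, t) from t=0 to t=t_end.

    Different values of t_end yield different filter atoms, and hence
    convolutional filters tuned to different target exposure levels.
    """
    z, t, dt = atoms0.copy(), 0.0, t_end / steps
    for _ in range(steps):
        z = z + dt * atom_dynamics(z, t)
        t += dt
    return z

atoms0 = np.array([[1.0, 0.0]])            # one 2-D "atom" for illustration
atoms_half = integrate_atoms(atoms0, 0.5)  # atoms for a shorter exposure
atoms_full = integrate_atoms(atoms0, 1.0)  # atoms for a longer exposure
```

Because the same initial atoms are integrated to different end times, exposures in between the observed ones come for free by choosing intermediate values of `t_end`.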

Methodology Overview

Our approach originates from the observation that the binary outputs of single-photon cameras exhibit substantial exposure variation, which standard colorizers handle poorly. To tackle this, we introduce an exposure synthesis model leveraging Neural ODEs. This model produces images at a continuum of exposure levels from a single binary input, supplying colorizers with consistently exposed inputs and yielding superior colorization results.
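To make the exposure problem concrete, the following toy simulation (not from the paper, but following the standard Poisson model for quanta sensors) shows how a 1-bit frame is formed and why a single binary frame carries limited exposure information while an ensemble of frames can recover it:

```python
import numpy as np

rng = np.random.default_rng(0)

def quanta_binary_frame(exposure, rng):
    """Simulate one 1-bit frame: a pixel fires if >= 1 photon arrives.

    `exposure` is the mean photon count per pixel (scene radiance times
    exposure time); photon arrivals follow a Poisson distribution.
    """
    photons = rng.poisson(exposure)
    return (photons >= 1).astype(np.uint8)

# A toy 4x4 "scene" with mean photon counts between 0.2 and 3.0.
scene = np.linspace(0.2, 3.0, 16).reshape(4, 4)

# Averaging many binary frames estimates P(bit = 1) = 1 - exp(-exposure),
# which inverts to a maximum-likelihood estimate of the exposure.
frames = np.stack([quanta_binary_frame(scene, rng) for _ in range(5000)])
p_hat = frames.mean(axis=0)
exposure_hat = -np.log(1.0 - np.clip(p_hat, 0.0, 1.0 - 1e-6))
```

A single frame gives each pixel only one bit; the inversion above needs many frames, which is exactly the data volume the paper's single-frame approach avoids.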

Exposure Synthesis with Neural ODEs: We adapt convolutional filter decomposition techniques alongside neural ODEs, allowing for the efficient encoding of filter parameters. This configuration ensures the smooth transition of filter atoms according to the exposure levels, efficiently generating exposure-corrected binary images without the need for extensive training datasets covering a wide exposure range.
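The decomposition idea can be sketched with hypothetical dimensions: every convolutional filter is a linear combination of a small shared bank of atoms, so moving only the atoms (for example, along an ODE trajectory) re-parameterizes the entire filter bank at once:

```python
import numpy as np

# Hypothetical sizes; the paper's actual dimensions may differ.
K, num_atoms, num_filters = 3, 6, 16
rng = np.random.default_rng(1)

atoms = rng.standard_normal((num_atoms, K, K))          # shared atom bank
coeffs = rng.standard_normal((num_filters, num_atoms))  # per-filter weights

# Reconstruct the full filter bank (num_filters, K, K): each filter is
# a coefficient-weighted sum of the shared atoms, so only num_atoms
# small tensors need to evolve with the exposure level.
filters = np.tensordot(coeffs, atoms, axes=([1], [0]))
```

This is why the method avoids training data spanning a wide exposure range: the coefficients stay fixed, and only the low-dimensional atom bank varies with exposure.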

Colorization: Our method supports both single-image and burst-based colorization. The former employs a dedicated colorization network to colorize a single exposure-corrected binary image; the latter utilizes a set of binary images spanning a range of exposures, enabling more effective multi-exposure colorization.
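The burst strategy's input can be illustrated as follows. This is a toy construction, not the paper's pipeline: binary frames are simulated at several exposure scales (as the exposure synthesis model would provide) and fused with a naive per-pixel mean where a learned colorization network would consume the full stack:

```python
import numpy as np

rng = np.random.default_rng(2)
scene = np.linspace(0.2, 3.0, 16).reshape(4, 4)  # mean photons per pixel

# Hypothetical burst: 1-bit Poisson observations of the scene at
# several synthesized exposure scales.
exposure_scales = [0.25, 0.5, 1.0, 2.0, 4.0]
burst = np.stack([
    (rng.poisson(scene * s) >= 1).astype(np.float32)
    for s in exposure_scales
])  # shape: (num_exposures, H, W)

# Naive fusion baseline: per-pixel mean across the burst. Short
# exposures preserve highlights, long exposures lift shadows, so the
# stack carries more tonal information than any single frame.
fused = burst.mean(axis=0)
```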

Experimental Insights

Our empirical evaluations underline the efficacy of the proposed method, showcasing superior performance in generating high-quality colorized images from binary inputs when compared against several baselines. The approach demonstrates remarkable adaptability to varying exposure conditions, significantly outperforming traditional methods in both qualitative and quantitative assessments. Moreover, tests on real-world data captured from CMOS and prototype QIS cameras further validate the practical applicability of our method, highlighting its potential in real-world single-photon imaging scenarios.

Future Prospects

This research opens the door to new possibilities in the field of generative imaging, especially in contexts where data compression is paramount. The introduction of Neural ODEs in this domain paves the way for further exploration into efficient, adaptive imaging techniques. The proposed methodology not only demonstrates an extreme case of data compression but also sets a foundation for future advancements in augmented/virtual reality applications, where power efficiency and data throughput are critical concerns.

Conclusion

In summary, our work presents a novel approach to the challenge of generating color images from binary frames in single-photon imaging systems. By harnessing neural ODEs for exposure synthesis, we achieve notable improvements in colorization quality, with promising implications for low-power, high-speed imaging applications. This study lays the groundwork for further development in the field, underscoring the role of generative models in advancing single-photon imaging technology.
