Toward Real Flare Removal: A Comprehensive Pipeline and A New Benchmark (2306.15884v1)

Published 28 Jun 2023 in cs.CV

Abstract: When photographing under-illuminated scenes, complex light sources often leave strong flare artifacts in images, where intensity, spectrum, reflection, and aberration all contribute to the deterioration. Beyond degrading image quality, flare also harms the performance of downstream visual applications. Removing lens flare and ghosts is therefore a challenging problem, especially in low-light environments. Existing methods for flare removal are mainly limited by inadequate simulation and real-world capture: the categories of scattered flares are singular, and reflected ghosts are unavailable. A comprehensive deterioration procedure is thus crucial for constructing a flare-removal dataset. Based on theoretical analysis and real-world evaluation, we propose a well-developed methodology for generating data pairs with flare deterioration. The procedure is comprehensive, realizing both the similarity of scattered flares and the symmetric effect of reflected ghosts. Moreover, we construct a real-shot pipeline that separately processes the effects of scattering and reflective flares, aiming to directly generate data for end-to-end methods. Experimental results show that the proposed methodology adds diversity to existing flare datasets and constructs a comprehensive mapping procedure for flare data pairs. Our method helps data-driven models achieve better restoration of flare-corrupted images and provides a better evaluation system based on real shots, promoting progress in the area of real flare removal.
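The abstract describes generating (degraded, clean) data pairs by compositing flare deterioration onto clean images. A minimal sketch of that idea, assuming the common additive-in-linear-radiance flare model (not the paper's exact procedure): linearize the gamma-encoded clean image and flare layer, add them, clip, and re-encode. The function name and gamma value are illustrative assumptions.

```python
import numpy as np

def add_flare(clean, flare, gamma=2.2):
    """Composite a flare layer onto a clean image to form a data pair.

    Flare is approximately additive in linear radiance, so we linearize
    the gamma-encoded inputs, add them, clip to the displayable range,
    and re-encode. Inputs are float arrays in [0, 1].
    """
    lin = np.clip(clean, 0.0, 1.0) ** gamma + np.clip(flare, 0.0, 1.0) ** gamma
    return np.clip(lin, 0.0, 1.0) ** (1.0 / gamma)

# Toy 2x2 grayscale example: (degraded, clean) forms one training pair;
# the flare array stands in for a rendered scattered-flare layer.
clean = np.array([[0.2, 0.4], [0.6, 0.8]])
flare = np.array([[0.5, 0.0], [0.3, 0.9]])
degraded = add_flare(clean, flare)
```

Real pipelines of this kind additionally randomize flare position, color, and intensity, and render reflected ghosts symmetrically about the optical center, which is the diversity the paper argues existing datasets lack.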
