Adversarial training applied to Convolutional Neural Network for photometric redshift predictions

Published 24 Feb 2020 in astro-ph.IM and eess.IV (arXiv:2002.10154v1)

Abstract: The use of Convolutional Neural Networks (CNNs) to estimate the galaxy photometric redshift probability distribution by analysing images in different wavelength bands has developed in recent years thanks to the rapid growth of the Machine Learning (ML) ecosystem. Authors have set up CNN architectures and studied their performance and some sources of systematics, using standard training and testing methods to ensure the generalisation power of their models. So far so good, but one piece was missing: is the model's generalisation power actually well measured? The present article shows clearly that very small image perturbations can fool the model completely, opening the Pandora's box of adversarial attacks. Among the different techniques and scenarios, we have chosen the one-step Fast Gradient Sign Method (FGSM) and its iterative extension, Projected Gradient Descent (PGD), as the adversarial sample generation toolkit. As unlikely as it may seem, these adversarial samples, which can fool more than a single model, reveal a weakness of both the model and the classical training procedure. A revisited training algorithm is presented and applied, injecting a fraction of adversarial samples during the training phase. Numerical experiments have been conducted using a specific CNN model for illustration, although our study could be applied to other models (not only CNNs) and in other contexts (not only redshift measurement), as it deals with the complexity of the decision boundary surface.
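To make the two attack generators named in the abstract concrete, here is a minimal sketch of FGSM and its PGD extension, assuming a differentiable PyTorch-style model and loss function. The function names and the parameters epsilon (perturbation budget) and alpha (step size) are illustrative choices, not taken from the paper.

```python
import torch

def fgsm_attack(model, loss_fn, x, y, epsilon):
    # One-step FGSM: move the input by epsilon in the direction of
    # the sign of the loss gradient with respect to the input.
    x_adv = x.clone().detach().requires_grad_(True)
    loss_fn(model(x_adv), y).backward()
    return (x_adv + epsilon * x_adv.grad.sign()).detach()

def pgd_attack(model, loss_fn, x, y, epsilon, alpha, n_steps):
    # Iterative extension (PGD): repeat small signed-gradient steps,
    # projecting back into the L-infinity ball of radius epsilon
    # around the original input after each step.
    x_adv = x.clone().detach()
    for _ in range(n_steps):
        x_adv.requires_grad_(True)
        loss_fn(model(x_adv), y).backward()
        with torch.no_grad():
            x_adv = x_adv + alpha * x_adv.grad.sign()
            x_adv = x + torch.clamp(x_adv - x, -epsilon, epsilon)
    return x_adv.detach()
```

Note that FGSM is the special case of PGD with n_steps=1 and alpha=epsilon; the clamp is the projection step that keeps the accumulated perturbation within the stated budget.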
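Similarly, the revisited training described in the abstract, which injects a fraction of adversarial samples during training, could look like the following sketch. The fraction adv_fraction, the per-batch replacement scheme, and the choice of FGSM as the generator are assumptions made for illustration, not the paper's exact recipe.

```python
def adversarial_training_step(model, loss_fn, optimizer, x, y,
                              epsilon, adv_fraction=0.3):
    # Replace a fraction of the batch with adversarial counterparts
    # generated on the fly, then take a standard gradient step.
    n_adv = int(adv_fraction * x.size(0))
    if n_adv > 0:
        x = x.clone()
        x[:n_adv] = fgsm_attack(model, loss_fn, x[:n_adv], y[:n_adv], epsilon)
    optimizer.zero_grad()  # also clears grads accumulated by the attack
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```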
