TEGLIE: Transformer encoders as strong gravitational lens finders in KiDS
Abstract: We apply a state-of-the-art transformer algorithm to 221 deg$^2$ of the Kilo-Degree Survey (KiDS) to search for new strong gravitational lenses (SGL). We test four transformer encoders trained on simulated data from the Strong Lens Finding Challenge on KiDS survey data. The best-performing model is fine-tuned on real images of SGL candidates identified in previous searches. To expand the dataset for fine-tuning, data augmentation techniques are employed, including rotation, flipping, transposition, and white noise injection. The network fine-tuned with rotated, flipped, and transposed images exhibits the best performance and is used to search for SGLs in the overlapping region of the Galaxy And Mass Assembly (GAMA) and KiDS surveys, targeting galaxies up to $z = 0.8$. Candidate SGLs are matched with those from other surveys and examined using GAMA data to identify blended spectra arising from the signal of multiple objects falling within a single fiber. We find that fine-tuning the transformer encoder on KiDS data reduces the number of false positives by 70%. Additionally, applying the fine-tuned model to a sample of $\sim$ 5,000,000 galaxies yields a list of $\sim$ 51,000 SGL candidates. Upon visual inspection, this list is narrowed down to 231 candidates. Combined with the SGL candidates identified during model testing, our final sample comprises 264 candidates, including 71 high-confidence SGLs, of which 44 are new discoveries. We propose fine-tuning on real augmented images as a viable approach to mitigating false positives when transitioning from simulated lenses to real surveys. Additionally, we provide a list of 121 false positives that exhibit features similar to lensed objects, which can benefit the training of future machine learning models in this field.
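The augmentations named in the abstract (rotation, flipping, transposition, and white noise injection) can be sketched as below. This is a minimal illustration with NumPy, not the authors' code; the function name, noise amplitude, and cutout size are assumptions for the example.

```python
import numpy as np

def augment(image, rng, noise_scale=0.05):
    """Return augmented copies of a 2D survey cutout using the
    transformations described in the abstract. The noise_scale
    (fraction of the image std) is an illustrative choice."""
    variants = [np.rot90(image, k) for k in range(4)]   # 0/90/180/270 degree rotations
    variants += [np.fliplr(image), np.flipud(image)]    # horizontal and vertical flips
    variants.append(image.T)                            # transposition
    # White (Gaussian) noise injection on top of the original image
    noisy = image + rng.normal(0.0, noise_scale * image.std(), image.shape)
    variants.append(noisy)
    return variants

rng = np.random.default_rng(0)
cutout = rng.random((101, 101))  # hypothetical KiDS-like square cutout
augmented = augment(cutout, rng)
print(len(augmented))  # 8 augmented variants per input image
```

Note that for square cutouts all eight variants keep the original shape, so they can be fed to the same network input without resizing.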