
Differentiable strong lensing: Uniting gravity and neural nets through differentiable probabilistic programming

Chianese, M.
2020

Abstract

Since upcoming telescopes will observe thousands of strong lensing systems, creating fully automated analysis pipelines for these images becomes increasingly important. In this work, we take a step in that direction by developing the first end-to-end differentiable strong lensing pipeline. Our approach leverages and combines three important computer science developments: (i) convolutional neural networks (CNNs), (ii) efficient gradient-based sampling techniques, and (iii) deep probabilistic programming languages. The latter automate parameter inference and enable the combination of generative deep neural networks and physics components in a single model. Here, we demonstrate that a CNN trained on galaxy images can be combined, as a source model, with a fully differentiable and exact implementation of gravitational lensing physics in a single probabilistic model. This removes the need for hyperparameter tuning of the source model, enables the simultaneous optimization of nearly 100 source and lens parameters with gradient-based methods, and allows the use of efficient gradient-based posterior sampling techniques. These features make this automated inference pipeline potentially suitable for processing large volumes of data. By analysing mock lensing systems with different signal-to-noise ratios, we show that lensing parameters are reconstructed with per cent-level accuracy. More generally, we consider this work one of the first steps towards establishing differentiable probabilistic programming techniques in the particle astrophysics community, which have the potential to significantly accelerate and improve many complex data analysis tasks.
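
To make the structure of such a model concrete, the minimal sketch below combines a toy neural source model with a differentiable lensing forward model inside a single probabilistic program and runs gradient-based (NUTS) posterior sampling over lens and source parameters. It is written in Pyro as an example of the deep probabilistic programming languages the abstract refers to; the SIS mass model, the untrained toy decoder, and all variable names are illustrative assumptions, not the published implementation.

```python
# Minimal illustrative sketch -- framework choice (Pyro), network architecture,
# and all names below are assumptions, not the paper's actual pipeline.
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

# Image-plane pixel grid (arbitrary angular units), shared by model and data.
n_pix = 32
xs = torch.linspace(-2.0, 2.0, n_pix)
X, Y = torch.meshgrid(xs, xs, indexing="ij")

# Placeholder "neural" source: a tiny decoder mapping a latent vector plus
# source-plane coordinates to surface brightness. In the paper, a network
# trained on galaxy images plays this role; here an untrained toy net keeps
# the sketch self-contained.
latent_dim = 8
decoder = torch.nn.Sequential(
    torch.nn.Linear(latent_dim + 2, 64),
    torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def source_brightness(z, beta_x, beta_y):
    """Evaluate the generative source model at source-plane coordinates."""
    coords = torch.stack([beta_x.flatten(), beta_y.flatten()], dim=-1)
    inp = torch.cat([z.expand(coords.shape[0], -1), coords], dim=-1)
    return decoder(inp).reshape(beta_x.shape)

def lens_model(obs=None, sigma_noise=0.1):
    # Lens parameters: a singular isothermal sphere (SIS) kept simple for
    # illustration; the paper uses a more flexible parametric mass model.
    theta_E = pyro.sample("theta_E", dist.Uniform(0.5, 1.5))
    x0 = pyro.sample("x0", dist.Normal(0.0, 0.2))
    y0 = pyro.sample("y0", dist.Normal(0.0, 0.2))

    # Latent source parameters with a standard normal prior.
    z = pyro.sample("z_src", dist.Normal(torch.zeros(latent_dim), 1.0).to_event(1))

    # Differentiable ray tracing: deflect image-plane rays to the source plane.
    dx, dy = X - x0, Y - y0
    r = torch.sqrt(dx**2 + dy**2) + 1e-6
    beta_x = X - theta_E * dx / r
    beta_y = Y - theta_E * dy / r

    mu = source_brightness(z, beta_x, beta_y)

    # Gaussian pixel-noise likelihood over the whole image.
    return pyro.sample("image", dist.Normal(mu, sigma_noise).to_event(2), obs=obs)

# Draw a mock observation from the prior, then sample the posterior with a
# gradient-based sampler (NUTS) over all lens + source parameters jointly.
mock = lens_model().detach()
mcmc = MCMC(NUTS(lens_model), num_samples=100, warmup_steps=100)
mcmc.run(obs=mock, sigma_noise=0.1)
print(mcmc.get_samples()["theta_E"].shape)
```

Because the entire forward model (neural source plus lensing physics) is differentiable, the sampler can jointly explore all lens and source parameters with gradient information, which is the property the abstract highlights.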
Differentiable strong lensing: Uniting gravity and neural nets through differentiable probabilistic programming / Chianese, M.; Coogan, A.; Hofma, P.; Otten, S.; Weniger, C. - In: MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY. - ISSN 0035-8711. - 496:1(2020), pp. 381-393. [10.1093/mnras/staa1477]
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11588/874771
Citations
  • PMC (PubMed Central): ND
  • Scopus: 24
  • Web of Science (ISI): 22