
GT ICR - Jonathan KERN

Efficient Learned Deconvolution of Radio Interferometric Images Using Deep Unrolling

François Orieux

March 17, 2025

Transcript

  1. Efficient Learned Deconvolution of Radio Interferometric Images Using Deep Unrolling
    Jonathan Kern, Jérôme Bobin, Tobias Liaudat, Christophe Kervazo. GT ICR SKA, March 14, 2025.
  2. Outline: 1. Introduction, 2. Algorithm unrolling for image reconstruction, 3. CARB: Efficient uncertainty quantification, 4. Conclusion.
  3. Towards large-scale radio data: New, very large radio-interferometric observatories are being constructed, such as SKA. SKA-Mid: 197 planned dishes. SKA-Low: 131,072 planned antennas. A massive data-processing challenge: 20 terabits per second of data generated, 1,000× the data rate of ALMA.
  4. Radio-interferometry model: y = Ax + n, where y are the visibilities, x is the sky image, A is the forward operator ((de)gridding and Fourier transform), and n is white Gaussian noise.
    [Figure: dirty image = true image ∗ PSF; observed visibilities = UV coverage ⊙ visibilities; the image and visibility domains are related by the Fourier transform F and its inverse.]
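    For intuition, this measurement model can be mimicked with a toy forward operator: a 2-D FFT of the sky sampled on a binary UV mask, plus complex Gaussian noise, with the dirty image obtained through the adjoint. This is only an illustrative sketch (a masked FFT standing in for the actual (de)gridding operator); all names below are placeholders.

```python
import numpy as np

def forward(x, uv_mask, sigma=0.0, rng=None):
    """Toy forward operator A: sample the 2-D FFT of the sky image on the UV coverage,
    optionally adding complex Gaussian noise."""
    vis = np.fft.fft2(x)[uv_mask]
    if sigma > 0:
        rng = rng or np.random.default_rng()
        vis = vis + sigma * (rng.standard_normal(vis.shape)
                             + 1j * rng.standard_normal(vis.shape))
    return vis

def adjoint(y, uv_mask):
    """Adjoint A* (up to FFT normalization): regrid the visibilities and inverse FFT,
    which yields the dirty image when applied to the data."""
    grid = np.zeros(uv_mask.shape, dtype=complex)
    grid[uv_mask] = y
    return np.real(np.fft.ifft2(grid))

# Example: a 64x64 sky with two point sources and ~30% UV coverage.
x_true = np.zeros((64, 64)); x_true[10, 20] = 1.0; x_true[40, 45] = 0.5
uv_mask = np.random.default_rng(1).random((64, 64)) < 0.3
dirty = adjoint(forward(x_true, uv_mask, sigma=0.01), uv_mask)
```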
  5. Image reconstruction problem: We aim to reconstruct the true image x. Minimization problem: argmin_x ||y − Ax||_2^2, with y the visibilities and A the forward operator.
  6. Image reconstruction problem: We aim to reconstruct the true image x. Minimization problem: argmin_x ||y − Ax||_2^2, with y the visibilities and A the forward operator. This is an ill-posed problem, so we need to add a constraint on x: argmin_x ||y − Ax||_2^2 + R(x). Different types of constraints are possible: sparsity (in various domains), learned regularizers, ...
  7. Existing methods: CLEAN algorithm (Högbom, 1974): mostly used on point sources; detects one peak in the image at a time. Compressive sensing methods (Wiaux, 2009): iterative algorithms using proximal operators to handle the regularization term, i.e., with R(x) = ||x||_1:
    z_{t+1} = x_t + 2α A^*(y − A x_t)   (forward step)
    x_{t+1} = ST_{λα}(z_{t+1})   (backward step)
    where α is the gradient step and λ the threshold.
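    A minimal sketch of this forward/backward (ISTA-style) scheme with the ℓ1 prior, written against generic A and A* callables (for instance the toy `forward`/`adjoint` above); α and λ are hand-picked here, not tuned values from the talk.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(y, A, A_adj, shape, alpha=0.5, lam=0.01, n_iter=200):
    """Alternate the forward (gradient) and backward (proximal) steps of the slide."""
    x = np.zeros(shape)
    for _ in range(n_iter):
        z = x + 2 * alpha * A_adj(y - A(x))   # forward step on the data term
        x = soft_threshold(z, lam * alpha)    # backward step: ST with threshold lam * alpha
    return x

# Usage with the toy operators sketched earlier (hypothetical names):
# x_rec = ista(y_obs, lambda im: forward(im, uv_mask), lambda v: adjoint(v, uv_mask), (64, 64))
```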
  8. Existing methods with learned components: PnP (Plug-and-Play, Terris et al., 2022): the prox operator can be replaced by a pretrained DNN, x_{t+1} = D(x_t + 2α A^*(y − A x_t)). End-to-end DNN (Schuler, 2013) / learned post-processing: apply a DNN to the dirty image.
    ▪ A lot faster than iterative algorithms
    ▪ Requires a lot of training data
    ▪ Not model-based → worse performance
  9. Existing methods with learned components: PnP (Plug-and-Play, Terris et al., 2022): the prox operator can be replaced by a pretrained DNN, x_{t+1} = D(x_t + 2α A^*(y − A x_t)). End-to-end DNN (Schuler, 2013) / learned post-processing: apply a DNN to the dirty image.
    ▪ A lot faster than iterative algorithms
    ▪ Requires a lot of training data
    ▪ Not model-based → worse performance
    We need a method as fast as DNNs but that takes the physical reconstruction model into account for best performance.
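    A hedged sketch of the PnP iteration above: the proximal operator is simply replaced by a call to a pretrained denoiser. Here `denoiser` is a placeholder callable standing in for a pretrained DNN (e.g. a DRUNet loaded from an existing framework), not the exact model of the talk.

```python
import numpy as np

def pnp_pgd(y, A, A_adj, denoiser, shape, alpha=0.5, n_iter=50):
    """Plug-and-Play PGD: x_{t+1} = D(x_t + 2*alpha*A*(y - A x_t))."""
    x = np.zeros(shape)
    for _ in range(n_iter):
        x = denoiser(x + 2 * alpha * A_adj(y - A(x)))
    return x

# With an identity "denoiser" this reduces to plain gradient descent on the data term.
identity_denoiser = lambda z: z
```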
  10. Outline: 1. Introduction, 2. Algorithm unrolling for image reconstruction, 3. CARB: Efficient uncertainty quantification, 4. Conclusion.
  11. Algorithm unrolling: Unrolling (Gregor, 2011): we unroll an iterative algorithm for L iterations and convert it into an L-layer DNN. One layer corresponds to one iteration.
  12. Algorithm unrolling for RI: Proximal Gradient Descent (PGD):
    x_{l+1} = prox(x_l + 2α A^*(y − A x_l)),
    repeated until convergence or until the maximum number of iterations is reached; α is the gradient step.
  13. Algorithm unrolling for RI: Proximal Gradient Descent (PGD):
    x_{l+1} = prox(x_l + 2α A^*(y − A x_l)),
    repeated until convergence or until the maximum number of iterations is reached; α is the gradient step. We unroll the PGD algorithm to obtain a Learned PGD:
    x_{l+1} = g_l(x_l + w_l ∗ A^*(y − A x_l)),   for l from 1 to L,
    where g_l is a trainable or pre-trained operator and w_l a trainable scalar or convolution kernel.
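    A minimal PyTorch sketch of one layer of such a Learned PGD, with a trainable kernel w_l and a small trainable network g_l; the operators A and A* are passed in as callables acting on image tensors. This is an illustrative skeleton under those assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class UnrolledLayer(nn.Module):
    """One layer ~ one PGD iteration: x <- g_l(x + w_l * A*(y - A x))."""
    def __init__(self, channels=1):
        super().__init__()
        self.w = nn.Conv2d(channels, channels, kernel_size=5, padding=2, bias=False)  # trainable w_l
        self.g = nn.Sequential(                                                       # small trainable g_l
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x, y, A, A_adj):
        grad = A_adj(y - A(x))           # data-fidelity residual brought back to image space
        return self.g(x + self.w(grad))  # learned filtering of the step, then learned prox

class UnrolledNet(nn.Module):
    """Stack of L layers, i.e. the unrolled algorithm."""
    def __init__(self, n_layers=10):
        super().__init__()
        self.layers = nn.ModuleList(UnrolledLayer() for _ in range(n_layers))

    def forward(self, y, A, A_adj):
        x = A_adj(y)                     # start from the dirty image
        for layer in self.layers:
            x = layer(x, y, A, A_adj)
        return x
```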
  14. Algorithm unrolling for RI: We can also add an inertial term, as in FISTA:
    z_{l+1} = g_l(x_l + w_l ∗ A^*(y − A x_l))
    x_{l+1} = z_{l+1} + β_l (z_{l+1} − z_l)
    where g_l is a trainable or pre-trained operator, w_l a trainable convolution kernel, and β_l a trainable scalar.
  15. Algorithm unrolling for RI: We can also add an inertial term, as in FISTA:
    z_{l+1} = g_l(x_l + w_l ∗ A^*(y − A x_l))
    x_{l+1} = z_{l+1} + β_l (z_{l+1} − z_l)
    where g_l is a trainable or pre-trained operator, w_l a trainable convolution kernel, and β_l a trainable scalar. Loss function:
    L = (1 / N_train) Σ_{i=1}^{N_train} ||x_i^true − x_i^predicted||_2^2 / ||x_i^true||_2^2
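    The loss above is a relative squared error averaged over the training set; a direct PyTorch transcription might look like the following, with the batch dimension standing in for the sum over training images.

```python
def relative_mse_loss(x_pred, x_true, eps=1e-12):
    """Mean over the batch of ||x_true - x_pred||_2^2 / ||x_true||_2^2."""
    num = ((x_true - x_pred) ** 2).flatten(1).sum(dim=1)   # per-image squared error
    den = (x_true ** 2).flatten(1).sum(dim=1) + eps        # per-image normalization
    return (num / den).mean()
```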
  16. Our unrolled DNN: To work directly with visibilities we use a non-uniform FFT (NUFFT): Kaiser-Bessel gridding + FFT.
    [Diagram: one layer of our unrolled DNN, built around a denoiser block.]
    Which denoiser should we choose?
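    For intuition only, the non-uniform sampling can be emulated with a naive direct non-uniform DFT, evaluated frequency by frequency; the layer described on this slide instead uses Kaiser-Bessel gridding followed by an FFT, which is far cheaper. The code below is a conceptual stand-in, not the NUFFT used in the talk.

```python
import numpy as np

def direct_nudft2(image, u, v):
    """Naive type-2 non-uniform DFT: evaluate the Fourier transform of `image` at
    arbitrary (u, v) frequencies given in cycles across the image. Cost O(N_pix * N_vis)."""
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    vis = np.empty(len(u), dtype=complex)
    for k in range(len(u)):
        phase = np.exp(-2j * np.pi * (u[k] * xx / nx + v[k] * yy / ny))
        vis[k] = np.sum(image * phase)
    return vis
```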
  17. Different kinds of regularization operators: For point sources, R(x) = ||x||_1. Soft thresholding:
    g_l(x) = 0 if |x| ≤ γ_l,   sign(x)(|x| − γ_l) otherwise,
    where γ_l is a trainable parameter.
  18. Different kinds of regularization operators: For point sources, R(x) = ||x||_1. Soft thresholding:
    g_l(x) = 0 if |x| ≤ γ_l,   sign(x)(|x| − γ_l) otherwise,
    where γ_l is a trainable parameter. For more complex sources, R(x) = Σ_i ||Ψ_i x||_1: we seek sparsity in a dictionary of wavelet bases (for example the first eight Daubechies wavelets). Wavelet denoiser:
    g_l(x) = argmin_u ||u − x||_2^2 + λ_l Σ_i ||Ψ_i u||_1,
    where λ_l is the learned regularization/thresholding parameter.
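    A hedged sketch of a wavelet soft-thresholding denoiser using PyWavelets, with a single Daubechies basis rather than the dictionary of eight bases mentioned on the slide; the threshold value is a placeholder, not the learned λ_l.

```python
import pywt

def wavelet_denoise(x, lam=0.01, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition and reconstruct.
    For a single orthogonal basis this is the prox of lam * ||Psi x||_1; with a dictionary of
    bases (as on the slide) the prox has no closed form and must be approximated."""
    coeffs = pywt.wavedec2(x, wavelet, level=level)
    thresholded = [coeffs[0]] + [
        tuple(pywt.threshold(c, lam, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(thresholded, wavelet)
```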
  19. Different kinds of regularization operators: For more complex images, g can be replaced by a pretrained deep denoiser D, for example a UNet. PnP prior: g_l(x) = D(x, σ), where σ is a learned noise-level parameter.
  20. Experimental setup: Two types of data: (i) simulated point sources, randomly generated with 1% of sources following an absolute Gaussian distribution for each image; (ii) a galaxy simulation. Simulated UV coverage based on the MeerKAT antennas, with a 1 h observation time.
  21. Simulated point sources: [Figure: true images, reconstructions, and absolute-error maps for three examples.] Dirty images generated with SNR = 15 dB.
  22. Realistic astronomical images: We compare several reconstruction methods:
    ▪ FISTA-like with a wavelet prior: z_{l+1} = g(x_l + 2α A^*(y − A x_l)), x_{l+1} = z_{l+1} + β_l (z_{l+1} − z_l)
    ▪ PnP: PGD with a DRUNet denoiser as prior: x_{l+1} = D(x_l + 2α A^*(y − A x_l))
    ▪ End-to-end DNN (Restormer) applied to the dirty image A^* y
    ▪ Unrolled FISTA with a DRUNet denoiser: z_{l+1} = D(x_l + w ∗ A^*(y − A x_l)), x_{l+1} = z_{l+1} + β_l (z_{l+1} − z_l)
  23. Realistic astronomical images: [Figure: true image, dirty image, UV coverage, and reconstructions.] PSNR: FISTA wavelet 42.5 dB, PnP 44.0 dB, Restormer 27.4 dB, Unrolling 44.1 dB.
  24. Realistic astronomical images: [Figure: true image, dirty image, UV coverage, and reconstructions.] PSNR: FISTA wavelet 43.0 dB, PnP 46.8 dB, Restormer 25.5 dB, Unrolling 46.5 dB.
  25. Galaxy simulations, importance of the prior: [Figure: PSNR (dB) comparison, roughly in the 30-50 dB range, for FISTA wavelet, PGD DRUNet, Unrolling Wavelet, and Unrolling DRUNet.]
  26. Faster reconstruction, algorithm unrolling: Unrolling results:
    ▪ Greatly reduced computation budget (10× fewer iterations)
    ▪ Great reconstruction quality
  27. Faster reconstruction, algorithm unrolling: Unrolling results:
    ▪ Greatly reduced computation budget (10× fewer iterations)
    ▪ Great reconstruction quality
    Unrolling drawbacks:
    ▪ We lose the interpretation of the reconstruction
    ▪ Is it the fixed point of an equation?
    ▪ Is the reconstruction related to a posterior probability distribution?
    ▪ UQ is missing
    These drawbacks limit its scientific application.
  28. Outline: 1. Introduction, 2. Algorithm unrolling for image reconstruction, 3. CARB: Efficient uncertainty quantification, 4. Conclusion.
  29. CARB: Conformalized Augmented Radio Bootstrap. Based on the equivariant bootstrap framework of Tachella & Pereyra (2024). Given an observation model y = Ax + n (e.g. RI imaging), group actions {T_g}_{g∈G} such that T_g x ∈ X, and a reconstruction method x̂(y) = f(y) (e.g. our unrolling algorithm), for i = 1, ..., N:
    1. Draw a transform g_i from G and sample noise n_i ~ N(0, σ²I)
    2. Build the bootstrap measurement ỹ_i = A T_{g_i} x̂(y) + n_i
    3. Reconstruct x̃_i = T_{g_i}^{-1} x̂(ỹ_i)
    4. Collect the error estimate e_i = ||x̂(y) − x̃_i||_2
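    A compact sketch of this bootstrap loop. Here `A` is a (noiseless) forward operator, `reconstruct` stands for the fast reconstruction x̂(y) (e.g. the unrolled network), and `random_group_action` is a placeholder returning a transform T_g together with its inverse; all of these names are assumptions for illustration, not the talk's implementation.

```python
import numpy as np

def equivariant_bootstrap(y, A, reconstruct, random_group_action,
                          sigma=0.01, n_samples=100, seed=0):
    """Equivariant bootstrap (Tachella & Pereyra, 2024) around a fast reconstruction.
    A(x) -> noiseless visibilities; reconstruct(y) -> image estimate;
    random_group_action(rng) -> (T, T_inv) pair of callables acting on images."""
    rng = np.random.default_rng(seed)
    x_hat = reconstruct(y)                              # reference reconstruction x_hat(y)
    samples, errors = [], []
    for _ in range(n_samples):
        T, T_inv = random_group_action(rng)             # 1. draw a transform g_i from G
        vis = A(T(x_hat))
        noise = sigma * (rng.standard_normal(vis.shape)
                         + 1j * rng.standard_normal(vis.shape))
        y_tilde = vis + noise                           # 2. bootstrap measurement A T_g x_hat(y) + n_i
        x_tilde = T_inv(reconstruct(y_tilde))           # 3. reconstruct and map back with T_g^{-1}
        samples.append(x_tilde)
        errors.append(np.linalg.norm(x_hat - x_tilde))  # 4. error estimate e_i
    return x_hat, np.stack(samples), np.array(errors)
```

    A minimal `random_group_action` could, for instance, return a random composition of image flips and 90° rotations together with the inverse composition.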
  30. CARB: Conformalized Augmented Radio Bootstrap. Motivation:
    ▪ Unsupervised method → no ground truth required
    ▪ Independent of the reconstruction method, and each sample can run in parallel
    ▪ Well suited to ultra-fast reconstruction methods, e.g. unrolled algorithms
    ▪ Carefully selected group transforms allow us to explore the large null space of the RI imaging forward operator and better characterise the errors
    The CARB method consists of:
    1. A fast reconstruction algorithm
    2. The equivariant bootstrap framework
    3. Group actions adapted to the RI imaging problem
    4. A conformalisation procedure to guarantee coverage
  31. CARB: Conformalized Augmented Radio Bootstrap. The group actions we consider are:
    1. Circular-shift translations not exceeding 2 pixels,
    2. Image flips over the horizontal and vertical axes,
    3. Rotations by multiples of 90 degrees,
    4. Invertible 2D radially-symmetric filters, in the specific form of low-shelving and high-shelving filters with varying cutoff frequencies.
    [Figure: example filter transformations, amplitude vs. frequency, for low-shelving and high-shelving filters.]
    Each time, we apply a random composition of these transformations, where each transformation is applied with a given probability.
    Pixel-wise UQ maps: from the collection of N bootstrap samples {x̃_i}_{i=1}^N, we build confidence regions C_α for x (the ground truth) using q_α, the top α-quantile of the samples {|x̂(y) − x̃_i|}_{i=1}^N, with C_α = {x : |x − x̂(y)| < q_α}.
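    A minimal way to turn the bootstrap samples into the pixel-wise confidence maps described above: take the top α-quantile, per pixel, of the absolute deviations |x̂(y) − x̃_i|. The conformal calibration of the final quantile is omitted in this sketch.

```python
import numpy as np

def pixelwise_uq_map(x_hat, samples, alpha=0.1):
    """Pixel-wise radius q_alpha such that C_alpha = {x : |x - x_hat| < q_alpha}.
    `samples` has shape (N, H, W); returns an (H, W) map."""
    deviations = np.abs(samples - x_hat[None, ...])      # |x_hat(y) - x_tilde_i| per pixel
    return np.quantile(deviations, 1.0 - alpha, axis=0)  # top alpha-quantile over the N samples
```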
  32. CARB results: Uncertainty quantification performance comparison (90% confidence interval):
    Method                       Interval length   Coverage
    Quantile Regression (QR)                0.15        14%
    Conformalized QR                      204.08        92%
    Parametric Bootstrap                    0.07         0%
    Equivariant Bootstrap                   0.13         7%
    Augmented Radio Bootstrap               0.29        87%
    CARB                                    0.34        91%
    [Figure: empirical coverage vs. confidence level for equivariant bootstrap methods with different group actions (parametric; rotations; rotations and flips; rotations, flips and 1-, 2- or 3-pixel shifts; Augmented Radio Bootstrap; ideal).]
    Tight intervals and very good coverage! The results showcase the importance of selecting adapted group actions, and that conformalisation is useful once the intervals are already good. We still need to validate the method in higher dimensions.
  33. CARB: Qualitative UQ comparison. [Figure only.]
  34. Outline: 1. Introduction, 2. Algorithm unrolling for image reconstruction, 3. CARB: Efficient uncertainty quantification, 4. Conclusion.
  35. Conclusion and perspectives. Conclusion:
    ▪ An efficient reconstruction method for radio-interferometric images using algorithm unrolling
    ▪ Faster than current iterative methods (around 10 iterations)
    ▪ Capable of working directly with visibilities
    ▪ Compatible with an uncertainty quantification method
    Perspectives:
    ▪ Apply the unrolling method and UQ to larger, realistic images
    ▪ Improve the network design for better performance and adaptability