Multi-GPU maximum entropy image synthesis for radio astronomy

The maximum entropy method (MEM) is a well-known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user-dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it offers better resolution and image quality under certain conditions. This work presents a high-performance GPU version of non-gridded MEM, which is tested using interferometric and simulated data. We propose a single-GPU and a multi-GPU implementation for single and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that speedups of 1000 to 5000 times over a sequential version can be achieved, depending on data and image size. This has allowed us to reconstruct the HD142527 CO(6-5) short baseline data set in 2.1 minutes, instead of the 2.5 days it takes on CPU.
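To make the optimization problem concrete, the following is a minimal CPU sketch of entropy-regularized least squares solved by projected gradient descent. It is illustrative only, not the paper's GPU implementation: the real-valued measurement operator `A`, data vector `V`, default image `M`, and regularization weight `lam` are all assumptions standing in for the actual (complex, non-gridded) visibility model.

```python
import numpy as np

# Toy MEM objective: f(I) = 0.5 * ||V - A I||^2 + lam * sum(I * log(I / M)),
# where I is the image, V the data, A the measurement operator, and M a
# prior/default image. All quantities are illustrative, real-valued stand-ins.

def mem_objective(I, A, V, M, lam):
    chi2 = 0.5 * np.sum((V - A @ I) ** 2)
    entropy = np.sum(I * np.log(I / M))
    return chi2 + lam * entropy

def mem_gradient(I, A, V, M, lam):
    return -A.T @ (V - A @ I) + lam * (np.log(I / M) + 1.0)

def mem_descent(A, V, M, lam=0.01, step=0.01, iters=200):
    I = M.copy()
    for _ in range(iters):
        # Gradient step, then clip to keep the image positive (MEM constraint)
        I = np.clip(I - step * mem_gradient(I, A, V, M, lam), 1e-8, None)
    return I
```

A production solver would use a better-conditioned method (e.g. conjugate gradient, as is common for MEM) and operate on complex visibilities; the sketch only shows the structure of the data term plus entropy term.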

Comments: 11 pages, 13 figures

Similar Publications

We present a hierarchical probabilistic model for improving geometric stellar distance estimates using color-magnitude information. This is achieved with a data-driven model of the color-magnitude diagram, relying not on stellar models but on the relative abundances of stars in color-magnitude cells, which are inferred from very noisy magnitudes and parallaxes. While the resulting noise-deconvolved color-magnitude diagram can be useful for a range of applications, we focus on deriving improved stellar distance estimates relying on both parallax and photometric information.

Computing the inverse covariance matrix (or precision matrix) of large data vectors is crucial in weak lensing (and multi-probe) analyses of the large scale structure of the universe. Analytically computed covariances are noise-free and hence straightforward to invert; however, the model approximations might be insufficient for the statistical precision of future cosmological data. Estimating covariances from numerical simulations improves on these approximations, but the sample covariance estimator is inherently noisy, which introduces uncertainties in the error bars on cosmological parameters and also additional scatter in their best fit values.

Polarimetric observations of celestial sources in the hard X-ray band stand to provide new information on emission mechanisms and source geometries. PoGO+ is a Compton scattering polarimeter (20-150 keV) optimised for the observation of the Crab (pulsar and wind nebula) and Cygnus X-1 (black hole binary), from a stratospheric balloon-borne platform launched from the Esrange Space Centre in summer 2016. Prior to flight, the response of the polarimeter has been studied with polarised and unpolarised X-rays allowing a Geant4-based simulation model to be validated.

This work employs a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and W$_{H\alpha}$ vs. [NII]/H$\alpha$ (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT datasets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [OIII]/H$\beta$, log [NII]/H$\alpha$, and log EW(H$\alpha$) optical parameters.
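To illustrate the GMM approach in miniature, the sketch below fits a two-component, one-dimensional Gaussian mixture by expectation-maximization; the paper's analysis is three-dimensional and the function here is a simplified stand-in, not its implementation.

```python
import numpy as np

def fit_gmm_1d(x, iters=100):
    """Minimal EM for a two-component 1D Gaussian mixture (illustrative)."""
    mu = np.array([x.min(), x.max()], dtype=float)  # spread the initial means
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each data point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
                 / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n
    return w, mu, var
```

Each data point ends up with a soft class membership (the responsibilities), which is what lets a GMM define empirical galaxy classes rather than hard diagram cuts.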

X-ray and gamma-ray polarimetry is a promising tool to study the geometry and the magnetic configuration of various celestial objects, such as binary black holes or gamma-ray bursts (GRBs). However, statistically significant polarizations have been detected in only a few of the brightest objects. Even though future polarimeters using X-ray telescopes are expected to observe weak persistent sources, there are no effective approaches to surveying transient and serendipitous sources with a wide field of view (FoV).

High-performance adaptive optics systems are rapidly finding applications in astronomy, ophthalmology, and telecommunications. This technology is critical to enabling coronagraphic direct imaging of exoplanets in ground-based telescopes and future space missions such as WFIRST, EXO-C, HabEx, and LUVOIR. We have developed a miniaturized deformable mirror controller to enable active optics on small space imaging missions.

Cosmic rays (CRs) are high-energy particles arriving from space. When one of these particles enters the Earth's atmosphere it produces an air shower, composed of secondary particles among which the initial energy is distributed. The Pierre Auger Observatory, located in Argentina, is dedicated to the study of those events.

This paper considers a new method for the binary asteroid orbit determination problem. The method is based on the Bayesian approach with a global optimisation algorithm. The orbital parameters to be determined are modelled through an a posteriori distribution, including a priori and likelihood terms.

The Community Coordinated Modeling Center (CCMC) at NASA Goddard Space Flight Center is a multi-agency partnership to enable, support and perform research and development for next-generation space science and space weather models. CCMC currently hosts nearly 100 numerical models and a cornerstone of this activity is the Runs on Request (RoR) system which allows anyone to request a model run and analyze/visualize the results via a web browser. CCMC is also active in the education community by organizing student research contests, heliophysics summer schools, and space weather forecaster training for students, government and industry representatives. Read More