The observed Lyman-α flux power spectrum (FPS) is suppressed on scales below ∼30 km s⁻¹. This cut-off could be due to the high temperature, T0, and pressure, p0, of the absorbing gas or, alternatively, it could reflect the free streaming of dark matter particles in the early universe. We perform a set of very high resolution cosmological hydrodynamic simulations in which we vary T0, p0, and the amplitude of the dark matter free streaming, and compare the FPS of mock spectra to the data. We show that the location of the dark matter free-streaming cut-off scales differently with redshift than the cut-off produced by thermal effects and is more pronounced at higher redshift. We therefore focus on a comparison to the observed FPS at z > 5. We demonstrate that the FPS cut-off can be fit assuming cold dark matter, but it can be equally well fit assuming that the dark matter consists of ∼7 keV sterile neutrinos, in which case the cut-off is due primarily to the dark matter free streaming.
Context. The direct detection and characterization of planetary and substellar companions at small angular separations is a rapidly advancing field. Dedicated high-contrast imaging instruments deliver unprecedented sensitivity, enabling detailed insights into the atmospheres of young low-mass companions. In addition, improvements in data reduction and point spread function (PSF) subtraction algorithms are equally relevant for maximizing the scientific yield, both from new and archival data sets. Aims. We aim to develop a generic and modular data-reduction pipeline for the processing and analysis of high-contrast imaging data obtained with pupil-stabilized observations. The package should be scalable and robust for future implementations and particularly suitable for the 3–5 μm wavelength range, where typically thousands of frames have to be processed and an accurate subtraction of the thermal background emission is critical. Methods. PynPoint is written in Python 2.7 and applies various image-processing techniques, as well as statistical tools for analyzing the data, building on open-source Python packages. The current version of PynPoint has evolved from an earlier version that was developed as a PSF-subtraction tool based on principal component analysis (PCA). Results. The architecture of PynPoint has been redesigned, with the core functionalities decoupled from the pipeline modules. Modules have been implemented for dedicated processing and analysis steps, including background subtraction, frame registration, PSF subtraction, photometric and astrometric measurements, and estimation of detection limits. The pipeline package enables end-to-end data reduction of pupil-stabilized data and supports classical dithering and coronagraphic data sets. As an example, we processed archival VLT/NACO L′ and M′ data of β Pic b and reassessed the brightness and position of the planet with a Markov chain Monte Carlo analysis; we also provide a derivation of the photometric error budget.
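The PCA-based PSF subtraction at the heart of PynPoint can be illustrated with a short NumPy sketch. This is a generic outline of the technique, not PynPoint's actual implementation; the function name and interface are invented for illustration:

```python
import numpy as np

def pca_psf_subtraction(images, n_components):
    """Subtract a low-rank PCA model of the stellar PSF from each frame.

    images: 3-D stack of shape (n_frames, ny, nx).
    n_components: number of principal components in the PSF model.
    Generic illustration only, not PynPoint's actual code.
    """
    n_frames = images.shape[0]
    flat = images.reshape(n_frames, -1)
    mean = flat.mean(axis=0)
    centered = flat - mean
    # Principal components from the SVD of the mean-subtracted stack.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    # Project each frame onto the basis and subtract the reconstruction.
    model = centered @ basis.T @ basis
    residuals = centered - model
    return residuals.reshape(images.shape)
```

In pupil-stabilized (ADI) observations, the residual frames would subsequently be derotated to a common field orientation and combined, so that the quasi-static PSF averages out while a real companion adds up coherently.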
Qian, X.; Cai, X.; Portegies Zwart, S.F.; Zhu, M. 2017
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive sets of simulations. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing a crop. With the development of SiMon we relax the technical burden of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
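The restart-on-interrupt idea behind automated simulation farming can be sketched as a small process-monitoring loop. This is a hypothetical illustration, not SiMon's code; SiMon itself adds daemon operation, tree-structured simulation management, and multi-platform support:

```python
import subprocess
import time

def farm(commands, max_restarts=3, poll_interval=1.0):
    """Run several simulation commands concurrently and restart any that
    exit with an error, up to max_restarts attempts each. A minimal
    sketch of the farming idea only; SiMon is far more general."""
    jobs = [{"cmd": cmd, "proc": subprocess.Popen(cmd), "restarts": 0}
            for cmd in commands]
    while True:
        running = False
        for job in jobs:
            code = job["proc"].poll()
            if code is None:
                running = True                        # still "growing"
            elif code != 0 and job["restarts"] < max_restarts:
                job["restarts"] += 1                  # interrupted: replant
                job["proc"] = subprocess.Popen(job["cmd"])
                running = True
        if not running:
            break
        time.sleep(poll_interval)
    return {tuple(j["cmd"]): j["proc"].returncode for j in jobs}
```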
Hahn, C.; Vakili, M.; Walsh, K.; Hearin, A.P.; Hogg, D.W.; Campbell, D. 2017
Astrophysical research in recent decades has made significant progress thanks to the availability of various N-body simulation techniques. With the rapid development of high-performance computing technologies, modern simulations have been able to use the computing power of massively parallel clusters with more than 10⁵ GPU cores. While unprecedented accuracy and dynamical scales have been achieved, the enormous amount of data being generated continuously poses great challenges for the subsequent procedures of data analysis and archiving. In this paper, we propose an adaptive storage scheme for simulation data, inspired by the block time step (BTS) integration scheme found in a number of direct N-body integrators available nowadays, as an urgent response to these challenges. The proposed BTS storage scheme minimizes data redundancy by assigning individual output frequencies to the data, as required by the researcher. As demonstrated by benchmarks, the proposed scheme is applicable to a wide variety of simulations. Although the main focus is a solution for direct N-body simulation data, the methodology is transferable to grid-based or tree-based simulations where hierarchical time stepping is used.
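The core idea of assigning individual output frequencies can be sketched in a few lines, under the assumption that a dataset at output level k is written every 2^k integration steps (a toy illustration; the names and interface are invented, not the authors' implementation):

```python
def bts_output_schedule(step, output_levels):
    """Return the indices of the datasets to write at a given step.

    A dataset with output level k is written every 2**k steps, mirroring
    the power-of-two hierarchy of block time steps. Toy sketch only.
    """
    return [i for i, k in enumerate(output_levels) if step % (1 << k) == 0]

def total_writes(n_steps, output_levels):
    """Count records written over n_steps; uniform output at every step
    would instead cost n_steps * len(output_levels) records."""
    return sum(len(bts_output_schedule(s, output_levels))
               for s in range(n_steps))
```

For example, three datasets with levels [0, 1, 2] written over 8 steps cost 8 + 4 + 2 = 14 records instead of 24, and the saving grows as slowly evolving data are pushed to higher levels.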
Mandelbaum, R.; Rowe, B.; Bosch, J.; Chang, C.; Courbin, F.; Gill, M.; ... ; Schrabback, T. 2014