Abstract

The study of strong gravitational lenses is a relatively young field in astronomy with many applications in cosmology. Its unique observables allow astronomers to trace dark matter, measure the expansion of the universe and study galaxy evolution. While strong lenses are relatively rare phenomena, recent and upcoming improvements in observational instrumentation are expected to lead to a sharp increase in the number and quality of their observations. This increase, however, brings an observational challenge: the sheer quantity of data in which the lenses are hidden. Current classification and modelling techniques are simply not capable of handling the new data volume in a reasonable timeframe. This thesis therefore concerns itself with the development and improvement of the numerical methods needed to find and analyse these newly discovered lenses, borrowing from deep learning and high-performance computing. Deep learning, a subfield of machine learning, has shown significant success in finding lenses in simulations. Performing significantly better than any other classification method, CNN-based lens finders come close to the classification efficiency and accuracy required for a fully automated Euclid strong-lensing pipeline. Their dependence on a varied and complete training set, and the difficulty of creating one, nevertheless complicates their immediate application to new surveys. Incorporating high-performance computing techniques and more concurrent algorithm design into lens analysis yields crucial speed-ups. Applied to Lenstool, a popular mass-modelling tool for gravitational lenses, the resulting software can run on modern supercomputers, using Graphics Processing Units (GPUs) and CPUs simultaneously. Running it on state-of-the-art hardware reduces computation time to an acceptable level when mass modelling complex cluster lenses.