
Gravitational-wave astrophysics

Listen to the universe through ripples of spacetime

On September 14, 2015, a gravitational wave was detected for the first time in human history. In the years that followed, there has been no shortage of exciting discoveries in gravitational-wave astrophysics: a binary neutron star merger observed through both the gravitational-wave and electromagnetic channels (GW170817), the confirmation of the existence of intermediate-mass black holes (GW190521), and a catalog of over 90 compact-object mergers by the end of the third observing run of the LIGO-Virgo-KAGRA collaboration. The field of gravitational-wave astrophysics has entered a blooming era filled with all sorts of exciting science.

In the coming decade (counting from 2023), next-generation ground-based detectors such as Cosmic Explorer and the Einstein Telescope, as well as space-based detectors like LISA, will come online. The ground-based detector network will detect essentially all stellar-mass gravitational-wave events in the universe, such as binary black hole and binary neutron star mergers; we expect up to a million events per year. The sheer number of events and their high signal-to-noise ratios will pose new challenges to our data analysis pipelines. The same goes for LISA, which will open up a completely different part of the gravitational-wave spectrum in the millihertz band. Supermassive black hole mergers, white dwarf binaries in our galaxy, and extreme mass ratio inspirals are among the sources LISA is expected to observe, and we have seen none of them through gravitational waves yet.

While studying physics at a theoretical level is what I enjoy on the weekends, my strength and interest in research lie in solving data analysis problems and developing computational methods. Below are a number of topics I am interested in and working on at the moment.


Rapid parameter estimation

One of the most common tasks in gravitational-wave astronomy is to estimate the source properties given a stream of GW data. This used to be a computationally intensive process that caused many mental health problems within the community: a typical run can take tens to hundreds of hours. If you misconfigure your run and only find out after it finishes, congratulations, you can go home and come back in a few days.

In recent years, a number of groups, including mine, have been trying to speed up the process through different means: faster likelihood evaluation (e.g. the heterodyned likelihood), smart reparameterization (look up Javier Roulet or Soichiro Morisaki), and simulation-based inference (Max Dax). There are different rationales behind each approach, and more often than not they are complementary to each other. Most of these codes can achieve standard parameter estimation on a minutes scale.
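To give a flavor of the heterodyned (also called relative binning) idea: the ratio of a candidate waveform to a fiducial waveform near the true parameters is smooth in frequency, so likelihood inner products can be approximated from per-bin summary data instead of the full frequency grid. Below is a minimal toy sketch in plain NumPy; the waveform model, bin count, and parameter values are all made up for illustration and are not any production pipeline's actual setup.

```python
import numpy as np

# Toy setup: a fine frequency grid and a smooth stand-in "waveform" model.
f = np.linspace(20.0, 1024.0, 100_000)

def waveform(freqs, phase_coeff):
    # Hypothetical frequency-domain waveform: pure phase evolution.
    return np.exp(1j * phase_coeff * freqs ** (-5.0 / 3.0))

h0 = waveform(f, 100.0)        # fiducial waveform near the true parameters
data = h0                      # pretend the data contain exactly the fiducial signal
psd = np.ones_like(f)          # flat noise power spectral density (toy)

# Precompute per-bin summary data against the fiducial waveform, once.
edges = np.linspace(f[0], f[-1], 65)            # 64 coarse bins
idx = np.searchsorted(f, edges[1:-1])
starts = np.r_[0, idx]
stops = np.r_[idx, len(f)]
A = np.array([np.sum((data * np.conj(h0) / psd)[a:b])
              for a, b in zip(starts, stops)])  # <d, h0> contribution per bin
centers = 0.5 * (edges[:-1] + edges[1:])

def heterodyned_inner_product(phase_coeff):
    # The ratio h(theta)/h0 is smooth, so evaluate it only at bin centers;
    # this approximates Re <d, h(theta)> with 64 waveform evaluations
    # instead of 100,000.
    r = waveform(centers, phase_coeff) / waveform(centers, 100.0)
    return np.real(np.sum(A * np.conj(r)))
```

At the fiducial parameters the ratio is exactly one, so the approximation reproduces the full-grid inner product; for nearby parameters it stays accurate as long as the ratio varies slowly across each bin, which is what makes the per-likelihood cost nearly independent of the fine grid size.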

We took a stab at the problem as well, and you can find details about the method and implementation in [1,2].

We know the community often wants to add new effects to the waveform and see what the data can say about some non-GR effect. The main design choice we made when building our code is that it should be capable of solving higher-dimensional problems without requiring another graduate student to suffer for a couple of years. The entire workflow of implementing and inferring a non-standard effect should be basically the same as doing standard parameter estimation, and so should the performance. For example, if one wants to introduce extra parameters related to gravitational lensing or a Doppler shift into the inference problem, that person should spend most of their time working on the model instead of figuring out the sampling part. You do the modelling, we do the sampling (well, the GPU does). So if you have to remember one unique strength of our Jim code, it is extensibility.
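As a concrete, entirely hypothetical illustration of that workflow: the user wraps an existing waveform model with the extra physics and hands the sampler the same kind of callable plus a parameter dictionary, so nothing about the sampling machinery changes. The function and parameter names below are invented for this sketch and are not Jim's actual API.

```python
import numpy as np

def base_waveform(freqs, params):
    # Stand-in for a standard frequency-domain waveform model (toy).
    return np.exp(1j * params["phase_coeff"] * freqs ** (-5.0 / 3.0))

def doppler_waveform(freqs, params):
    # One extra parameter, "v": a constant line-of-sight velocity (in units
    # of c) that Doppler-shifts the observed frequencies. Same call
    # signature as the base model, so a generic sampler can use either
    # model interchangeably.
    shifted_freqs = freqs * (1.0 + params["v"])
    return base_waveform(shifted_freqs, params)

# The inference code only ever sees a callable and a parameter dict;
# adding "v" to the prior is the only other change the user would make.
freqs = np.linspace(20.0, 1024.0, 1000)
h = doppler_waveform(freqs, {"phase_coeff": 100.0, "v": 1e-4})
```

The point of the design is that the sampler is agnostic to what is inside the callable: adding a lensing or Doppler parameter grows the parameter dictionary, not the sampling code.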




Gravitational-wave census

