Most matter in the Universe is “dark,” meaning that it does not interact with us through the usual electromagnetic interaction. In fact, most of the mass in our own galaxy, the Milky Way, is contributed by such “dark matter.” Although we cannot “see” dark matter, the gravitational field generated by its mass exerts forces that influence the motions of stars. As a result, by studying stellar motions (the field known as Galactic dynamics), we can reverse-engineer and map out the gravitational potential, and hence the mass distribution, of the Milky Way. Mapping the dark matter distribution in the Milky Way is one of the cornerstones of modern astronomy: the subatomic properties of dark matter should manifest themselves in how dark matter is distributed throughout the Galaxy. Consequently, mapping the dark matter distribution will help tell us what dark matter is made of, a question that has eluded physicists for decades.
Despite its importance, most efforts in Galactic dynamics thus far have focused on fitting analytic parametric models. Having to assume parametric models based on human heuristics comes at a severe cost: these heuristics (or “Ansätze”) are often the dominant systematic uncertainty of a study. For example, estimates of the Milky Way's mass can vary substantially with the choice of parametric model. This status quo has changed drastically in recent years with the advent of physics-inspired machine learning. This project will build on recent work from our group (Green & Ting 2020), which “engraves” the universal physical laws of Galactic dynamics into neural networks, allowing us to solve Galactic dynamics without Ansätze. We will apply our model to data from the Gaia satellite, which provides positions and velocities for more than a billion stars, and map out the mass distribution of the Milky Way.
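To illustrate the core idea of a non-parametric, physics-constrained model, here is a minimal sketch (in PyTorch, one of the toolkits named below). It assumes a hypothetical setup, not the actual Green & Ting implementation: a small neural network represents the gravitational potential Φ(x), and accelerations follow from a = −∇Φ via automatic differentiation, so Newtonian dynamics is built into the model rather than imposed through a parametric Ansatz for Φ.

```python
import torch

class PotentialNet(torch.nn.Module):
    """Hypothetical free-form model of the gravitational potential Phi(x).

    No functional form (e.g. NFW, Miyamoto-Nagai) is assumed; the network
    itself is the non-parametric Ansatz-free representation.
    """

    def __init__(self, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x):
        # x: (N, 3) positions -> (N,) potential values
        return self.net(x).squeeze(-1)


def acceleration(phi, x):
    """Compute a = -grad Phi by automatic differentiation.

    create_graph=True keeps the graph so that a physics loss built from
    the accelerations (e.g. a stationarity condition on the stellar
    distribution function) remains differentiable during training.
    """
    x = x.requires_grad_(True)
    (grad_phi,) = torch.autograd.grad(phi(x).sum(), x, create_graph=True)
    return -grad_phi


phi = PotentialNet()
x = torch.randn(5, 3)  # toy stellar positions (arbitrary units)
a = acceleration(phi, x)  # (5, 3) accelerations, one vector per star
```

In the actual approach, a network like this would be trained jointly with a flow-based model of the observed stellar phase-space density, with the physics entering through the loss rather than through any assumed functional form.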
Map the gravitational potential (and hence the dark matter distribution) of the Milky Way via non-parametric machine learning approaches, going beyond classical parametric models.
Python programming (PyTorch, TensorFlow) and experience with flow-based models are essential. A solid background in physics (classical mechanics) and familiarity with physics-inspired neural networks (e.g., Hamiltonian neural networks, symbolic regression) are desirable but not critical.