COALA user programs

Users of accepted proposals

  1. Tiago Pereira
  2. Kimberly Cullen (supervised by Geoff and Alex)
  3. Mohammad Nawaz (supervised by Geoff and Alex)
  4. Devika Kamath (supervised by Amanda)
  5. Wolfgang Kerzendorf
  6. Stuart Sim
  7. Ross Parkin
  8. Natsuko Izutani (supervised by Chiaki)
  9. Luke Shingles (supervised by Amanda)
  10. Cherie Fishlock (supervised by Amanda)
  11. Sharon Rapoport (supervised by Stuart)
  12. Sanjoy Nandi (supervised by Geoff)
 

Tiago Pereira

  • Position:

    Post-doc

  • A brief description of your research aims:

My current research is focused on studying 3D effects on line formation in stellar atmospheres, in particular with regard to abundance determinations from 3D models. I am performing two kinds of calculations: 3D non-LTE radiative transfer for a few lines, and high-resolution calculations of the full spectrum from 3D models (still under LTE).

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

The computations I will be running will be mostly batch jobs, in parallel using 8-32 cores. I hope to use COALA for convergence testing and for probing the parameter space, so I plan to do short runs (typically 1-4 h) that will be very helpful in setting up runs on a larger supercomputer. The 3D NLTE code (Multi3D) uses MPI domain decomposition, and at present resolutions the optimal number of cores is 32. The full-spectrum code (ASSET) scales linearly to thousands of cores, although I don't expect to run it with more than 32. In terms of disk space, each run takes about 25 GB. It would be most helpful if there could be some common storage on COALA with low-latency access, so that I don't have to transfer all these files across the network (which is often very time-consuming). I would need perhaps 200 GB for testing a few snapshots.

 

Wolfgang Kerzendorf

  • Position:

    Student, supervised by Brian Schmidt

  • The title of your research proposal:

    Exploring the physics of Type Ia supernovae through spectral analysis

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

Type Ia supernovae are among the most luminous events in the universe. They are important in astronomy as cosmological distance indicators. Type Ia supernovae are the physical endpoint of some white dwarfs in binary systems. The nucleosynthesis in the explosion has an important impact on the abundance pattern of the interstellar medium. Embarrassingly, we know little about them: their progenitor scenario as well as the details of the explosion mechanism remain a mystery.

In recent years there has been a dramatic increase in surveys targeting these events. There are now many well-sampled light curves, which are mainly used for cosmological distance measurements. The number of spectroscopically well-sampled Type Ia explosions is also rapidly increasing. I will construct an automatic analysis tool for Type Ia supernova spectra. This tool will produce synthetic spectra matching existing observed ones. For the radiative transfer I use a code developed by Mazzali et al. (2008). It is well tested and reproduces supernova spectra well. It produces a supernova spectrum for a given luminosity, photospheric velocity, time since explosion, and several elemental abundances. Tweaking these parameters to produce a spectrum that fits an observed spectrum is a complex and, as yet, manual task.

Automation will also provide us with error analysis, as well as assuring reproducibility. To summarize the problem: given an observed spectrum, I will produce a fitting synthetic spectrum by tweaking 10 parameters with different numerical ranges. On a state-of-the-art laptop, the calculation takes roughly 1 minute per parameter set. The search space is large and complex.

Sampling the whole parameter space is not feasible due to CPU time and storage space requirements. As an individual parameter calculation takes a minute, the algorithm should search the parameter space in parallel. Additionally, it needs to be very robust against local extrema. Genetic algorithms fulfil all of these requirements. Currently we are working with 150 individual parameter sets for each generation. A scheduler distributes each parameter set to a cloud-like computing environment (COALA contributes 32 processors). This decreases the average time per spectrum from 1 minute to about 1 second. The project is still in the testing phase, but has already delivered promising results.
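    The search strategy described above can be sketched generically as follows. This is a minimal illustration of a generation-based genetic algorithm, not the actual fitting pipeline: the fitness function here is a hypothetical stand-in for the chi-square of a synthetic spectrum against the observed one (the real evaluation would call the radiative transfer code and be farmed out one parameter set per core by the scheduler), and the population size, parameter count, and mutation scale are illustrative.

    ```python
    import random

    N_PARAMS = 10           # number of tunable model parameters (as in the text)
    POP_SIZE = 150          # parameter sets per generation (as in the text)
    TARGET = [0.5] * N_PARAMS  # hypothetical "true" parameters for this toy fitness

    def fitness(params):
        # Stand-in for the spectrum fit quality; lower is better.
        # In production this would run the ~1-minute radiative transfer model.
        return sum((p - t) ** 2 for p, t in zip(params, TARGET))

    def evolve(generations=20, seed=42):
        rng = random.Random(seed)
        pop = [[rng.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
        for _ in range(generations):
            # In production, the POP_SIZE fitness evaluations of each
            # generation run in parallel across the available cores.
            scored = sorted(pop, key=fitness)
            parents = scored[: POP_SIZE // 2]       # selection: keep the best half
            children = []
            while len(children) < POP_SIZE - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, N_PARAMS)    # single-point crossover
                child = a[:cut] + b[cut:]
                i = rng.randrange(N_PARAMS)         # point mutation on one parameter
                child[i] += rng.gauss(0, 0.1)
                children.append(child)
            pop = parents + children                # elitism: parents survive
        return min(pop, key=fitness)

    best = evolve()
    ```

    Keeping the parents in the next generation (elitism) guarantees that the best fit found so far is never lost, which matters when each evaluation is expensive.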

 

Stuart Sim

  • Position:

    SkyMapper Fellow

  • The title of your research proposal:

    Modelling the spectra and light curves of supernovae

  • A brief description of your research aims:

I use a Monte Carlo radiative transfer code (written by myself and Markus Kromer, MPA) to compute synthetic spectra and light curves for models of supernova explosions. By comparison with data, these synthetic spectra allow us to constrain both the explosion physics and the nature of the supernova progenitor systems. To date the code has mostly been applied to multi-dimensional models of Type Ia supernovae - at Stromlo I will keep working on the modelling of Type Ias but also plan to extend the work to include Type Ib/c supernovae, including GRB supernovae (the physics of Ib/c supernovae is very similar to that of Ias). This work will interface with those working at Stromlo on observations of supernovae, particularly those involved with the transient search in the SkyMapper project.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

The computations (Monte Carlo radiative transfer code) would all be performed in batch mode on parallel processors. The code has already been extensively tested and used on the machines at MPA and scales very well up to several thousand processors, so it will perform very well on a machine with a moderate number of processors like Coala. The code has a hybrid MPI-OpenMP parallelisation so that it can run on systems with either (or both) in operation. Coala does not have enough processors for me to undertake full multi-dimensional simulations - for these I will apply for time via the ANU partner share on the big machine. However, Coala will be ideal for code development tests (I plan to add several new capabilities to the code in the near future - polarization and late-phase nebular spectra) and should also be adequate for running 1D models (which will form the initial phase of work on the modelling of SN Ib/c). I would expect that a serious test calculation might use around 32 or 64 processors for anywhere from a few hours up to one day. A full production-level 1D calculation (with all physics switched on) would use up to 64 processors for a few days. [The code can be restarted from any state so, although production-level simulations will require fairly long integrated run times, they do not require that resources be allocated continuously for a very long period.]
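    The restart capability mentioned above - persisting the complete state so a long run can be split across several short allocations - can be sketched generically. This is an illustrative checkpoint/restart pattern, not the actual code: the file name, the contents of the state tuple, and the toy Monte Carlo loop are all placeholders.

    ```python
    import os
    import pickle
    import random

    CHECKPOINT = "mc_state.pkl"  # hypothetical checkpoint file name

    def run(total_steps=1000, checkpoint_every=100):
        # Resume from a previous checkpoint if one exists, else start fresh.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT, "rb") as f:
                step, tally, rng_state = pickle.load(f)
            rng = random.Random()
            rng.setstate(rng_state)   # restore the RNG for bitwise-identical results
        else:
            step, tally = 0, 0.0
            rng = random.Random(1)

        while step < total_steps:
            tally += rng.random()     # stand-in for propagating one Monte Carlo packet
            step += 1
            if step % checkpoint_every == 0:
                # Persist the complete state: the job can be killed here and
                # restarted later without losing any completed work.
                with open(CHECKPOINT, "wb") as f:
                    pickle.dump((step, tally, rng.getstate()), f)
        return tally
    ```

    Because the random-number generator state is saved along with the accumulated results, a run interrupted at any checkpoint and resumed later produces exactly the same answer as an uninterrupted run.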

  • Any additional information:

I also have a very similar Monte Carlo radiative transfer code with which I work on the modelling of X-ray spectra (this time for the study of AGN). Although I will initially be mainly focused on getting my supernova code operational here, I will later want to continue to work with the X-ray code. Those simulations are less demanding than the supernova calculations but scale equally well - that code is MPI-parallel only. So, in due course, I would also like to be allowed to use Coala for that project.

 

Ross Parkin

  • Position:

    Post-doc

  • The title of your research proposal:

    Accretion disks, jets, and winds.

  • A brief description of your research aims:

    For accretion to occur through a disk, the orbiting material must lose angular momentum. In recent years it has become apparent that magnetohydrodynamic (MHD) processes are important for transporting angular momentum outwards, and allowing material to flow radially inwards. I will explore this interesting phenomenon via simulations of magnetised Keplerian accretion disks. For this purpose I will use the MHD code PLUTO, which will be extended to include radiation transfer, as well as sub-grid viscosity and resistivity. Coala provides an ideal platform for code development and testing. Initial simulations will be performed in 2D, with perhaps a few low resolution 3D runs. Ultimately, large-scale 3D radiation-MHD simulations will require the NCI National Facility.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

I anticipate that the 2D simulations will require roughly 8-32 cores (1-4 nodes), and the 3D simulations may require of the order of 32 cores (4 nodes). All simulations will run under MPI. Memory requirements should not be excessive. Coala may be very handy for visualisation of simulation data, particularly 3D data cubes.

  • Any additional information:

    In future, and if time permits, I may also use Coala to perform simulations of binary star systems - namely, the turbulent wind-wind collision region. These simulations would likely use the PLUTO code, making use of its adaptive-mesh refinement capabilities.

 

Natsuko Izutani

  • Position:

    Visiting Student

  • If a student, name your supervisor:

    Chiaki Kobayashi

  • The title of your research proposal:

Supernova nucleosynthesis with neutrino processes

  • A brief description of your research aims:

    I will calculate supernova yields including neutrino processes for galactic chemical evolution.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

Single-processor jobs, 4-8 cores.

  • Any additional information:

    none

 

Luke Shingles

  • Position:

    Honours Student

  • If a student, name your supervisor:

    Amanda Karakas

  • The title of your research proposal:

    The sulfur anomaly in planetary nebulae and post-AGB stars

  • A brief description of your research aims:

    I will be testing nucleosynthesis models of AGB stars to see if they can explain observations of sulfur depletion in their end products (planetary nebulae).

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

    As the computation will be performed at NCI, only storage for the results is requested. About 50-100 GB of storage space should be sufficient.

  • Any additional information:

    None

 

Cherie Fishlock

  • Position:

    PhD Student

  • If a student, name your supervisor:

    Amanda Karakas & Dave Yong

  • A brief description of your research aims:

To construct new AGB evolution models using updated opacity tables and reaction rates, and to compare them with observations of the halo, where two distinct populations separated by their alpha-element abundances have been observed. By modelling these observations I hope to explain this dichotomy.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

All our calculations are single-processor jobs and require a Fortran compiler.

  • Any additional information:

     

  • The title of your research proposal:

    N/A

 

Sharon Rapoport

  • Position:

    Student

  • If a student, name your supervisor:

    Brian Schmidt

  • A brief description of your research aims:

While GRBs provide a powerful tool to study galaxy formation at high redshifts, they also represent a unique setting in which to observe extreme physical processes. Extensive research on long GRBs has found them to typically be associated with the collapse of a massive star. However, the exact features of this process are not yet understood, and are especially poorly constrained for the very bright GRBs used to study the high-redshift universe. This proposal seeks to investigate the properties of the GRB-SNe and therefore enhance our understanding of the GRB central engine, GRB progenitors, and the associated physical properties.

Long-duration GRBs are associated with broad-line Ibc supernovae. Studies reveal a possible link between the metallicity of the supernova environment and whether it will produce a GRB (Sollerman et al. 2005; Modjaz et al. 2006, 2008). However, conclusive results are still limited by the small sample of 4 certain GRB-SN connections, only one of which has been properly modelled (2003dh). Sollerman et al. observed 3 galaxies identified as hosting a SN with an associated GRB. Their results favour low-metallicity, sub-luminous galaxies with L < L* and Z < Z* in their star-forming phase, which agrees with other studies of higher-redshift GRBs (Le Floc'h et al. 2003). A study of SN 1998bw confirms the high-mass collapsar model (Woosley, Eastman & Schmidt 1998; MacFadyen & Woosley 1999) with an estimated progenitor mass of ~30 M_solar, but this object's gamma-ray burst properties were orders of magnitude less energetic than the typical GRBs we study at large redshifts. In 2003, SN 2003dh/GRB 030329 became the first bright GRB to have its SN studied in detail; it remains unique in this respect to date. Since GRB 030329, a series of GRB-SNe have been studied (GRB 031203, GRB 060218, GRB 080110, and GRB 100316 - see table), but all of these are intrinsically dim GRBs, unlike SN 2003dh.

A significant part of Rapoport's PhD thesis will be dedicated to better understanding the supernovae that become GRBs, and to investigating the connections between GRB central engines and the progenitor stars that explode. Therefore, we propose to model the exploding star's mass, total explosion energy, 56Ni production (which probes the central engine of the GRB/SN), and potentially asphericity, using Sim's Monte Carlo radiative transfer code. These measurements should help better define the progenitor systems that lead to bright GRBs, illuminate the range of physics that leads to bright GRBs, and allow us to better understand and predict the biases associated with using GRBs as tracers of star formation.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

The computations (Monte Carlo radiative transfer code) would all be performed in batch mode on parallel processors. The code has already been extensively tested and used on the machines at MPA and scales very well up to several thousand processors, so it will perform very well on a machine with a moderate number of processors like Coala. The code has a hybrid MPI-OpenMP parallelisation so that it can run on systems with either (or both) in operation. Coala does not have enough processors for me to undertake full multi-dimensional simulations - for these I will apply for time via the ANU partner share on the big machine. However, Coala will be ideal for code development tests (I plan to add several new capabilities to the code in the near future - polarization and late-phase nebular spectra) and should also be adequate for running 1D models (which will form the initial phase of work on the modelling of SN Ib/c). I would expect that a serious test calculation might use around 32 or 64 processors for anywhere from a few hours up to one day. A full production-level 1D calculation (with all physics switched on) would use up to 64 processors for a few days. [The code can be restarted from any state so, although production-level simulations will require fairly long integrated run times, they do not require that resources be allocated continuously for a very long period.]

  • Any additional information:

    None

 

Sanjoy Kumar Nandi

  • Position:

    PhD Student

  • If a student, name your supervisor:

    Geoffrey Bicknell

  • The title of your research proposal:

    Black hole-jet connection and high energy emission

  • A brief description of your research aims:

In this project, I will use the relativistic magnetohydrodynamics code PLUTO to simulate the vicinity of a black hole with relevant parameters. The effect of magnetic dissipation on the propagation of the jet will be studied.

  • A brief description of the nature of the computations to be performed (resources required, interactive/batch job, i/o requirements, single processor/parallel jobs, scalability):

     

  • Any additional information:

    None

Updated:  21 October 2017/Responsible Officer:  RSAA Director/Page Contact:  Webmaster