Plotting over various subhalos

Hyunwoo KIM
  • 19 Feb '20

Hello IllustrisTNG team,

I'm trying to plot the dark matter, gas density, gas temperature, and stellar light of the most massive subhaloes in the TNG-100 simulation and make a time-lapse evolution video. To do this, my algorithm does the following (sketched in code after the list):

  • obtain the simple SubLink tree and sort it so the snapshot numbers run from least to greatest
  • request a cutout with only the needed parameters (e.g., for gas: Coordinates, Masses, etc.)
  • use the get() function from the API tutorial to download the cutout as an .hdf5 file
  • use pyplot to plot the quantity of interest (DM, gas temperature, gas density, stellar light, etc.)
  • repeat by traversing the simple SubLink tree.
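Roughly, the loop looks like this (a minimal sketch: the get() helper is the one from the API tutorial, and the starting subhalo ID and gas fields are only illustrative):

```python
import requests
import h5py
import matplotlib.pyplot as plt

baseUrl = 'http://www.tng-project.org/api/'
headers = {'api-key': 'YOUR_API_KEY'}  # placeholder for a personal API key

def get(path, params=None):
    # helper from the API tutorial: return parsed JSON, or save a binary download
    r = requests.get(path, params=params, headers=headers)
    r.raise_for_status()
    if r.headers['content-type'] == 'application/json':
        return r.json()
    if 'content-disposition' in r.headers:
        filename = r.headers['content-disposition'].split('filename=')[1]
        with open(filename, 'wb') as f:
            f.write(r.content)
        return filename
    return r

# step 1: start from a massive subhalo at z=0 and load its SubLink main progenitor branch
sub = get(baseUrl + 'TNG100-1/snapshots/99/subhalos/0/')
with h5py.File(get(sub['trees']['sublink_mpb']), 'r') as f:
    snaps = f['SnapNum'][:]
    subids = f['SubfindID'][:]
order = snaps.argsort()  # arrange snapshots from earliest to latest

# steps 2-5: request a field-restricted cutout at each snapshot and plot it
for snap, subid in zip(snaps[order], subids[order]):
    subhalo = get(baseUrl + 'TNG100-1/snapshots/%d/subhalos/%d/' % (snap, subid))
    cutout = get(subhalo['cutouts']['subhalo'], {'gas': 'Coordinates,Masses'})
    with h5py.File(cutout, 'r') as f:
        pos = f['PartType0']['Coordinates'][:]
        mass = f['PartType0']['Masses'][:]
    plt.hist2d(pos[:, 0], pos[:, 1], weights=mass, bins=400)
    plt.savefig('gas_dens_snap%03d.png' % snap)
    plt.close()
```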

The problem is that when I call the get() function to download a cutout, the download takes too long and the files are too big.

For example, to create one plot, whether of DM, stellar light, gas temperature, or gas density, I need to download a cutout file for that specific subhalo at a specific redshift. Each file is about one gigabyte, so to traverse the SubLink tree and create the time-evolution video, I need to download all 99 cutout files and then plot the given parameters. That's about 100 GB, assuming each cutout file is ~1 GB.

Not only does this take a lot of disk space, it will also slow down the progress of my project.

Is there another way of doing this without downloading the cutout files?

Thank you in advance.

Dylan Nelson
  • 19 Feb '20

Hi Hyunwoo,

If you are already using the {cutout_query} option to request cutouts that contain only the exact fields you need for visualization, then, as you say, this is simply a lot of data if you want to look at the most massive halos (FoF 0 in TNG100-1 has ~45M gas and ~67M DM particles at z=0).
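For reference, a field-restricted cutout request looks roughly like the following (the cutout.hdf5 URL form and the chosen field names are indicative; see the API documentation for the exact options):

```python
import requests

headers = {'api-key': 'YOUR_API_KEY'}  # placeholder

# request only the fields needed for the image, rather than the full cutout
url = 'http://www.tng-project.org/api/TNG100-1/snapshots/99/subhalos/0/cutout.hdf5'
params = {'gas': 'Coordinates,Masses,InternalEnergy,ElectronAbundance'}

r = requests.get(url, params=params, headers=headers)
r.raise_for_status()
with open('cutout_99_0.hdf5', 'wb') as f:
    f.write(r.content)
```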

There are two options I see:

(1) Use the new visualization functionality of the API, so that images are rendered on the server and you only need to download the image. As long as you don't need full custom control over exactly how the images are made and how they look, I would suggest this.
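A minimal sketch of (1), assuming a vis.png rendering endpoint with partType/partField query parameters; the exact endpoint and parameter names should be verified against the API documentation:

```python
import requests

headers = {'api-key': 'YOUR_API_KEY'}  # placeholder

# ask the server to render the image; only the resulting PNG is downloaded
url = 'http://www.tng-project.org/api/TNG100-1/snapshots/99/subhalos/0/vis.png'
params = {'partType': 'gas', 'partField': 'temp'}  # parameter names assumed, check the docs

r = requests.get(url, params=params, headers=headers)
r.raise_for_status()
with open('gas_temp_snap099_sub0.png', 'wb') as f:
    f.write(r.content)
```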

(2) Register for, and use, the Lab service. If you can get your visualization script working there, then you also don't need to download any data, since you can read the simulations locally and just download the images. But you will need to adapt your pipeline (steps 1-3) to use the "example scripts" approach rather than the "web-based API" approach.
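A minimal sketch of (2), using the illustris_python example scripts on the Lab; the basePath and the starting subhalo ID below are illustrative:

```python
import illustris_python as il
import matplotlib.pyplot as plt

# on the Lab the simulation outputs are readable directly from disk
# (basePath is illustrative; check the Lab documentation for the exact location)
basePath = 'sims.TNG/TNG100-1/output'

# main progenitor branch of subhalo 0 at snapshot 99
tree = il.sublink.loadTree(basePath, 99, 0,
                           fields=['SnapNum', 'SubfindID'], onlyMPB=True)

for snap, subid in zip(tree['SnapNum'], tree['SubfindID']):
    gas = il.snapshot.loadSubhalo(basePath, snap, subid, 'gas',
                                  fields=['Coordinates', 'Masses'])
    plt.hist2d(gas['Coordinates'][:, 0], gas['Coordinates'][:, 1],
               weights=gas['Masses'], bins=400)
    plt.savefig('gas_dens_snap%03d.png' % snap)
    plt.close()
```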
