I am interested in finding the separation of black holes when they are merged within a smoothing length. How can I find the smoothing length of the black hole particles at the time of merger? Is this equivalent to their approximate separation when merged?
As you say, the BH merger criterion is a pair within BH_Hsml. Technically, I would say that the BH_Hsml values give an upper bound on the pre-merger separation. This value is stored in the snapshots for black holes, but only at snapshot output times. However, the small-scale dynamics of black holes in Illustris shouldn't be given too much weight, as a result of our repositioning. That is, I wouldn't recommend any "raw" analysis of BH separations at ~kpc scales or below; in these regimes a post-processing reconstruction of BH orbits would be strongly motivated.
I've asked Luke Kelley if he has any extra thoughts to add here, or let us know if you have any other questions.
I think Dylan is right that the best interpretation is that "BH_Hsml is an upper limit to the separation"; I think this should be accurate to about a factor of two. One additional issue is "the BH_Hsml of which BH?" If I recall correctly, there are hypothetical situations in which it could be either BH, but in general it should be the BH with the larger smoothing length, and the smoothing lengths are correlated with the BH mass. My procedure (described in this paper) has been to use the BH_Hsml from the larger BH, and the snapshot preceding the merger event. Hypothetically you could try to infer a smoothing length closer to the time of merger based on the BH mass at the time of merger, but I don't think this would be much more accurate. Fig. 2 of the linked paper (bottom-left panel) shows the distribution of BH_Hsml (interpreted as binary separations) at the time of merger, cut off at 10 kpc. Happy to help with any other questions as well!
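A minimal sketch of that procedure, for anyone following along. The array and function names here are mine, not from the Illustris codebase; in practice `snap_ids` and `snap_hsml` would be the PartType5/ParticleIDs and PartType5/BH_Hsml arrays loaded from the snapshot preceding the merger:

```python
import numpy as np

def hsml_at_merger(merger_ids, merger_masses, snap_ids, snap_hsml):
    """Return the BH_Hsml of the more-massive merging BH, looked up
    in the snapshot preceding the merger event.

    merger_ids    : (2,) particle IDs of the two merging BHs
    merger_masses : (2,) their masses at the merger event
    snap_ids      : (N,) PartType5/ParticleIDs from the preceding snapshot
    snap_hsml     : (N,) PartType5/BH_Hsml from the same snapshot
    """
    heavier = merger_ids[np.argmax(merger_masses)]
    idx = np.flatnonzero(snap_ids == heavier)
    if idx.size == 0:
        raise KeyError("BH %d not found in preceding snapshot" % heavier)
    return snap_hsml[idx[0]]

# toy demonstration with made-up numbers
snap_ids = np.array([101, 202, 303])
snap_hsml = np.array([0.8, 2.5, 1.1])  # e.g. in ckpc/h
sep = hsml_at_merger(np.array([101, 202]), np.array([1e6, 5e7]),
                     snap_ids, snap_hsml)
print(sep)  # 2.5 -- the Hsml of the heavier BH (ID 202)
```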
This helps immensely. Thank you. I had read the paper, and was trying to figure out which data field corresponded to those separations. I have one more question -- more of a clarification. While speaking to other members at my university who work on larger-scale simulations, I inquired generally about smoothing lengths within the simulations, and, specifically, the black hole smoothing lengths. They recommended locating the gas cell nearest to the black hole and using its smoothing length as the BH smoothing length. I was wondering if you could comment on this: is it similar to, better than, or worse than BH_Hsml for determining the separation? Based on your previous answer, I assume 'worse than,' so I am just looking to further understand this.
No problem! Two things: 1) BH_Hsml is calculated by finding the radius within which the nearest 64 gas cells are enclosed, whereas the gas smoothing lengths are based on the size of the Delaunay triangles -- which will necessarily be smaller (that's my understanding at least); 2) BH_Hsml is the parameter used to determine when the BHs "merge" -- so that's the quantity you want to use. The code checks whether the BH separation d_bh < BH_Hsml, and if so, the BHs are merged.
The reason we say "the BH_Hsml of the more-massive BH is an 'upper limit'" is twofold: i) because of the finite time-step size and the black hole repositioning algorithm*, the two BHs might actually be closer together than BH_Hsml when they are merged; ii) BH positions are only reliable to an accuracy of about BH_Hsml, and because of the BH repositioning the positions are even less reliable. So if the separation happens to be less than BH_Hsml, it wouldn't really be physically meaningful.
* The BH particles are manually moved to the center of their host subhalos (galaxies), roughly each time-step.
Got it. Thank you again for the help!
Hello! The research I'm doing requires the separation information of two BHs, and this thread has helped a lot. I'm trying to reproduce Figure 2 of the paper linked above, but when I download the snapshots and extract the BH IDs and BH_Hsml, I find that a BH can have multiple smoothing lengths within a single snapshot (sometimes even within a single snapshot file chunk?). I'm wondering how the snapshots and their file chunks are organized. Thank you for your time.
Every particle, including SMBHs, has 1 entry per dataset (e.g. Coordinates, Masses, and BH_Hsml). So each SMBH has exactly one BH_Hsml.
If you take one snapshot and load all PartType5/ParticleIDs and PartType5/BH_Hsml, these two arrays will have the same size: one entry each, per SMBH.
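As a concrete check, once those two arrays are loaded (e.g. with h5py, concatenating over all file chunks of the snapshot), they line up one-to-one, so you can build a unique ID-to-Hsml mapping. Here with small stand-in arrays in place of the real snapshot data:

```python
import numpy as np

# stand-ins for PartType5/ParticleIDs and PartType5/BH_Hsml from one snapshot
particle_ids = np.array([12345, 67890, 24680])
bh_hsml = np.array([1.2, 0.7, 3.4])

assert particle_ids.shape == bh_hsml.shape            # one entry each, per SMBH
assert len(np.unique(particle_ids)) == len(particle_ids)  # IDs are unique

# a duplicate ID here would indicate an extraction bug, not the data layout
hsml_of = dict(zip(particle_ids.tolist(), bh_hsml.tolist()))
print(hsml_of[67890])  # 0.7
```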
Thanks! It turns out there was an error in the way I was extracting the Hsml. I'm now getting what you said.
Hello, I had a follow-up question on this issue. I've extracted the smoothing lengths according to the method Dr. Kelley explained in the discussion above, and I've also implemented the mass cut of 10^6 M_sun. I later realized that I was doing this for Illustris-3, whereas Figure 2 in the paper is plotted for Illustris-1, which I don't have the resources to download. Reproducing Fig. 2 for Illustris-3 yields almost identical distributions for Mass Ratio, Total Mass, and Formation Redshift, but the Binary Separation distribution differs slightly. The Binary Separation in Fig. 2 of the paper peaks at ~10^3.4, whereas the Illustris-3 distribution peaks at ~10^3.1 and a significant fraction of the mergers fall in 10^2.5 to 10^3.0. This is a long shot, but I was wondering if there's a way to find this distribution for Illustris-3, or a way to extract the smoothing lengths without having to download the full Illustris-1 simulation. My main goal is to check the validity of my code by reproducing published results.
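For anyone comparing runs this way, the distribution itself can be built identically for either simulation once the merger-time BH_Hsml values are in hand. A sketch with synthetic numbers (the mass-cut threshold and units here follow the description above; the function name and toy values are mine):

```python
import numpy as np

MASS_CUT = 1.0e6  # minimum BH mass in M_sun, as in the cut described above

def separation_histogram(bh_masses, bh_hsml, bins):
    """Histogram of log10(BH_Hsml), interpreting BH_Hsml as the binary
    separation at merger, after dropping BHs below MASS_CUT."""
    keep = np.asarray(bh_masses) >= MASS_CUT
    log_sep = np.log10(np.asarray(bh_hsml)[keep])
    counts, edges = np.histogram(log_sep, bins=bins)
    return counts, edges

# toy data: three mergers pass the mass cut, one fails
masses = np.array([5e6, 2e7, 1e5, 8e6])
hsml = np.array([1.0e3, 2.0e3, 5.0e2, 1.5e3])  # separations, e.g. in pc
counts, edges = separation_histogram(masses, hsml, bins=np.array([2.5, 3.0, 3.5]))
print(counts)  # [0 3]
```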
I'd suggest using the Lab service -- you can probably get your code working there without any trouble, and then run directly against Illustris-1.