I’m posting this as it might be helpful for X-ray astronomers. In 2013, I ran Monte Carlo simulations of an absorbed power law. The goal was to find out how many source counts are necessary to constrain an absorbed power law to better than 10%. I focused on the absorption column N_H and the photon index Γ; the normalization was, of course, free to vary.
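To make the 10% criterion concrete, here is a minimal sketch (my own illustration, not part of the original analysis; the function and parameter names are hypothetical) of how one would check whether a fitted parameter is constrained to better than 10%:

```python
def fractional_uncertainty(best_fit, conf_lo, conf_hi):
    """Half-width of the confidence interval, relative to the best-fit value.
    Hypothetical helper for illustration only."""
    return 0.5 * (conf_hi - conf_lo) / best_fit

# e.g., a photon index Gamma = 1.7 with a 1.55-1.85 confidence interval
frac = fractional_uncertainty(1.7, 1.55, 1.85)
print(f"{100 * frac:.1f}%")  # 8.8%, i.e. constrained to better than 10%
```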
I chose an input model (e.g., N_H = 0.12×10²² cm⁻², Γ = 1.7) and simulated it 20 times for each of a range of total source counts between 100 and 50,000. I averaged the parameter uncertainties over the 20 spectra; the results are shown below:
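The simulate-and-average procedure can be sketched as follows. This is a toy model, not the actual ISIS/S-Lang code: the real analysis fakes a spectrum and fits it, whereas here `toy_fit_uncertainty` simply assumes the error on Γ scales as 1/√counts, with the `scale` constant an assumption tuned so that ~1,000 counts give roughly a 10% constraint, as in the plot:

```python
import math
import random
import statistics

def toy_fit_uncertainty(counts, scale=5.0, rng=random):
    """Toy stand-in for 'simulate one spectrum and fit it': returns a
    1-sigma error on the photon index scaling as 1/sqrt(counts).
    `scale` is a made-up constant, not a fitted instrument property."""
    sigma = scale / math.sqrt(counts)
    # jitter mimics the realization-to-realization scatter of the error estimate
    return sigma * rng.uniform(0.9, 1.1)

def averaged_uncertainty(counts, n_sims=20, seed=42):
    """Average the fitted uncertainty over n_sims simulated spectra."""
    rng = random.Random(seed)
    return statistics.mean(
        toy_fit_uncertainty(counts, rng=rng) for _ in range(n_sims)
    )

gamma_true = 1.7
for counts in (100, 1000, 10000):
    err = averaged_uncertainty(counts)
    print(f"{counts:>6} counts: ~{100 * err / gamma_true:.1f}% on Gamma")
```

In the real simulations the inner step is a full fake-spectrum fit per realization; the averaging over 20 realizations is the same idea.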
A vector-graphics version of the plot is available for download here.
This suggests that ~1,000 counts are necessary to constrain the photon index to ~10%. Constraining the absorption is much harder and depends on the absorbing column along the line of sight. At N_H = 0.12×10²² cm⁻², ~30,000 counts are required for a good constraint, while at lower values such as N_H = 0.01×10²² cm⁻² it is nearly impossible to constrain. For high absorbing columns of, e.g., N_H = 5×10²² cm⁻², ~2,000 counts are already sufficient to constrain the absorption. For most (nearly unabsorbed) extragalactic sources, such as blazars, it is therefore usually necessary to fix the absorption to the Galactic value. However, in my SED-fitting paper I consistently find higher values than the LAB survey gives. This suggests that we do see some reddening and photoelectric absorption either in the host galaxy of the AGN or from the AGN nucleus itself.
Disclaimer: these simulations were run using Swift/XRT RMFs and ARFs. Source counts are independent of the instrument, so this estimate should be valid for other instruments as well. I used vern cross-sections and wilm abundances with the tbnew/tbabs model available here. The simulations and the processing were done in ISIS/S-Lang.