The attenuation of starlight by dust in galactic environments is investigated through models of radiative transfer in a spherical, clumpy interstellar medium (ISM). We show that the attenuation curves are primarily determined by the wavelength dependence of absorption rather than by the underlying extinction (absorption+scattering) curve; the observationally derived attenuation curves cannot constrain a unique extinction curve unless the absorption or scattering efficiency is specified. Attenuation curves consistent with the “Calzetti curve” are found by assuming the silicate-carbonaceous dust model for the Milky Way (MW), but with the 2175 Å bump suppressed or absent. The discrepancy between our results and previous work that claimed the Small Magellanic Cloud dust to be the origin of the Calzetti curve is ascribed to the difference in adopted albedos: we use theoretically calculated albedos, whereas the previous works adopted albedos derived empirically from observations of reflection nebulae. We find that the attenuation curves calculated with the MW dust model are well represented by a modified Calzetti curve with a varying slope and UV bump strength. The strong correlation between the slope and UV bump strength, as found in star-forming galaxies at 0.5 < z < 2.0, is well reproduced when the abundance of the UV bump carriers is assumed to be 30%–40% of that of the MW dust; radiative transfer effects lead to shallower attenuation curves with weaker UV bumps as the ISM becomes clumpier and dustier. We also argue that some local starburst galaxies have a UV bump in their attenuation curves, albeit a very weak one.
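The "modified Calzetti curve with a varying slope and UV bump strength" can be sketched numerically. The sketch below assumes the standard convention from the literature (a power-law tilt with exponent delta applied to the Calzetti 2000 curve, plus a Drude profile at 2175 Å whose amplitude E_b sets the bump strength); the function names and parameter defaults are illustrative assumptions, not taken from this paper.

```python
import numpy as np

def k_calzetti(lam_um):
    """Calzetti (2000) starburst curve k(lambda) with R_V = 4.05.

    lam_um: wavelength in microns (fit valid over ~0.12-2.2 um)."""
    lam = np.asarray(lam_um, dtype=float)
    Rv = 4.05
    short = 2.659 * (-2.156 + 1.509/lam - 0.198/lam**2 + 0.011/lam**3) + Rv
    long_ = 2.659 * (-1.857 + 1.040/lam) + Rv
    return np.where(lam < 0.63, short, long_)

def drude_bump(lam_um, Eb, lam0=0.2175, dlam=0.035):
    """Drude profile for the 2175 A feature; Eb is the bump amplitude.

    lam0 (center) and dlam (width) in microns are typical assumed values."""
    lam = np.asarray(lam_um, dtype=float)
    return Eb * (lam * dlam)**2 / ((lam**2 - lam0**2)**2 + (lam * dlam)**2)

def attenuation(lam_um, Av=1.0, delta=0.0, Eb=0.0, Rv=4.05, lam_v=0.55):
    """Modified Calzetti attenuation in magnitudes:

    A(lam) = (Av / Rv) * [k(lam) + D(lam)] * (lam / lam_v)**delta

    delta = 0, Eb = 0 recovers the original Calzetti curve; delta > 0
    flattens (greys) the curve, Eb > 0 adds a 2175 A bump."""
    lam = np.asarray(lam_um, dtype=float)
    return (Av / Rv) * (k_calzetti(lam) + drude_bump(lam, Eb)) * (lam / lam_v)**delta
```

For example, `attenuation(0.15, Av=1.0, delta=-0.1, Eb=1.0)` gives the far-UV attenuation for a slightly steepened curve with a modest bump; the correlation discussed in the abstract corresponds to shallower delta going together with smaller Eb.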
The Astrophysical Journal
- Pub Date: December 2016
- methods: numerical;
- radiative transfer;
- Astrophysics - Astrophysics of Galaxies
- 28 pages, 30 figures, submitted to ApJS