Line-depth Ratios in H-band Spectra to Determine Effective Temperatures of G- and K-type Giants and Supergiants
The effective temperature is a fundamental parameter of stellar atmospheres. It characterizes the spectrum and plays an essential role in chemical abundance analysis. Previous optical studies have shown that line-depth ratios (LDRs) are good indicators of the effective temperature. In essence, the depth ratios of absorption lines with different excitation potentials can be used as temperature scales. The most important advantage of LDRs is their robustness against interstellar reddening and extinction. Furthermore, the scales are constructed and calibrated empirically with observables. Although this method is well established for optical spectra, infrared high-resolution spectroscopy is required to access the most obscured stars in the Galactic disk and thereby understand the Milky Way's structure and evolution in its innermost region. In this study, we derive new temperature scales from LDRs in infrared spectra. We explored LDRs in high-resolution H-band spectra (1.4-1.8 μm) of well-known stars in the solar neighborhood obtained with the Subaru Telescope's Infrared Camera and Spectrograph (IRCS). We found nine pairs of absorption lines whose LDRs allow us to determine the temperatures of G- and K-type giants and supergiants to an accuracy of ~60 K. Checking the dependence of our scales on stellar parameters, we found that our temperature scales may give slightly biased estimates for metal-poor stars with [Fe/H] < -0.3 dex.
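The LDR method described above can be sketched as a small calculation: measure the depth of a low-excitation and a high-excitation line, take their ratio, and convert the ratio to a temperature with an empirically calibrated relation. The line pair, depths, and calibration coefficients below are hypothetical placeholders for illustration, not values from this work.

```python
# Illustrative sketch of the LDR temperature method.
# All line identifications, depths, and calibration coefficients
# here are hypothetical, not taken from the paper.

def line_depth(flux_continuum, flux_line_center):
    """Absorption depth: d = 1 - F_line / F_continuum."""
    return 1.0 - flux_line_center / flux_continuum

def teff_from_ldr(depth_low_ep, depth_high_ep, a, b):
    """Assumed linear calibration Teff = a * r + b, where r is the
    depth ratio of a low- to a high-excitation-potential line.
    (Real calibrations are derived empirically from standard stars.)"""
    r = depth_low_ep / depth_high_ep
    return a * r + b

# Hypothetical pair: a low-EP Fe I line vs. a high-EP Si I line,
# with the continuum normalized to 1.0.
d_low = line_depth(1.00, 0.70)    # depth ~0.30
d_high = line_depth(1.00, 0.80)   # depth ~0.20
teff = teff_from_ldr(d_low, d_high, a=-1500.0, b=6800.0)  # made-up coefficients
print(round(teff))
```

Because low-excitation lines weaken faster than high-excitation lines as temperature rises, the ratio decreases with temperature, which is why the slope of such a calibration is typically negative; in practice each of the nine line pairs would carry its own empirically fitted coefficients, and the final temperature would combine the pairs.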