The hypothesis that the optical luminosity of Type I supernovae results from the radioactive decay of ⁵⁶Ni synthesized and ejected in the explosion is investigated by numerical simulation of the optical spectrum of a homologously expanding shell composed initially of pure ⁵⁶Ni. This model, which neglects material external to the ⁵⁶Ni core, is expected to represent the supernova reasonably well at late times, when the star is nearly transparent to optical photons. The simulation determines the temperature, ionization state, and non-LTE level populations that result from energy deposition by the radioactive decay products of ⁵⁶Ni and ⁵⁶Co; the computed optical spectrum includes both allowed and forbidden lines. The resulting spectra are found to be sensitive to the mass and ejection velocity of the ⁵⁶Ni shell, and a range of these parameters has been found that yields good agreement with the observed spectra of SN1972e over a considerable span of time. In particular, evidence for the expected decaying abundance of ⁵⁶Co is found in the spectra of SN1972e. These results are used to assess the validity of the ⁵⁶Ni model and to set limits on the mass and explosion mechanism of the Type I progenitor. Possibilities for improving the numerical model are discussed, and future atomic-data requirements are defined.
- Physics: Astronomy and Astrophysics;
- Radioactive Decay;
- Stellar Luminosity;
- Stellar Mass Ejection;
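The decay chain underlying the model above (⁵⁶Ni → ⁵⁶Co → ⁵⁶Fe) can be sketched with the standard Bateman solution for a two-step chain. This is a minimal illustration, not the thesis's simulation: the half-lives (≈6.1 d for ⁵⁶Ni, ≈77.2 d for ⁵⁶Co) are standard values, and the mean energies per decay are approximate placeholders, so the deposition rate is only indicative of the shape of the light curve's power source.

```python
import math

# Approximate half-lives in days (standard values, not from the source text)
T_NI, T_CO = 6.08, 77.2
LAM_NI = math.log(2) / T_NI   # 56Ni decay constant [1/day]
LAM_CO = math.log(2) / T_CO   # 56Co decay constant [1/day]

# Approximate mean energy released per decay [MeV] (illustrative values)
Q_NI, Q_CO = 1.72, 3.60

def abundances(t, n0=1.0):
    """Bateman solution: 56Ni and 56Co abundances at time t (days),
    starting from n0 nuclei of pure 56Ni."""
    n_ni = n0 * math.exp(-LAM_NI * t)
    n_co = n0 * LAM_NI / (LAM_CO - LAM_NI) * (
        math.exp(-LAM_NI * t) - math.exp(-LAM_CO * t)
    )
    return n_ni, n_co

def deposition_rate(t, n0=1.0):
    """Instantaneous radioactive energy release rate
    [MeV/day per initial 56Ni nucleus]."""
    n_ni, n_co = abundances(t, n0)
    return LAM_NI * n_ni * Q_NI + LAM_CO * n_co * Q_CO
```

At late times the ⁵⁶Ni term has died away and the release rate declines on the ⁵⁶Co half-life, which is the decaying-⁵⁶Co signature the abstract reports finding in the spectra of SN1972e.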