A spectrum standardization approach for laser-induced breakdown spectroscopy measurements
Abstract
This paper follows and completes a previous presentation of a spectrum normalization method for laser-induced breakdown spectroscopy (LIBS) measurements by converting the line intensity recorded under varying operational conditions to the intensity that would be obtained under a "standard state" condition, characterized by a standard plasma temperature, electron number density, and total number density of the species of interest. First, for each laser shot and corresponding spectrum, the line intensities of the species of interest are converted to the intensities at a fixed plasma temperature and electron number density, but with varying total number density. In this state, if the influence of changing plasma morphology is neglected, the sum of multiple spectral line intensities for the measured element is proportional to the total number density of that element. Therefore, the fluctuation of the total number density, i.e., the variation of ablated mass, can be compensated for by applying this proportional relationship. The application of this method to Cu in 29 brass alloy samples showed an improvement over the commonly applied normalization method with regard to measurement precision and accuracy. The average relative standard deviation (RSD), average error-bar value, R², root mean square error of prediction (RMSEP), and average maximum relative error were 5.29%, 0.68%, 0.98, 2.72%, and 16.97%, respectively, while the corresponding values for normalization with the whole spectrum area were 8.61%, 1.37%, 0.95, 3.28%, and 29.19%, respectively.
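The two-step procedure outlined in the abstract can be sketched in code. The following is a minimal illustration under simplifying assumptions, not the authors' actual implementation: only the Boltzmann population factor is rescaled (the Saha ionization shift with electron number density is neglected), and the line energies, partition-function values, and reference sum used below are hypothetical placeholders rather than tabulated Cu I data.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_factor(e_upper_ev, temp_k, partition):
    """Relative upper-level population from Boltzmann statistics."""
    return math.exp(-e_upper_ev / (K_B_EV * temp_k)) / partition

def to_standard_temperature(intensity, e_upper_ev, temp_k, temp_std_k,
                            partition, partition_std):
    """Step 1 (simplified): rescale a measured line intensity from the
    shot's plasma temperature temp_k to the standard temperature
    temp_std_k. Atomic lines only; the electron-density (Saha)
    correction of the full method is omitted in this sketch."""
    return intensity * (boltzmann_factor(e_upper_ev, temp_std_k, partition_std)
                        / boltzmann_factor(e_upper_ev, temp_k, partition))

def standardize_shot(lines, temp_k, temp_std_k, partition, partition_std, ref_sum):
    """Step 2: after the temperature conversion, the summed line
    intensity is proportional to the element's total number density,
    so scaling every line by ref_sum / shot_sum compensates the
    shot-to-shot ablated-mass fluctuation.

    lines: list of (intensity, upper_level_energy_eV) tuples.
    ref_sum: summed intensity defining the standard total number density."""
    converted = [to_standard_temperature(i, e, temp_k, temp_std_k,
                                         partition, partition_std)
                 for i, e in lines]
    shot_sum = sum(converted)
    return [i * ref_sum / shot_sum for i in converted]
```

After standardization, every shot's summed line intensity equals the reference sum, so the remaining line-to-line pattern reflects the standard-state plasma rather than shot-to-shot fluctuations.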
- Publication:
- Spectrochimica Acta - Part B: Atomic Spectroscopy
- Pub Date:
- February 2012
- DOI:
- 10.1016/j.sab.2012.01.005
- arXiv:
- arXiv:1106.0583
- Bibcode:
- 2012AcSpB..68...58W
- Keywords:
- Physics - Plasma Physics;
- Physics - Atomic Physics
- E-Print:
- LIBS