The effect of a bump in the turbulence spectrum on laser propagation
Abstract
This report addresses the effect of a bump in the turbulence spectrum on the propagation of a laser beam. The motivation for this work is as follows: conventional treatments of the effect of turbulence on laser propagation usually assume that the turbulence can be described by a spectral density representing a cascade of energy from long-wavelength modes down to shorter-wavelength modes. The most common assumption is the Kolmogorov model, in which, within the inertial range of the spectrum (between a minimum wavenumber k_L0 and a maximum wavenumber k_l0, corresponding to the maximum size L0 and minimum size l0 of the turbulent eddies), the spectral density decreases as k^(-(3d+2)/3) (in d dimensions). In some media, however, a source may be present that injects energy into the turbulent spectrum at a particular discrete wavelength (or over a range of wavelengths): this is the case, for example, in a turbulent plasma where an instability may be growing over a range of wavelengths. The presence of such an energy source thus produces a bump in the turbulence spectrum. As shown below, the spectrum can then be viewed as this bump superimposed upon the background natural cascade: that is, the spectrum can be analyzed as the sum of a cascade spectrum and a bump spectrum, and the effect of each on the propagation of the beam can be analyzed separately.
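The decomposition described above can be sketched numerically. The following is a minimal illustration, not taken from the report: the background cascade follows the k^(-(3d+3-1)/3) = k^(-11/3) Kolmogorov law for d = 3, and the injected-energy bump is modeled here, purely as an assumption, by a Gaussian centered on an arbitrary wavenumber. All parameter values are illustrative.

```python
import numpy as np

def kolmogorov_spectrum(k, C=1.0, d=3):
    """Background cascade: spectral density ~ k^(-(3d+2)/3) in the
    inertial range; for d = 3 this is the familiar k^(-11/3) law.
    The constant C is an arbitrary normalization."""
    return C * k ** (-(3 * d + 2) / 3)

def bump_spectrum(k, amplitude, k_bump, width):
    """Hypothetical Gaussian bump modeling energy injected over a
    narrow band of wavenumbers (e.g. by a growing plasma instability).
    The Gaussian shape is an illustrative choice, not the report's model."""
    return amplitude * np.exp(-((k - k_bump) / width) ** 2)

# Wavenumbers spanning an inertial range between k_L0 (outer scale L0)
# and k_l0 (inner scale l0); units are arbitrary here.
k = np.logspace(0.0, 3.0, 500)
cascade = kolmogorov_spectrum(k)
bump = bump_spectrum(k, amplitude=1e-4, k_bump=100.0, width=10.0)

# Total spectrum = cascade + bump: because the spectra add linearly,
# the effect of each component on beam-propagation statistics can be
# analyzed separately and then combined.
total = cascade + bump
```

The additivity of the two spectra is what allows the bump's contribution to propagation statistics to be treated independently of the cascade's.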
 Publication:

NASA STI/Recon Technical Report N
 Pub Date:
 August 1990
 Bibcode:
 1990STIN...9117364M
 Keywords:

 Laser Beams;
 Light Transmission;
 Plasma Spectra;
 Spectral Energy Distribution;
 Turbulence;
 Vortices;
 Dispersing;
 Frequency Shift;
 Lasers;
 Wavelengths;
 Lasers and Masers