Nonlinear dispersion relation for nonlinear Schrödinger equation
Abstract
Using the averaged-Lagrangian method (averaged variational principle), a nonlinear dispersion relation is derived for the cubic nonlinear Schrödinger equation. It is found that, in comparison with linear theory, the size of the instability region in wavenumber space decreases with increasing field amplitude.
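The abstract cites the averaged-Lagrangian (Whitham) derivation without giving the equations. As background only, the following LaTeX sketch shows how that method yields an amplitude-dependent dispersion relation for a uniform wavetrain of the cubic NLS. The equation form, sign conventions, and symbols $\psi$, $a$, $k$, $\omega$, $\gamma$ are assumptions here, not taken from the paper, whose full analysis of the modulation equations is what produces the narrowed instability band reported.

```latex
% Sketch of the averaged-Lagrangian step for the cubic NLS,
%   i \psi_t + \psi_{xx} + \gamma |\psi|^2 \psi = 0,
% under an assumed sign/normalization convention (not necessarily the paper's).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The cubic NLS follows from the Lagrangian density
\begin{equation}
  \mathcal{L} = \tfrac{i}{2}\left(\psi^{*}\psi_{t} - \psi\,\psi_{t}^{*}\right)
                - |\psi_{x}|^{2} + \tfrac{\gamma}{2}|\psi|^{4}.
\end{equation}
Substituting the uniform wavetrain $\psi = a\,e^{i(kx - \omega t)}$ with
constant real amplitude $a$ and averaging over the phase gives
\begin{equation}
  \bar{\mathcal{L}}(a, k, \omega) = \left(\omega - k^{2}\right) a^{2}
                                    + \tfrac{\gamma}{2} a^{4}.
\end{equation}
Whitham's variational principle $\partial \bar{\mathcal{L}} / \partial a = 0$
then yields the nonlinear dispersion relation
\begin{equation}
  \omega = k^{2} - \gamma a^{2},
\end{equation}
which reduces to the linear relation $\omega = k^{2}$ as $a \to 0$.
\end{document}
```

Note that this leading-order relation alone does not determine the instability region; that follows from a stability analysis of the slow modulation equations, which is the part of the calculation the abstract's result concerns.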
- Publication: Journal of Plasma Physics
- Pub Date: April 1988
- DOI: 10.1017/S0022377800013040
- Bibcode: 1988JPlPh..39..297B