There is recent interest in using light beams carrying orbital angular momentum (OAM) to create multiple channels within free-space optical communication systems. One crucial issue is that, for a given beam size at the transmitter, the beam divergence angle increases with increasing OAM. Therefore, the larger the OAM value, the larger the aperture required at the receiving optical system if the detection efficiency is to be maintained. Confusion exists as to whether this divergence scales linearly with the beam's OAM or with its square root. We clarify that both scaling laws are valid, depending upon whether it is the waist radius of the beam's Gaussian term or the rms intensity radius of the beam that is held constant while the OAM is varied.
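The two scaling laws can be illustrated with the standard second-moment results for a Laguerre-Gaussian mode with radial index p = 0: the beam quality factor is M² = |ℓ| + 1, the rms intensity radius is the Gaussian waist w₀ times √(|ℓ| + 1), and the far-field divergence half-angle is θ = M²λ/(πW₀), where W₀ is the rms waist radius. The sketch below (a hedged illustration, not code from the paper; the wavelength and the function names are assumptions introduced here) shows how holding w₀ fixed gives square-root scaling with |ℓ|, while holding the rms radius fixed gives linear scaling:

```python
import math

WAVELENGTH = 633e-9  # assumed wavelength (HeNe), for illustration only

def divergence_fixed_gaussian_waist(ell, w0):
    """Far-field divergence half-angle of an LG_{p=0,ell} beam with the
    Gaussian waist w0 held fixed:
    theta = sqrt(|ell| + 1) * lambda / (pi * w0)  -> square-root scaling."""
    return math.sqrt(abs(ell) + 1) * WAVELENGTH / (math.pi * w0)

def divergence_fixed_rms_radius(ell, r_rms):
    """Same beam, but with the rms intensity radius r_rms = w0*sqrt(|ell|+1)
    held fixed. Substituting w0 = r_rms / sqrt(|ell| + 1) gives
    theta = (|ell| + 1) * lambda / (pi * r_rms)  -> linear scaling."""
    return (abs(ell) + 1) * WAVELENGTH / (math.pi * r_rms)

# Going from ell = 0 to ell = 3 (|ell| + 1 : 1 -> 4) doubles the divergence
# at fixed Gaussian waist, but quadruples it at fixed rms radius.
print(divergence_fixed_gaussian_waist(3, 1e-3) /
      divergence_fixed_gaussian_waist(0, 1e-3))   # factor of sqrt(4) = 2
print(divergence_fixed_rms_radius(3, 1e-3) /
      divergence_fixed_rms_radius(0, 1e-3))       # factor of 4
```

The practical consequence mirrors the abstract: at a fixed transmitted rms beam size, the receiving aperture needed to maintain detection efficiency grows linearly with the OAM, whereas at a fixed Gaussian waist it grows only as the square root.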