Solution intervals are often used to improve the signal-to-noise ratio during radio interferometric gain calibration. This work investigates how factors such as the noise level, intrinsic gain variability, the degree of model incompleteness, and the presence of radio frequency interference affect the selection of solution intervals for calibration. We perform a series of interferometric simulations to demonstrate how these factors, in combination with the choice of solution interval, affect calibration and imaging outputs, and we discuss practical guidelines for choosing optimal solution intervals. Furthermore, we present an algorithm capable of automatically selecting suitable solution intervals during calibration. By applying the algorithm to both simulated and real data, we show that, provided the data are not too inhomogeneously flagged, it successfully chooses solution intervals that strike a good balance between capturing intrinsic gain variability and not fitting noise. Finally, we elaborate on several practical considerations that underscore the need to develop regularized calibration algorithms that do not require solution intervals.