Atom interferometers (AIs) are promising tools for precision measurement, with applications ranging from geophysical exploration and tests of the equivalence principle of general relativity to the detection of gravitational waves. Their optimal sensitivity is ultimately limited by their detection noise. We review resonant and near-resonant methods for detecting the atom number at the interferometer outputs, and we theoretically analyze the relative influence of the various scheme-dependent noise sources and the technical challenges affecting the detection. We show that, for the typical conditions under which an AI operates, simultaneous fluorescence detection with a charge-coupled device (CCD) sensor is the optimal imaging scheme. We derive the laser beam parameters, such as detuning, intensity, and duration, required to reach the atom shot-noise limit.
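As a rough illustration of the detection budget discussed above, the sketch below estimates the fluorescence photons collected per atom from the standard two-level scattering-rate formula and compares the resulting photon shot noise to the atom shot-noise limit. All numerical values (Rb-87 D2 line parameters, pulse duration, collection efficiency) are illustrative assumptions, not figures taken from this work.

```python
import math

# Assumed example values for the Rb-87 D2 line (not from this paper).
GAMMA = 2 * math.pi * 6.07e6   # natural linewidth [rad/s]
I_SAT = 16.7                   # saturation intensity [W/m^2]

def scattering_rate(intensity, detuning):
    """Photon scattering rate of a two-level atom [photons/s]."""
    s = intensity / I_SAT  # on-resonance saturation parameter
    return (GAMMA / 2.0) * s / (1.0 + s + (2.0 * detuning / GAMMA) ** 2)

def detected_photons(intensity, detuning, duration, efficiency):
    """Mean number of fluorescence photons detected per atom."""
    return scattering_rate(intensity, detuning) * duration * efficiency

# Hypothetical example: resonant beam at saturation intensity, 1 ms
# detection pulse, 1% overall collection-and-detection efficiency.
n_ph = detected_photons(I_SAT, 0.0, 1e-3, 0.01)

# Photon shot noise on N atoms gives an atom-number uncertainty of
# sqrt(N / n_ph), which falls below the atom shot noise sqrt(N)
# once n_ph > 1 (in practice n_ph >> 1, to leave margin for
# technical noise sources such as detector and laser noise).
print(f"detected photons per atom: {n_ph:.0f}")
```

With these assumed numbers the scheme collects on the order of a hundred photons per atom, comfortably satisfying the n_ph >> 1 condition; the trade-off analyzed in the paper is how detuning, intensity, and duration shift this number against the other noise sources.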