An attack is made on the problem of determining the asymptotic behavior at high energies and momenta of the Green's functions of quantum field theory, using new mathematical methods from the theory of real variables. We define a class $A_n$ of functions of $n$ real variables whose asymptotic behavior may be specified by means of certain "asymptotic coefficients." The Feynman integrands of perturbation theory (with energies taken imaginary) belong to such classes. We then prove that if certain conditions on the asymptotic coefficients are satisfied, an integral over $k$ of the variables converges and belongs to the class $A_{n-k}$, with new asymptotic coefficients simply related to the old ones. When applied to perturbation theory, this theorem validates the renormalization procedure of Dyson and Salam, proving that the renormalized integrals always converge, and it provides a simple rule for calculating the asymptotic behavior of any Green's function to any order of perturbation theory.
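The structure of the result can be sketched schematically as follows. This is an illustrative reconstruction, not quoted from the abstract: the symbols $\alpha(S)$, $\beta(S)$, and $D(S')$ are assumed names for the asymptotic coefficients and the power-counting quantity.

```latex
% A function f of n real variables belongs to the class A_n if, along every
% direction L within a subspace S (with C a fixed vector), it grows with a
% definite power of the scale parameter, up to powers of logarithms:
f(\eta L + C) = O\!\bigl(\eta^{\alpha(S)}\,(\ln\eta)^{\beta(S)}\bigr),
\qquad \eta \to \infty,
% where alpha(S) is the "asymptotic coefficient" attached to the subspace S.

% Convergence criterion (power counting): an integral over a k-dimensional
% set of the variables converges provided that, for every subspace S' of
% the integration region,
D(S') \equiv \alpha(S') + \dim S' < 0,
% in which case the integral belongs to A_{n-k}, with new asymptotic
% coefficients obtained from the old ones by a simple rule.
```

On this reading, the renormalization result follows because subtractions lower the relevant coefficients $\alpha(S')$ until the criterion $D(S') < 0$ holds for every subspace of integration.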