The hot (10⁷ to 10⁸ kelvin), X-ray-emitting intracluster medium (ICM) is the dominant baryonic constituent of clusters of galaxies. In the cores of many clusters, radiative energy losses from the ICM occur on timescales much shorter than the age of the system. Unchecked, this cooling would lead to massive accumulations of cold gas and vigorous star formation, in contradiction to observations. Various sources of energy capable of compensating for these cooling losses have been proposed, the most promising being heating by the supermassive black holes in the central galaxies, through inflation of bubbles of relativistic plasma. Regardless of the original source of energy, the question of how this energy is transferred to the ICM remains open. Here we present a plausible solution to this question based on deep X-ray data and a new data analysis method that enable us to evaluate directly the ICM heating rate from the dissipation of turbulence. We find that turbulent heating is sufficient to offset radiative cooling and indeed appears to balance it locally at each radius; it may therefore be the key element in resolving the gas cooling problem in cluster cores and, more universally, in the atmospheres of X-ray-emitting, gas-rich systems on scales from galaxy clusters to groups and elliptical galaxies.
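The heating-versus-cooling comparison at the heart of this claim can be sketched numerically. The snippet below is an illustrative order-of-magnitude estimate only, not the paper's analysis: it uses the standard Kolmogorov-scaling turbulent dissipation rate, Q_turb ~ C_Q ρ v³/ℓ, and a two-body radiative cooling rate, n_e n_H Λ. All numerical values (electron density, velocity amplitude, driving scale, cooling-function value, and the dimensionless constant C_Q) are assumed typical cluster-core figures chosen for illustration.

```python
# Order-of-magnitude comparison of turbulent heating and radiative cooling
# in intracluster-medium conditions. All parameter values are illustrative
# assumptions, not results from the paper.

M_P = 1.67e-24   # proton mass [g]
MU = 0.61        # mean molecular weight of a fully ionized plasma (assumed)
KPC = 3.086e21   # kiloparsec in cm

def turbulent_heating_rate(n_e, v, ell, c_q=5.0):
    """Volumetric turbulent heating rate [erg s^-1 cm^-3] from Kolmogorov
    scaling Q ~ c_q * rho * v^3 / ell, where v is the velocity amplitude
    [cm/s] at driving scale ell [cm]; c_q is an assumed dimensionless
    constant of order a few."""
    n_tot = 1.92 * n_e            # total particle density for an ionized ICM
    rho = MU * M_P * n_tot        # gas mass density [g cm^-3]
    return c_q * rho * v**3 / ell

def radiative_cooling_rate(n_e, lam=1e-23):
    """Volumetric radiative cooling rate [erg s^-1 cm^-3], n_e * n_H * Lambda,
    with an assumed cooling-function value lam for ~10^7 K gas."""
    n_h = 0.83 * n_e              # hydrogen density for an ionized ICM
    return n_e * n_h * lam

# Assumed cluster-core values: n_e ~ 0.05 cm^-3, v ~ 150 km/s, ell ~ 10 kpc.
q_heat = turbulent_heating_rate(n_e=0.05, v=1.5e7, ell=10 * KPC)
q_cool = radiative_cooling_rate(n_e=0.05)
print(f"heating: {q_heat:.2e} erg/s/cm^3")
print(f"cooling: {q_cool:.2e} erg/s/cm^3")
```

With these assumed inputs the two rates come out within a factor of a few of each other, which is the sense in which modest subsonic turbulence can plausibly offset radiative losses.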