This paper studies neural networks through the lens of neuronal correlation, a statistical measure of correlated neuronal activity in the penultimate layer. We show that neuronal correlation can be efficiently estimated from the weight matrix, can be effectively enforced through layer structure, and is a strong indicator of a network's generalisation ability. More importantly, we show that neuronal correlation significantly affects the accuracy of entropy estimation in high-dimensional hidden spaces. Whereas previous estimation methods may be significantly inaccurate because of an implicit assumption of neuronal independence, we present a novel computational method that computes entropy efficiently and faithfully by taking neuronal correlation into account. In doing so, we establish neuronal correlation as a central concept in the study of neural networks.
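The effect of the independence assumption on entropy can be illustrated with a toy Gaussian sketch (our own illustration, not the paper's method): for a Gaussian approximation of penultimate-layer activations with covariance Sigma, treating neurons as independent replaces Sigma by its diagonal, which overestimates the differential entropy whenever neurons are correlated (by Hadamard's inequality).

```python
import numpy as np

# Hypothetical illustration, NOT the paper's estimator: differential
# entropy of a d-dimensional Gaussian is
#   H = 0.5 * (d * log(2*pi*e) + log det(Sigma)).
rng = np.random.default_rng(0)
d = 8                                    # number of penultimate neurons
A = rng.standard_normal((d, d))
sigma = A @ A.T + d * np.eye(d)          # a correlated, positive-definite covariance

def gaussian_entropy(cov):
    """Differential entropy of N(0, cov) in nats."""
    dim = cov.shape[0]
    return 0.5 * (dim * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

h_full = gaussian_entropy(sigma)                      # accounts for correlation
h_indep = gaussian_entropy(np.diag(np.diag(sigma)))   # independence assumption
# Hadamard's inequality: det(Sigma) <= prod of its diagonal entries,
# so the independence assumption can only overestimate the entropy.
assert h_indep >= h_full
```

The gap `h_indep - h_full` grows with the strength of the correlations, which is why ignoring neuronal correlation can make entropy estimates substantially inaccurate in high-dimensional hidden spaces.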