Roughly speaking, information theory deals with data transmitted over a channel such as the internet. Modern information theory is generally considered to have been founded in 1948 by Shannon in his seminal paper, "A mathematical theory of communication," and his formulation was an immediate success with communications engineers. Shannon defined mathematically the amount of information transmitted over a channel. This amount is not the number of symbols in the data; rather, it depends on the occurrence probabilities of those symbols. Psychophysics, meanwhile, is the study of quantitative relations between psychological and physical events or, more specifically, between sensations and the stimuli that produce them. At first sight, Shannon's information theory bears no relation to the psychophysics established by the German scientist and philosopher Fechner. Here I show that, surprisingly, the two fields can be combined, so that perceptions of physical stimuli governed by the Weber-Fechner law can be measured mathematically. To this end I define a new concept of entropy, and as a consequence a new field comes into being.
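As a minimal sketch of the two ingredients combined here, the following Python fragment computes Shannon's entropy of a discrete source (which depends only on the symbols' occurrence probabilities, not on their count) and evaluates the Weber-Fechner law, under which sensation grows as the logarithm of stimulus intensity. The function names and the choice of constants `k` and `i0` are illustrative, not from the original text.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def weber_fechner(intensity, i0=1.0, k=1.0):
    """Weber-Fechner law: perceived sensation S = k * ln(I / I0),
    where I0 is the threshold stimulus and k a modality constant."""
    return k * math.log(intensity / i0)

# A uniform 4-symbol source carries the maximum 2 bits per symbol;
# a skewed source with the same 4 symbols carries strictly less.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # < 2.0

# Sensation grows logarithmically: doubling the stimulus adds
# a constant increment k * ln(2), regardless of the starting level.
print(weber_fechner(2.0) - weber_fechner(1.0))
print(weber_fechner(8.0) - weber_fechner(4.0))
```

Note that entropy and the Weber-Fechner law share the same logarithmic form, which is the structural overlap the rest of the paper builds on.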