This paper discusses the bias errors introduced when frequency response and coherence functions are estimated for systems in which a time delay is present. It is shown that a time delay between the input and output of a linear system causes the cross-spectral density to fluctuate as a function of frequency, with a period of fluctuation that depends on the magnitude of the time delay. If the analysis bandwidth is not sufficiently small, or if frequency averaging is used to reduce the variance, the cross-spectral density estimate acquires a negative bias, and this error is transferred to the estimates of the frequency response and coherence functions. Theory is developed showing the dependence of the bias errors on the time delay, the analysis bandwidth, and the length of the sample record of the input/output processes. Two experiments were designed to check the theory. In the first, a loudspeaker driven by white noise and a microphone were used, with the time delay controlled through the known propagation time of acoustic waves between the loudspeaker and the microphone. In the second, a tape recorder with a fixed spacing between its record and playback heads was used to introduce the time delay. In both experiments, agreement with theory was good.
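As a rough numerical illustration of the bias mechanism described above (not part of the paper's experiments), the sketch below simulates a pure delay of tau samples between two white-noise records and estimates their coherence with Welch averaging over segments of length T. The true coherence of a noise-free delay system is 1 at every frequency, yet the estimate falls toward roughly (1 - tau/T)^2 as the segment length shrinks relative to the delay, a standard approximation for this class of bias. All parameter values here are hypothetical choices for demonstration.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
n = 1 << 18        # total record length in samples (hypothetical)
delay = 64         # time delay in samples (hypothetical)

x = rng.standard_normal(n)      # white-noise input
y = np.zeros(n)
y[delay:] = x[:n - delay]       # output of an ideal delay-only system

# Estimate coherence for several segment lengths T.  True coherence is 1;
# segment averaging with a delay of tau samples biases the estimate toward
# approximately (1 - tau/T)^2.
results = {}
for nperseg in (128, 256, 1024):
    f, coh = signal.coherence(x, y, window='boxcar',
                              nperseg=nperseg, noverlap=0)
    results[nperseg] = coh[1:-1].mean()   # average over interior bins
    print(f"T={nperseg:5d}  mean coherence {results[nperseg]:.3f}  "
          f"predicted {(1 - delay / nperseg) ** 2:.3f}")
```

Lengthening the segments (or, equivalently, narrowing the analysis bandwidth) drives the estimate back toward the true coherence, which is the remedy the paper's theory quantifies.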