The relationship between interaural signal‐frequency disparity (Δf) and signal duration (t) was investigated using a two‐alternative temporal forced‐choice procedure. The signals consisted of a tone (400 Hz) in one ear and a tone of a different frequency (400 + Δf Hz) in the other ear, where Δf = 4, 8, or 16 Hz. As a reference condition, a diotic signal (Δf = 0 Hz) was also employed. Signal duration was 256, 512, 1024, or 2048 msec. All signals were presented in a background of wide‐band, diotic Gaussian noise. In the diotic condition, performance improved as t increased. In the dichotic conditions, for a constant signal duration, performance deteriorated (by an average of 1 dB) as Δf increased; for a constant Δf, performance improved (by an average of 2.5 dB) as t increased. The results are discussed in terms of the number of alternating diotic (S0) and phase‐reversed (Sπ) “looks” available in a signal interval.
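The sketch below illustrates one way to read the "looks" account; it assumes (as an illustration, not a claim from the paper) that the looks recur at the binaural beat rate Δf, so that each beat cycle contains one approximately diotic (S0) and one approximately phase‐reversed (Sπ) epoch, giving roughly Δf·t look cycles per signal interval. Under this reading, both longer durations and larger Δf increase the number of looks available. The condition values are those listed above; the counting rule itself is an assumption.

# Approximate number of S0/Sπ "look" cycles per signal interval,
# assuming the looks recur at the binaural beat rate Δf
# (an illustrative assumption, not a value taken from the paper).
delta_f_hz = [4, 8, 16]              # interaural frequency disparities (Hz)
durations_ms = [256, 512, 1024, 2048]  # signal durations (msec)

for df in delta_f_hz:
    for t_ms in durations_ms:
        cycles = df * (t_ms / 1000.0)  # beat cycles = Δf × t
        print(f"Δf = {df:2d} Hz, t = {t_ms:4d} ms -> ~{cycles:5.1f} look cycles")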
