Listeners are sensitive to correlations among the multiple probabilistic acoustic cues that define speech categories. In English stop consonant productions, for example, fundamental frequency (f0) is strongly correlated with voicing. Reflecting this regularity, perception of syllables varying in voice onset time shifts with changes in f0. Such sensitivity to the long-term regularities of a language must be balanced with enough flexibility that speech perception can accommodate deviations from these regularities, such as those arising from idiolect, dialect, or accented speech. The present experiments manipulate the short-term acoustic cue correlations experienced in online perception to investigate the interplay between sensitivity to long-term and short-term acoustic cue correlations. Using overt categorization and eyetracking, the present results indicate that speech categorization is influenced by local shifts in acoustic cue correlations that deviate from the long-term regularities of English. These experiments further examine the time course of this short-term learning and the degree to which it generalizes. The data suggest that listeners continually monitor speech input for regularity and tune online speech perception in relation to these regularities.