The goal of this study was to analyze the time course of the sensory (bottom-up) and cognitive (top-down) processes that govern musical harmonic expectancy. Eight-chord sequences were presented to 12 musicians and 12 nonmusicians. Expectations for the last chord were manipulated both at the sensory level (i.e., the last chord was either consonant or dissonant) and at the cognitive level (the harmonic function of the target was varied by manipulating the harmonic context built up by the first six chords of the sequence). Changes in the harmonic function of the target chord mainly modulated the amplitude of a positive component peaking around 300 msec after target onset (P3), reflecting top-down influences on the perceptual stages of processing. In contrast, changes in the acoustic structure of the target chord (sensory consonance) mainly modulated the amplitude of a late positive component that developed between 300 and 800 msec after target onset. Most importantly, the effects of sensory consonance and harmonic context on the event-related brain potentials associated with the target chords were found to be independent, suggesting that two separate processors contribute to the buildup of musical expectancy.