ta.fo Journal

Music Is Not Geometry

Having played the piano and composed for nearly twenty years, I love a finished piece. Yet, I equally love the instrument itself as a precise machine. Composition is born in the gap between the two. If someone asked me how music actually operates, I would start my explanation right here.

While music is often packaged as the language of emotion, structurally speaking, it is a strict logic of intervals. It is the physics of relationships created by the ratios between frequencies.
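The claim that intervals are ratios between frequencies can be made concrete with a minimal sketch. It assumes 12-tone equal temperament and the A4 = 440 Hz reference, both standard conventions rather than anything stated in the essay:

```python
# A minimal sketch of intervals as frequency ratios,
# assuming 12-tone equal temperament with A4 = 440 Hz.

A4 = 440.0  # reference pitch in Hz

def frequency(semitones_from_a4: int) -> float:
    """Frequency of the note n semitones above (or below) A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# An interval is a ratio, not a distance in hertz.
octave = frequency(12) / frequency(0)  # exactly 2.0
fifth = frequency(7) / frequency(0)    # ~1.4983, close to the pure 3:2

print(f"octave ratio: {octave:.4f}")
print(f"fifth ratio:  {fifth:.4f} (pure fifth is {3/2:.4f})")
```

The point of the ratio view: an octave is a doubling wherever it sits on the keyboard, which is exactly the kind of relationship the ear, not the eye, is built to track.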

People treat pitch perception as a binary talent where you either identify an isolated note or you fail. In reality, most musical structures run on relative pitch rather than perfect pitch. It is about hearing the distance between notes instead of their absolute names. This is not an innate superpower, but a sense you can train.
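Relative pitch as "hearing the distance between notes" has a simple formal counterpart: a melody is its interval sequence, which survives transposition. A small sketch, using MIDI note numbers and a hypothetical C major fragment as the example data:

```python
# Relative pitch as interval sequences: transpose a melody and the
# absolute pitches change, but the distances between notes do not.

melody = [60, 62, 64, 65, 67]  # MIDI note numbers: a C major fragment

def intervals(notes):
    """Semitone distances between consecutive notes."""
    return [b - a for a, b in zip(notes, notes[1:])]

transposed = [n + 5 for n in melody]  # same melody, a fourth higher

print(intervals(melody))      # [2, 2, 1, 2]
print(intervals(transposed))  # identical: the melody IS the interval pattern
```

Perfect pitch names the absolute values; relative pitch hears the differences, and only the differences define the tune.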

The problem is that the tools we use make it far too easy to bypass this training. Standard sheet music is essentially a vertical graph of pitch that maps frequencies onto a line. It acts as a highly efficient coordinate system for the piano. Since a black dot on the line simply becomes a finger position on the keyboard, this efficiency allows the player to perform while completely muting their ears, relying solely on visual decoding and muscle memory.

It is exactly like typing a sentence in a foreign language without knowing the meaning.

The guitar operates on a different logic as an instrument of shapes. A chord structure remains constant while merely shifting its physical position. Although this structure naturally favors relative pitch training, tablature forces this process back into a visual grid. By providing string numbers and fret positions, it ultimately feeds you spatial coordinates instead of sound.

In both cases, visual data acts as a shortcut that allows the brain to be lazy. Because active listening requires a cost and the eye processes faster than the ear, the brain sets the visual path as the default. The circuit shorts out as information travels directly from the eye to the hand, completely alienating the ear.

Of course, you might object that you are still listening while you play. That is a fair point, but listening to verify an error is fundamentally different from listening to guide the performance. The former is feedback; the latter is prediction. A mechanical keystroke only becomes a musical performance when this shift occurs.

The visual path treats music as a set of spatial instructions pointing your hand to a specific coordinate. In contrast, the auditory path deals with gravity. It feels the weight of tension and release.

On sheet music, the interval between a dominant seventh and a tonic is just a vertical gap of a few millimeters. It appears as static geometry to the eye. To the ear, however, that interval exerts a powerful gravitational pull. The dominant desperately wants to collapse into the tonic for resolution. When the eye leads, we play coordinates. But when the ear leads, we play gravity. The physical finger movement might look identical on the surface, but the underlying intent is absolutely different.

This defines the difference between reaction and prediction. Visual playing is entirely reactive. The hand immediately obeys when a signal appears. Auditory playing is predictive. Milliseconds before your finger presses the key, your brain hears the sound internally first. It has already simulated the texture and pitch.

That microscopic gap in time is crucial, because the place where music actually happens is not on the score. A score stores instructions, not tension. The logic of music does not reside within symbols; it lives in that brief moment of anticipation.

It exists right before the string vibrates.

#Critique #Music #Philosophy