What’s music intelligence? Is it just AI for music?
The name “Music Intelligence Lab” might suggest that we’re all about artificial intelligence churning out new tunes. Instead, a better way to understand intelligence in this context is the ability to adapt to one’s surroundings through abstraction and problem solving. And in music, context matters. While we do harness AI and modern algorithms, our mission is to understand music through creative computational tools, and to uncover the intelligence woven into its fabric.
Our purpose is to put both the sociocultural and artistic contexts back into the music. We use machine learning tools to understand the structures that make music beautiful by analyzing it directly from the source. We’re fascinated by the patterns, structures, and theories that give music its power, especially within the rich and intricate world of Arabic music, which has its own unique modalities.
Furthermore, music isn’t merely a mental exercise; it’s a profoundly physical act. Any musician will tell you that the best performances happen when they stop thinking, enter a state of flow, and become one with the music. It’s a state where intuition (rather than analytical intelligence) takes the lead. This embodied form of intelligence is essential for expanding our definition of intelligence, even in fields of AI unrelated to music, such as embodied artificial intelligence. We believe that truly understanding music means appreciating how the analytical and the physical come together in an optimal performance. This philosophy guides our approach to incorporating AI into intelligent instrument design.
When it comes to applying machine learning algorithms to music, we’re interested in analyzing existing manuscripts and recordings to reveal deeper insights that could guide further developments and creativity in music creation. We leverage AI as a lens to examine the intelligence that music already possesses. This leads us to questions like: how can machine learning uncover patterns in musical compositions that might elude human analysis? How can we design instruments and interfaces that tap into our physical intelligence, making the creation of music more intuitive and expressive?
As a sister lab to the Data-Driven Modeling Lab (DDML), we approach music as a complex system: a web of interactions as intricate as any found in physics or society. Music unifies, harmonizes, and elevates. It brings together simple sounds to create something profoundly complex, much like how societies organize or scientific theories emerge from observations. In a sociocultural environment where challenges are ever-present, music offers solace and cohesion. Recognizing music as a form of intelligence underscores its power to heal, unite, and inspire.
All these dimensions — understanding music through AI, embracing the embodied intelligence of performance, redefining what intelligence means, and viewing music as a complex system — come together to define the essence and mission of the Music Intelligence Lab.