Upcoming Events

 

Wednesday, April 4, 2018, 10:30-11:30 am, Room X704, Institute for Computing, Information and Cognitive Systems, University of British Columbia

Title: Extracting Rhythmic Information from Audio Recordings

Presenter: George Tzanetakis (University of Victoria)

Abstract: Rhythm refers to the hierarchical organization of musical sounds in time. There is a long history of research on using computers to analyze rhythm. In this workshop, I will describe basic rhythmic analysis tasks and trace the historical evolution of the algorithms that have been proposed to solve them. These include tempo estimation, beat tracking, downbeat detection, drum transcription, rhythm descriptors, and pattern detection. In addition to being useful in music recommendation systems and music creation software, computational rhythmic analysis can be used as a tool to study the complexities of human rhythmic performance. I will describe examples of analyzing and visualizing micro-timing information and talk about how knowledge of particular music cultures can help inform the design of algorithms for rhythm analysis in the context of computational ethnomusicology. Finally, I will briefly touch upon the importance of embodied cognition for playing music and describe ongoing efforts to create expressive robotic drummers and why effective automatic rhythmic analysis is an essential component of such an endeavor.

Workshop materials: I will be using Python notebooks to go over some of the material. For participants who wish to be more hands-on, I suggest going through the following tutorial on Python/NumPy and Matplotlib:
http://cs231n.github.io/python-numpy-tutorial/

The tutorial also links to a Python notebook version of the material, as well as to resources for people who are familiar with MATLAB and want to learn NumPy.


I will also be using the LibROSA Python library for audio analysis: 

https://github.com/librosa/librosa

Participants who wish to have a more hands-on experience should come to the workshop with NumPy/SciPy/Matplotlib/LibROSA pre-installed on their laptops.
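
As a quick sanity check of such an installation, a minimal sketch along the following lines can run tempo estimation and beat tracking (two of the tasks mentioned in the abstract) on the example audio clip bundled with LibROSA. This is an illustrative addition rather than part of the official workshop materials, and it assumes a LibROSA release from around version 0.6, where librosa.util.example_audio_file() is available.

    # Minimal setup check (illustrative sketch, not workshop material):
    # estimate tempo and beat positions on LibROSA's bundled example clip.
    import numpy as np
    import matplotlib.pyplot as plt
    import librosa

    # Load the example audio clip shipped with LibROSA.
    y, sr = librosa.load(librosa.util.example_audio_file())

    # Tempo estimation and beat tracking.
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)
    print("Estimated tempo: {:.1f} BPM".format(tempo))

    # Plot the onset-strength envelope with the detected beats overlaid.
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)
    times = librosa.frames_to_time(np.arange(len(onset_env)), sr=sr)
    plt.plot(times, onset_env, label="onset strength")
    plt.vlines(beat_times, 0, onset_env.max(), color="r", alpha=0.5, label="detected beats")
    plt.xlabel("Time (s)")
    plt.legend()
    plt.show()

If this script prints a plausible tempo and shows a plot, the required libraries are installed correctly.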


About the Presenter: George Tzanetakis is a Professor in the Department of Computer Science with cross-listed appointments in Electrical and Computer Engineering (ECE) and Music at the University of Victoria, Canada. He is a Canada Research Chair (Tier II) in Computer Analysis of Audio and Music and received the Craigdarroch Research Award for Excellence in Artistic Expression at the University of Victoria in 2012. In 2011, he was Visiting Faculty at Google Research. He received his Ph.D. in Computer Science at Princeton University in 2002 and was a Post-Doctoral Fellow at Carnegie Mellon University in 2002-2003. His research spans all stages of audio content analysis, such as feature extraction, segmentation, and classification, with a specific emphasis on music information retrieval. His pioneering work on musical genre classification received an IEEE Signal Processing Society Young Author Award and is frequently cited. More recently, he has been exploring new interfaces for musical expression, music robotics, computational ethnomusicology, and computer-assisted music instrument tutoring. These interdisciplinary activities combine ideas from signal processing, perception, machine learning, sensors, actuators, and human-computer interaction, with the connecting theme of making computers better understand music in order to create more effective interactions with musicians and listeners.

 

 

Past Events

SYMPOSIUM: "ENTRAINMENT AND THE HUMAN-TECHNOLOGY INTERFACE"

September 14-15, 2017

This symposium explores strategies for externalizing and regulating an entraining agent: first the invention of clocks, later the metronome, and now click tracks in recordings and live concerts. The symposium supports a research project that builds on recent cognitive studies, exploring the historical and technological motivations for "playing in time" and assessing their impact on our collective engagement with music as a temporal art. Initially the project will focus on musicians and studio producers, and will then turn to the experiences of average listeners and their awareness and appreciation of the human-technology interface.

Event details

 

SYMPOSIUM: "MODELING RHYTHMIC COMPLEXITY"

January 23-26, 2018

This symposium will explore intersecting tools and methodologies from the fields of music information retrieval, computational analysis, and experimental psychology for application to the study of complex rhythmic structures. The symposium theme relates closely to Dr. Poudrier's ongoing exploration of how perceptual processes and cognitive limits interact with musical creativity and expression through the use of polyrhythm and polymeter (the superposition of competing rhythms and meters). A computational approach supported by signal processing technology and complemented by behavioural studies would not only allow for the inclusion of music from oral traditions in this study, but also for a cross-cultural exploration of the psychological and social mechanisms at play in the creation of meaning through specific compositional techniques.

Event details

 

SYMPOSIUM: "MICROTIMING AND MUSICAL MOTION"

March 6-9, 2018

Musical microtiming refers to small-scale rhythmic irregularities and asynchronies, mostly on the order of 50–250 milliseconds. Such phenomena can have a profound effect on how music feels to the listener, including the qualities of motion that it evokes. This symposium explores some of these kinetic aspects of microtiming—including groove and flow in popular music—as well as some new methods and compositional applications of microtiming analysis.

Event details