BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//ICMC HAMBURG 2026 - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:http://icmc2026.ligeti-zentrum.de
X-WR-CALDESC:Events for ICMC HAMBURG 2026
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Berlin
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20270328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20271031T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20260515T200000
DTEND;TZID=Europe/Berlin:20260515T220000
DTSTAMP:20260428T195626Z
CREATED:20260421T171512Z
LAST-MODIFIED:20260427T090910Z
UID:10000177-1778875200-1778882400@icmc2026.ligeti-zentrum.de
SUMMARY:Evening Concert 5B (Lübeck)
DESCRIPTION:Program Overview\nImprovising Machine #7325: Inside My Trumpet\, Again\nJeff Kaiser \nThe Letter\nMinho Kang \nMoloch whose mind is pure machinery!\nEric Lyon \nTidal Unit for Sonic Activities\nIlia Viazov and Nicola Leonard Hein \nRhythmic Traces | Twisted Electronics\nNicola Leonard Hein \nFound Violin x Aromantic Hobby \nDong Zhou \nTokens & Strings: an improvisation between an electric guitarist and a local LLM\nOlivier Jambois \n  \nAbout the pieces & artists\nJeff Kaiser: Improvising Machine #7325: Inside My Trumpet\, Again\n“Improvising Machine #7325: Inside My Trumpet\, Again” places the audience inside a trumpet\, exploring the instrument’s interior sonic world through an immersive human–machine improvisation system. The work is built from an extensive\, purpose-built sample library captured by placing microphones deep within the instrument. These samples document the mechanical sounds and embodied actions of trumpet performance without the instrument being played traditionally—collections of the sound of valves descending\, springs releasing\, air being compressed and released by slides\, valve caps loosening\, spit-valve gurgles\, and a range of non-tonal lip\, air\, and tongue sounds produced through the mouthpiece and leadpipe. \nTwenty-eight autonomous virtual agents (“robots”)\, authored by the composer in Max/MSP and hosted in Ableton Live\, inhabit a 360-degree ambisonic field surrounding the audience. Each agent draws from its own subset of the sample library and listens to the live trumpet performance in real time. Their behaviors fluctuate between responsive and indifferent\, generating shifting environments that range from highly chaotic to unexpectedly calm. As a result\, the improvising performer becomes entangled with a machine ensemble that both reflects and subverts the human gestures\, creating a continuously changing dialogue between human and technological agents. 
\nAbout the artist\nJeff Kaiser is a trumpet player\, media technologist\, and scholar. Classically trained as a trumpet player and composer\, Kaiser now takes an integrative\, systemic view that involves his traditional instrument\, emergent technology (in the form of custom interactive/generative software and hardware interfaces)\, space\, and audience: all being critical and integral participants in his performances. He gains inspiration and ideas from the rich history of experimental improvisation and composition\, as well as cognitive science\, and the vast timbral and formal affordances provided by combining traditional instruments with new and repurposed technologies. The roots of his music are firmly in the experimental traditions within jazz\, improvisation\, and Western art music practices. Kaiser is currently Associate Professor of Music Technology and Composition at the University of Central Missouri. \nMore information at https://jeffkaiser.com/ \n  \nMinho Kang: The Letter\nThe Letter is a work of consolation created using an FFT Channel Vocoder with Additive Synthesizer. \nHistorically\, the vocoder was developed during wartime to enable communication among allies. It reduces wideband speech to a narrower band for transmission and then reconstructs it at the receiver. In short\, a vocoder sends important words over distance and makes their faint traces audible again.\nAs a composer\, creating music is much the same. I keep listening to people and the world\, their voices. Then\, I compress\, interpret\, and reassemble those words in my own terms and offer them back as a piece.\nUnlike the vocoder’s original purpose\, in a time when war is no longer shocking news\, I wanted to use this technology to carry comfort. The lyrics come from a poem I wrote during my military service to endure a hard period (not in combat). This piece does not present a political agenda; it is a letter to anyone facing painful circumstances\, on any side\, in any degree. 
\nTechnically\, I aimed to design a vocoder with greater precision than a conventional channel vocoder. Instead of using bandpass filters\, I applied Fast Fourier Transform (FFT) analysis to collect more detailed and accurate amplitude information\, which allowed clearer rendering of vowel formants. This approach led to the creation of a Max for Live (M4L) FFT Channel Vocoder patch.\nI also developed an Additive Synthesizer M4L patch capable of producing a wide spectrum of sounds\, from pure sine waves to noise. When combined with the vocoder\, this synthesizer allows the clarity and harmonicity of speech to change according to the lyrics. Since the text relates to the transformation of light\, I used this Additive Synthesizer to achieve a tone painting that reflects those luminous changes. \nAbout the artist\nMinho Kang is a Korea-born composer and computer musician. His artistic interests\, which began in popular music and moved into contemporary music\, have expanded into electronic music at the intersection of technology and art. Drawing on introspective reflection and close observation of the world\, he brings diverse imaginings into his works.\nHis music has been presented at conferences and festivals including SEAMUS\, ICMC\, and the TurnUp Multimedia Festival. He completed his bachelor’s degree at Indiana University\, where he studied composition with Jeremy Podgursky\, Aaron Travers\, P. Q. Phan\, David Dzubay\, and Don Freund\, and electronic music with John Gibson and Chi Wang at the Center for Electronic and Computer Music. \n  \nEric Lyon: Moloch whose mind is pure machinery!\nAllen Ginsberg’s poem Howl was published in 1956\, the same year as the Dartmouth Summer Research Project on Artificial Intelligence. The two events portend seemingly incompatible futures that nonetheless are both with us now: a bursting forth of cultural chaos in an “armed madhouse” and the technocratic reduction of intelligence to code. 
Ginsberg’s ritualistic\, repetitive rant about Moloch inspired this performance\, a tone poem that derives its sounds from two main sources – AI-generated music and the OB-Xd virtual analog synthesizer VST plugin manipulated using the Slewable Utility for Random Parameters (SLURP) designed by the composer. The performance interface consists of a Korg nanoKONTROL2 unit and the Google MediaPipe face landmarker. \nAbout the artist\nEric Lyon is a composer and audio researcher focused on high-density loudspeaker arrays\, dynamic timbres\, virtual drum machines\, and performer-computer interactions. His audio signal processing software includes “FFTease” and “LyonPotpourri.” He has authored two computer music books\, “Designing Audio Objects for Max/MSP and Pd\,” a guidebook for writing audio DSP code for live performance\, and “Automated Sound Design\,” a book that presents technical processes for implementing oracular synthesis and processing of sound across a wide domain of audio applications. He has written extensively about the possibilities of multichannel spatial audio. In 2016-17\, Lyon was guest editor for Computer Music Journal Volumes 40(4) and 41(1)\, covering various aspects of High-Density Loudspeaker Arrays (HDLAs). \nIn 2015-16\, Lyon architected both the Spatial Music Workshop and Cube Fest at Virginia Tech to support the work of other artists working with HDLAs. In 2025 he co-created the Spatial Audio Tidepool to provide technical instruction for creative uses of high-density loudspeaker arrays. Lyon’s compositional work has been recognized with a ZKM Giga-Hertz prize\, a MUSLAB award\, the League ISCM World Music Days competition\, and a Guggenheim Fellowship. Lyon teaches in the School of Performing Arts at Virginia Tech\, and is a Faculty Fellow at the Institute for Creativity\, Arts\, and Technology. 
\n  \nIlia Viazov and Nicola Leonard Hein: Tidal Unit for Sonic Activities\nPerformance-presentation of tusa (Tidal Unit for Sonic Activities). Tusa is a framework for the Tidal Cycles live-coding environment that binds together different parts of the application in one Bash executable. It is an attempt to extend Tidal Cycles into a software DMI (digital musical instrument). It seeks to fulfill essential needs during performance with the environment\, keeping the setup very minimal yet sturdy\, while remaining modular and extendable. The framework gives the user access to the interpreter\, text editor\, reference window\, and server during live-coding practice.\nThe performance is aimed at live-coding improvisation with machine learning tools and spatialisation synthesis techniques. \nAbout the artists\nIlia Viazov (born in 1999 in Voronezh\, Russia) is a composer and sound artist working at the intersection of electronic music\, performance\, self-built instruments\, machine learning\, and software development. His personal and collaborative works have been presented at and supported by Ars Electronica Festival\, platformB Stuttgart\, and Darmstädter Ferienkurse. He is developing the framework tusa for the Tidal Cycles live-coding environment\, a terminal implementation that allows the user to run it locally\, fully interact with all parts of the environment\, and extend it. \nNicola Leonard Hein: Rhythmic Traces | Twisted Electronics\nThe piece Rhythmic Traces | Twisted Electronics deals with the question of how the integration of the body and skin resistance into the circuit of an analog synthesizer (Buchla Music Easel) and the connection with a machine learning-based musical agent system (SuperCollider) can change the tonal and rhythmic fluidity of the instrument and develop it beyond its limits. For this piece\, Nicola Leonard Hein uses a unique circuit-bending controller that completely alters the musical reading of the 1970s Buchla Music Easel. 
Furthermore\, he uses a multi-effect unit programmed in SuperCollider and realized on a Bela microcomputer. Hein’s musical agent learns to interact musically\, creating the music in real time together with Hein on the synthesizer and developing the interaction between a human and a machine musical voice. The systemic economy of movement and the interaction with the AI musical agent create polyphonic rhythmic\, tonal\, and spatial structures. The piece focuses on the emergent Dances of Agency (Pickering). \nAbout the artist\nDr. Nicola L. Hein is a sound artist\, guitarist\, composer\, researcher\, programmer\, and professor of Sound Arts and Creative Music Technology at the University of Music Lübeck.\nHe works with A.I.-assisted human-machine interaction\, postdigital lutherie\, intermedia\, sound installations\, augmented reality\, network music\, and spatial audio. His works have been realised in more than 30 countries\, at festivals such as the MaerzMusik Festival\, Sonica Festival\, and Experimental Intermedia. \n  \nDong Zhou: Found Violin x Aromantic Hobby \nFound Violin is an improvisation system that treats the violin as just one of many sound objects. Since late 2024\, Dong Zhou has been developing Aromantic Hobby\, a series of strap-on MIDI controllers. After a few prototypes\, the current controller features a bunny-shaped appearance and wirelessly transmits kinetic data from the wearer to control a chaotic synthesizer. With Found Violin played with the upper body and Aromantic Hobby on the lower body\, the musician plays a duo with themselves. \nAbout the artist\nDong Zhou is a composer-performer based in Hamburg. Zhou gained a B.A. in music engineering at the Shanghai Conservatory and an M.A. in multimedia composition at the Hamburg University of Music and Drama. 
Zhou has won several prizes\, including first prize at the 2018 ICMC Hacker-N-Makerthon\, a finalist place at the 2019 Deutscher Musikwettbewerb\, the Nota-n-ear Award 2022\, and a spot on the shortlist of the 2025 Giga-Hertz Pop Experimental Production Award. Zhou had works included in the ‘Sound of World’ Microsoft ringtones collection and was commissioned by festivals and institutions such as the Shanghai International Art Festival\, ZKM Karlsruhe\, and the Stimme X Festival. Zhou is currently a doctoral candidate at ICAM\, Leuphana University. \n  \nOlivier Jambois: Tokens & Strings: an improvisation between an electric guitarist and a local LLM\nThis performance explores real-time co-creation between a human performer and a machine\, specifically investigating the improvisational capabilities of Large Language Models (LLMs) within a musical context. The project originates from an inquiry into the potential of using established LLM architectures—notably the one behind ChatGPT—as responsive improvisational partners. \nA primary challenge in this research is the nature of the LLM: as these models are designed for symbolic processing rather than direct audio generation\, the system must bridge the gap between acoustic signals and semantic analysis. An architecture was developed where the electric guitar’s audio is captured and processed to extract high-level audio descriptors. These descriptors are then sent to the LLM\, which analyzes the performer’s intent and generates a symbolic rhythmic response. This response is mapped to a drum sequencer controlling kick\, snare\, and hi-hat patterns.\nTo address the inherent risks of cloud-based APIs in a live performance environment—such as latency and connectivity instability—this work utilizes a local deployment. While local models often feature a smaller parameter count\, the system has been optimized through careful prompt design and constraint-based logic. 
This ensures a meaningful rhythmic dialogue while minimizing inference time\, achieving a critical trade-off between algorithmic complexity and real-time musical reactivity. \nIn this performance\, the generative drumming output is routed through a RAVE (Real-time Audio Variational auto-Encoder) module\, developed by IRCAM. By applying neural re-synthesis via a model pre-trained on percussion\, the system transforms these source samples into complex\, evolving textures\, moving beyond static playback toward a more sophisticated timbral exploration. Throughout the improvisation\, the guitar signal is processed through custom-designed Pure Data patches\, creating a personal sonic language that oscillates between raw strings and highly transformed textures\, seeking a constant state of flux between contrast and blending with the machine-generated environment. \nAbout the artist\nOlivier Jambois is a guitarist\, composer\, and researcher working at the intersection of acoustic tradition\, analog electronics\, and digital innovation. He holds a PhD in condensed matter physics and a master’s degree in jazz and modern music\, a dual background that defines his analytical yet avant-garde approach to music.\nHe won the Jazz à Vienne national competition in 2012\, received “Revelation” honors from Jazz Magazine for his album “Les composantes invisibles”\, and received a grant from the Generalitat de Catalunya to support his research into DIY magnetic tape echoes (2023). He has published several albums and performed at major European festivals. His 2025 release\, Eclosió\, featuring drummer Jim Black\, reflects his ongoing involvement in the contemporary improvisation scene.\nHe is currently a professor and researcher at ENTI\, University of Barcelona\, Spain. His research focuses on AI and generative systems. \n 
URL:http://icmc2026.ligeti-zentrum.de/event/evening-concert-5b-lubeck/
LOCATION:Lübeck University of Music: Großer Saal\, Große Petersgrube 21\, Lübeck\, 23552\, Germany
CATEGORIES:15-05,Concert,Excursion to Lübeck,Music
ORGANIZER;CN="ICMC HAMBURG 2026":MAILTO:info@icmc2026.ligeti-zentrum.de
END:VEVENT
END:VCALENDAR