BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//ICMC HAMBURG 2026 - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:http://icmc2026.ligeti-zentrum.de
X-WR-CALDESC:Events for ICMC HAMBURG 2026
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Amsterdam
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20270328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20271031T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Amsterdam:20260513T183000
DTEND;TZID=Europe/Amsterdam:20260513T210000
DTSTAMP:20260423T075513Z
CREATED:20260421T195653Z
LAST-MODIFIED:20260422T082616Z
UID:10000087-1778697000-1778706000@icmc2026.ligeti-zentrum.de
SUMMARY:Banquet
DESCRIPTION:Photo: Richard Stoehr\nOn Wednesday\, May 13\, 2026\, the ICMC HAMBURG 2026 Banquet will take place at the exceptional Speicher am Kaufhauskanal – one of Harburg’s most atmospheric historic venues. This beautifully restored 19th-century half-timbered building\, originally built in 1827\, blends architectural charm with state-of-the-art event and culinary infrastructure\, creating a truly memorable setting.\nGuests can look forward to an elegant evening in a unique riverside location in Hamburg-Harburg\, where historic character meets contemporary comfort. Following the banquet\, a club concert will round off the night – open to all conference participants and perfect for continuing the conversations and connections in a more relaxed\, musical atmosphere.\nPlease note that availability is limited to 100 banquet tickets.\nBanquet tickets: 85 €\nRegistration for the ICMC Hamburg 2026 Banquet via Converia\nPhoto: Jasmin-Marla-Dichant
URL:http://icmc2026.ligeti-zentrum.de/event/banquet/
LOCATION:Speicher am Kaufhauskanal\, Blohmstraße 22\, Hamburg\, 21079\, Germany
CATEGORIES:13-05,Special Event
ORGANIZER;CN="ICMC HAMBURG 2026":MAILTO:info@icmc2026.ligeti-zentrum.de
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Amsterdam:20260513T213000
DTEND;TZID=Europe/Amsterdam:20260513T233000
DTSTAMP:20260423T075513Z
CREATED:20260421T162148Z
LAST-MODIFIED:20260422T121025Z
UID:10000088-1778707800-1778715000@icmc2026.ligeti-zentrum.de
SUMMARY:Club Concert 3C
DESCRIPTION:Concert 3C is an exploration of the boundaries of collective improvisation and creative technology. The SPIIC Ensemble of the HfMT Hamburg presents a program in which the audience has a say\, algorithms extend historical works\, and artificial intelligence reinterprets human movement as a “hallucination.”\nIn the industrial atmosphere of the Speicher am Kaufhauskanal\, acoustic instruments merge with live coding\, neural synthesis\, and interactive notation. \n\nProgram Overview\nLiquid Tensioning\nFernando Egido \nSinophony for Clarence\nJuan Arturo Parra Cancino \nChimerique\nJonathan Wilson \nNEBULA\nEnrique Tomás and Moisés Horta Valenzuela \nplastique\nSe-Lien Chuang and Andreas Weixler \nShamanic Protocol\nOscar Corpo \nA Walk in Polygon Field\nRob Canning \nDEPRECATED\nDenis Polec \n\nAbout the pieces & artists\nFernando Egido: Liquid Tensioning\nLiquid Tensioning is a work for violin and double clarinet\, live notation\, live generative system\, live electronics\, and attendees’ participation (category: Improvised work for ensemble and electronics (SPIIC+ Ensemble)). It is a collaborative and interactive work created in real time through its own self-evaluation. The attendees will evaluate the work via a web app\, and the musical generative system will change according to the evaluation in real time. The musicians will receive their notes via a live notation system on their mobile phones. The title of the work refers to the model of tensioning provided by the generative system\, a musical tensioning that is not related to the properties of the musical material. This work belongs to a series of works in which the composer creates a self-referential musical generative system based on the real-time evaluation of the work. The main musical material of this work is its own evaluation. The duration is about 10 minutes. 
\nAbout the artist\nFernando Egido studied composition with José Luis de Delás at the School of Music of the University of Alcalá de Henares and received musical training in workshops with composers\, analysts\, and interpreters around the LIEM or the GCAC. He studied Computer Music with Emiliano del Cerro.\nHe has published several papers at international conferences.\nHis works have been performed at festivals such as ICMC 2023–2025\, Bled International Festival\, SMC Conference Graz\, Convergence Festival\, Ars Electronica Linz\, Atemporánea Festival\, AIMC 2022 conference\, EVO 2021\, OUA Electroacoustic Music Festival 2020\, ISMIR 2020 in Montreal\, the Seoul International Electroacoustic Music Festival 2019\, the ACMC 2019 conference in Melbourne\, SID 2015 conference in New York\, Venice Vending Machine III\, the New York City Electroacoustic Music Festival\, JIEN in the Auditory 400\, La hora acúsmatica\, SMASH Festival\, Encontres Festival in Palma de Mallorca\, and ACA. \n\nJuan Arturo Parra Cancino: Sinophony for Clarence\nSinophony for Clarence is an ensemble and live electronics work inspired by the formal and sonic principles of Clarence Barlow’s Sinophony I (1970)\, his first electronic composition. Rather than functioning as an arrangement or transcription\, this piece operates as an instrumental extension of Barlow’s electronic sound world\, translating and reactivating its core materials through acoustic performance and real-time electronic processes. \nThe work seeks to bring into the physical space of performance elements that\, in Sinophony I\, exist only in fixed media: continuous tones\, slow harmonic transformations\, beating frequencies\, and the perceptual tension between purity and instability. These characteristics are reimagined here as a living\, performative situation\, where instrumental sound and electronics merge into a single\, evolving spectral body. 
\nSinophony for Clarence builds on methods developed by Juan Parra Cancino to extract performative salients from early electronic works—elements that can be embodied\, negotiated\, and reshaped by performers in real time. Through this approach\, the piece revisits historical electronic material not as an object to be preserved unchanged\, but as a dynamic field for exploration\, experimentation\, and renewed artistic engagement. The aim is not reconstruction\, but continuation: to recover underlying processes and extend their implications into contemporary performance practice. \nBy situating acoustic instruments\, live electronics\, and spatialized sound within a shared listening ecology\, the work foregrounds collective tuning\, timbral fusion\, and emergent beating phenomena as central musical forces. The ensemble functions less as a group of independent voices than as a composite oscillator\, shaped by subtle interactions and shared attention. \nThis piece is conceived as a tribute to Clarence Barlow—composer\, educator\, and friend—honoring both his pioneering contributions to electronic music and his enduring influence on ways of thinking about sound\, structure\, and musical intelligence. \nAbout the artist\nJuan Parra Cancino studied Composition at the Catholic University of Chile and Sonology at the Royal Conservatoire The Hague\, where he completed a Master’s degree in electronic music. He received a PhD from Leiden University in 2014 on performance practice in computer music. A guitarist trained in Robert Fripp’s Guitar Craft\, he has worked extensively in live electronics. He is a researcher at the Orpheus Institute and Regional Director for Europe of the International Computer Music Association (2022–26). \n\nJonathan Wilson: Chimerique\n“Chimerique” is about the interaction of music and language. Written and premiered in 2017\, this composition is for an ensemble featuring improvisation\, narration\, and electronics. 
It was realized in collaboration with poet and translator Patricia Hartland by incorporating her English translation of “Ravines of Early Morning” by Raphael Confiant into a musical setting. The title is taken from a word in this text. It is French for “chimerical\,” and it can be defined as 1: something that takes delight in illusions\, or 2: something that is utopian\, or unreal. The narrator forms associations with this word through various phrases and passages that relate to the part of the story in which the description of “chimerique” is elaborated. Throughout this performance\, the performers listen and react to the text spoken by the narrator (and electronics). They are accompanied by electronics that consist of fixed media and live electronics from two different patches in Max/MSP using additive synthesis and granular synthesis. The musical instruments are the source material for granular synthesis. The score for this composition uses hybrid musical notation\, with some traditional notation for pitch and some graphic notation\, leading performers to interpret not only the spoken phrases but also the graphic notation in their parts to determine volume\, pitch\, rhythm\, articulation\, and contour\, thereby making improvisation a necessity. The narrator and performers work together to generate a spontaneously formed through-composed work that marries text and music. The form can be described as through-composed in six sections. In the first section the performers respond only to a single phrase. In sections 2–6 the performers respond not only to the phrases that delineate each section but also to extended narration shifting from descriptions of dreams\, the night\, madness\, illusions\, and at the end the act of dreaming itself. \nAbout the artist\nDr. 
Jonathan Wilson’s works have been performed at the Ann Arbor Film Festival\, European Media Art Festival\, ICMC\, SICMF\, SEAMUS\, NYCEMF\, MUSELAB\, NSEME\, Napoleon Electronic Music Festival\, Iowa Music Teachers Association State Conference\, and Midwest Composers Symposium. He is the winner of the 2014 Iowa Music Teachers Association Composition Competition. Jonathan has studied composition with Lawrence Fritts\, Josh Levine\, David Gompper\, James Romig\, James Caldwell\, Paul Paccione\, and John Cooper. He has also studied conducting under Richard Hughey and Mike Fansler. Jonathan is a member of the Society of Composers\, Inc.\, SEAMUS\, ICMA\, and the Iowa Composers Forum. \n\nEnrique Tomás and Moisés Horta Valenzuela: NEBULA\nArtists working with deep-learning audio models often find that exploring their high-dimensional latent spaces requires chance-based\, combinatorial\, or technically complex machine-learning techniques. While these approaches can reveal unexpected possibilities\, they also make it more difficult to deliberately guide the models toward outcomes that are musically meaningful or aligned with specific creative intentions. \nIn this improvisation for solo instrument and two performers on live electronics\, we present an alternative approach that enables a more interpretable and musically guided exploration of the latent space. This approach leverages Principal Component Analysis (PCA) applied to pre-encoded RAVE (Realtime Audio Variational Autoencoder) representations to reorganize the latent data into clusters that can be navigated more deliberately in performance. PCA reorganizes the encoded data into clusters based on shared timbral characteristics\, producing data clouds directly connected to the sonic properties of the source material. 
By structuring access to the latent space in this way\, our method bridges the gap between open-ended exploration and purposeful control\, offering performers a clearer and more intuitive means of shaping sound. \nTo prepare the improvisation\, and prior to the concert\, the solo instrumentalist provides an eight-minute recording that defines the sonic domain of the performance. This recording is encoded and analyzed\, restricting exploration to regions of the latent space shaped by the performer’s own material and giving the electronic musicians a more focused and musically coherent landscape to navigate. During the live performance\, the solo instrumentalist and the two electronic performers interact within this PCA-organized timbral map. Their trajectories through the latent space—along with the evolving clusters and sonic transformations—are projected in real time\, allowing the audience to see how latent-space navigation corresponds to audible change. \nThe musical materials resulting from this setup combine structured instrumental improvisation with electronically generated textures derived from latent-space navigation. While the overall form is left to real-time decisions between the soloist and the live performers\, the resulting sound world often alternates between rhythmically driven motifs—loosely recalling the interactive dynamics of small jazz ensembles—and more abstract electronic layers shaped through PCA-guided trajectories. These electronic textures\, produced by traversing clustered regions of the latent space\, serve as harmonically and timbrally evolving fields against which the soloist can articulate phrasing\, gesture\, and dynamic contour. The custom-built performance interfaces allow the electronic performers to shape these materials with precision\, enabling a responsive interplay in which acoustic action and machine-learned transformations continually inform one another. 
\nAbout the artists\nEnrique Tomás (*1981) is a sound artist\, researcher and assistant professor at the Tangible Music Lab who dedicates his time to finding new ways of expression and play with sound\, art and technology. His work explores the intersection between sound art\, computer music\, locative media and human-machine interaction.\nAs an individual artist\, Tomás’ activity is centered around ultranoise.es and focuses on performances and installations with extreme and immersive sounds and environments. He has exhibited and performed in spaces of Ars Electronica\, Sonar\, CTM\, IRCAM\, IEM\, KUMU\, SMAK\, NOVARS\, STEIM\, Steirischer Herbst\, Alte Schmiede\, etc.\, and in galleries and institutions throughout Europe and Latin America. \nMoisés Horta Valenzuela is a self-taught sound artist\, technologist\, musician\, and researcher from Tijuana\, Mexico\, based in Berlin. His work spans computer music\, neural audio synthesis\, conversational AI\, and the politics of emerging technologies\, approached through a critical lens that connects ancestral knowledge with contemporary digital culture. He has presented work internationally at Ars Electronica\, NeurIPS ML for Creativity & Design\, MUTEK México\, MUTEK AI Art Lab Montréal\, Transart Festival\, CTM Festival\, Elektron Musik Studion\, and the Sound and Music Computing Conference\, among others. \n\nSe-Lien Chuang and Andreas Weixler: plastique\ninteractive audiovisual comprovisation for e-guitar\, green leaves & i-hands – GLISS – Green Leaves Imaginary Scenic Score\nDuration: ca. 8 min \nAbout the artists\nAndreas Weixler\, born 1963 in Graz\, Austria\, is a composer for computer music with an emphasis in intermedia realtime processing. 
He teaches at the mdw Vienna and at InterfaceCulture in Linz\, and serves as associate university professor at the CMS – computer music studio of Anton Bruckner University in Linz\, where he initiated the intermedia concert hall\, the Sonic Lab.\nHe studied contemporary composition at KUG in Graz\, Austria\, earning his diploma under Beat Furrer\, complemented by international projects and residencies. \nSe-Lien Chuang is a composer born in Taiwan in 1965 and based in Austria since 1991. Her work focuses on contemporary instrumental composition and improvisation\, computer music\, and audiovisual interactivity. She has presented works and lectures internationally in Europe\, Asia\, and the Americas at events such as ICMC\, ISEA\, and NIME. From 2016 to 2019\, she taught at the Computer Music Studio at Bruckner University Linz. Since 1996\, she has co-run Atelier Avant Austria\, specializing in audiovisual interactive systems\, real-time processing and computer music. \n\nOscar Corpo: Shamanic Protocol\nShamanic Protocol is an online sound ritual performed by a partially damaged virtual entity. Its memory is an incomplete and corrupted archive\, composed of residual sonic materials related to shamanic rituals\, music therapy\, sound-based healing practices\, and data derived from musical epigenetics. Reshaped by the available data and the presence of connected users\, these fragments are reprocessed and reorganised each time the system is accessed\, generating a sonic ritual that follows a recognisable structure yet never manifests in the same way twice. The sound ritual has no declared purpose: it remains unclear whether the entity performs the rite as an attempt to repair itself\, an act of archive restoration\, a process meant to affect human listeners\, or simply because this process constitutes its way of operating. The variability of the outcome may suggest either a gradual recovery or a progressive deterioration of the system. 
The resulting sonic output exists in a space between therapeutic effect\, system malfunction\, and autonomous algorithmic process. The shifts between fragile calm\, overload\, interruption\, and recovery reveal the instability of the system that generates it. No clear boundary is drawn between healing\, malfunction\, or expression: these states coexist and remain indistinguishable within the process. The rite can be experienced as a purely electronic process\, or human performers\, in any instrumental or vocal configuration\, may take part in its enactment. Musicians are invited to participate in the ritual rather than interpret a fixed musical text. Guided by an open\, interpretative score\, performers do not execute predefined material but engage in the ritual itself\, interacting with the electronic layer by listening\, responding\, and aligning their gestures with the evolving sonic environment. The notation offers indications of behaviour\, density\, register\, and gesture rather than prescribed material; in this way\, performers take part in the rite by freely amplifying\, refracting\, and destabilising the entity’s activity. The score prescribes no precise instrumentation or techniques; in this instance\, the ritual is performed with a string ensemble alongside soprano saxophone\, bass clarinet\, piano\, and percussion. Performers do not guide the system\, nor do they follow it; instead\, they remain in a state of attentive coexistence with its unfolding behaviour. Each performance is therefore situated\, shaped by specific conditions\, configurations\, and presences.\nThe process does not call for interpretation: repair and damage are no longer separable; function and meaning no longer distinguishable. \nAbout the artist\nOscar Corpo (born 8 April 1997\, Naples\, Italy) is an Italian composer based in Hamburg. 
He studied Composition and Multimedia Composition in Naples\, and is now a PhD candidate at the HfMT Hamburg\, focusing on AI and collective improvisation with Ensemble 404. His work spans electronic\, instrumental\, vocal\, improvisation\, and music theatre. He has collaborated with Alexander Schubert\, Berliner Philharmoniker\, La Biennale di Venezia\, and Lux Nova Duo\, among others. \n  \nRob Canning: A Walk in Polygon Field\nA Walk in Polygon Field is a graphic score environment for controlled improvisation\, composed for 1–4 instrumentalists with electronics and surround diffusion. Three polygons—pentagon\, hexagon\, heptagon—rotate at different rates\, producing polymetric phase relationships (5-against-6-against-7). Performers activate objects orbiting these shapes\, interpreting compound visual motion as sonic material. An outer ring generates OSC data driving spatial processing.\nThe score defines states\, behaviours\, and constraints; performers negotiate what these structures sound like. Each polygon side represents a discrete performance state—pitch region\, articulation\, texture—but specific mappings remain open. Musicians enter and withdraw from a shared texture whose density and pacing emerge from collective decision-making.\nAuthored entirely in SVG\, the work embeds performance semantics directly into visual element identifiers\, executed by a browser-based runtime on networked tablets. This approach\, detailed in the accompanying paper “Scores That Run: Graphic Notation with Embedded Performance Semantics\,” demonstrates how open web standards support animated notation without specialised infrastructure. Each performance traces a different route—music negotiated through shared encounter with a moving score. 
\nA Full Guide to Interpretation\, Programme Notes\, and supporting materials\, including a SuperCollider live electronics patch\, are available online: \nhttps://robcanning.github.io/oscilla/compositions/polygonfield2026/ \nAbout the artist\nRob Canning (Dublin\, 1974) is a composer\, improviser\, and creative technologist whose work explores animated notation\, improvisation\, and the dynamics of networked musical systems. He holds a PhD in composition from Goldsmiths\, University of London\, where his research examined distributed authorship in computer-assisted music. A long-time advocate of Free and Open Source Software\, he develops Oscilla\, an open-source platform for animated graphic notation and networked performance. \n\nDenis Polec: DEPRECATED \nDEPRECATED establishes a recursive feedback loop between a biological subject and a cluster of interpretative algorithms. The work investigates the friction between human indeterminacy and machine determinism. \nThe Setup\nA lone performer occupies the center of the stage\, stripped of traditional instrumentation. Facing them is a “panopticon” of sensors: computer vision cameras and open microphones. The human subject oscillates between legible behavior and “abnormal” states—engaging in erratic gestures\, non-semantic vocalizations\, and visceral spasms designed to evade learned pattern recognition. \nThe Process\nSimultaneously\, three isolated AI instances dissect this input in real-time. Unable to process the chaotic reality of the “Now\,” the systems hallucinate: Computer Vision misinterprets trauma as choreography; a Large Language Model forces these errors into a coherent narrative; and Neural Audio Synthesis re-synthesizes the fabrication into sterilized perfection. \nAbout the artist\nDenis Polec operates at the intersection of sound art and algorithmic criticism. 
His practice rejects the notion of human-machine collaboration\, focusing instead on the friction\, latency\, and inherent violence of predictive systems. Polec constructs adversarial performance systems that expose the limitations of neural networks when confronted with the chaotic reality of the biological body.
URL:http://icmc2026.ligeti-zentrum.de/event/club-concert-3c/
LOCATION:Speicher am Kaufhauskanal\, Blohmstraße 22\, Hamburg\, 21079\, Germany
CATEGORIES:13-05,Club Concert,Music,Special Event
ORGANIZER;CN="ICMC HAMBURG 2026":MAILTO:info@icmc2026.ligeti-zentrum.de
END:VEVENT
END:VCALENDAR