15th September, 2022, co-virtual with ICFP 2022, held in Ljubljana, Slovenia.
The ACM SIGPLAN International Workshop on Functional Art, Music, Modelling and Design (FARM) gathers together people who are harnessing functional techniques in the pursuit of creativity and expression.
Functional programming has emerged as a mainstream software development paradigm, and its artistic and creative use is booming. A growing number of software toolkits, frameworks and environments for art, music and design now employ functional programming languages and techniques. FARM is a forum for exploration and critical evaluation of these developments, for example to consider potential benefits of greater consistency, brevity, and closer mapping to a problem domain.
FARM encourages submissions from across art, craft and design, including textiles, visual art, music, 3D sculpture, animation, GUIs, video games, 3D printing and architectural models, choreography, poetry, and even VLSI layouts, GPU configurations, or mechanical engineering designs. Theoretical foundations, language design, implementation issues, and applications in industry or the arts are all within the scope of the workshop. The language used need not be purely functional (“mostly functional” is fine), and may be manifested as a domain-specific language or tool. Moreover, submissions that raise questions or issues about the use of functional programming are also within scope.
14:00-15:30 FARM demos
18:30-19:30 FARM keynote
19:30-21:00 FARM performances
This year, the demos, keynote, and performances will all take place on Thursday, September 15.
The event is open to the public. Reasonably priced tickets will be available at the venue, and admission is free for those registered for ICFP. We intend to live-stream the event.
The schedule can also be found on the ICFP conference website.
Demo: New View on Plasma Fractals – From the High Point of Array Languages
Oleg Kiselyov, Toshihiro Nakayama
Fold Yer Loops! is a solo electronic music performance for augmented electric guitar and computer. The guitar is augmented with extra switches, capacitive buttons and potentiometers, as well as an accelerometer and gyroscope capturing the movement of the instrument. An ESP32 microcontroller translates the inputs into Open Sound Control (OSC) messages and sends them over Bluetooth to the computer, where they drive custom SuperCollider code that manipulates the live guitar sound in real time.
These technologies all combine to give the performer an enhanced set of both sonic and gestural possibilities while interacting with the instrument. Stylistically, the music oscillates between ambient minimalism and avant noise drone while being peppered with a heavy dose of free improvisation and mutant rock noises.
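To make the sensor-to-sound pipeline concrete, here is a minimal sketch of how a microcontroller-side program might encode a sensor reading as an OSC message, following the OSC 1.0 binary format (null-terminated, 4-byte-padded strings, then big-endian float32 arguments). The address `/guitar/gyro` and the helper names are illustrative assumptions, not the performer's actual code.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 values."""
    type_tags = "," + "f" * len(args)          # e.g. ",f" for one float
    payload = _pad(address.encode()) + _pad(type_tags.encode())
    for a in args:
        payload += struct.pack(">f", a)        # big-endian float32
    return payload

# e.g. a gyroscope reading from the augmented guitar
msg = osc_message("/guitar/gyro", 0.25)
```

In the performance setup described above, bytes like these would travel over Bluetooth to the laptop, where SuperCollider can decode them and map each address to a sound-processing parameter.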
Xeno is an experimental, abstract short film based on the intense, dense and continuous intertwining of audio and video.
Our concept and perception of sounds derive from a world where sonic events are always the consequences of physical actions. Xeno aims to break this rule by creatively employing the audio-video relationship in a scenario where sonic and visual objects appear simultaneously, yet with an undefined linkage. It creates and proposes an altered, alien reality in which the action-reaction bond is confounded and disrupted and, consequently, our perception and experience become deceptive. Which medium is the cause or the consequence of the other? Does a relationship really exist?
Despite their enormous creative potential, most recent developments in live coding are based on the idea of the sequencer and the drum machine, where events are associated with notes as discrete moments in a parametric space.
Nesso’s performance challenges this convention by using the Adapt sound engine, in which modules defining sound processes combine into a network of influences, creating evolving sound morphologies. In this way, the machine gains a degree of autonomy by listening to its own output and analyzing the current configuration of the routings between these modules. The performer continuously interacts with this evolving system, trying to manage its behavior over time by altering the structure while discovering new possibilities and configurations.
Algoforte is a live-coded performance involving a Disklavier (a player piano controlled via MIDI), two microphones, a sound amplification system, and a laptop performer displaying their programming interface on a video projection screen. While the sound of the piano is picked up through the microphones and amplified, it is also captured into the live audio system, where it is recorded, processed and manipulated further in real time, with the results added to the speaker mix. The performer sits next to the piano, which plays from live MIDI instructions produced by algorithms written in real time. The performer does not touch the piano but concentrates on their programming, which is clearly visible on the projection.
This project approaches memory, through space and time, as a living and interactive entity: a metaphor for how we interact with objects and the environment, and how this relates to the digital world.
This is an audiovisual live coding performance set in a VR space where 3D photogrammetry sculptures and sound are distorted in real time.
Through the use of live coding tools such as Hydra and Orca, the proposal connects everything into an experience of travelling through different live-coded universes on this DIY browser-based instrument.
FARM adheres to the ICFP 2022 Code of Conduct.
Workshop Chair: John Leo (Halfaya Research)
Program Chair: Youyou Cong (Tokyo Institute of Technology)
Performance Chair: Luka Frelih (Ljudmila Art & Science Laboratory)
Publicity Chair: Michael Sperber (Active Group GmbH)