Music Community Lab and Live Code NYC

Event

2 February 2019

New York

Added 04-Jan-2019

Description

Music Community Lab and Live Code NYC present a day exploring the intersection of live code, music/sonic art, visuals, and beyond.

Featuring talks and workshops from artists and creators of live code performance tools, as well as open time for discussion, hacking, collaboration, serendipity, and performance.

Mission

- To create live code artworks/performances/tools
- To offer workshops on various live coding (visual and music) languages/frameworks that showcase the breadth of NYC’s live coding community
- To facilitate collaboration
- To promote the growth of the live coding community through outreach

Tentative Schedule

12:00 PM - Introductory Talks
2:00 PM - Hacking & Workshops I
3:30 PM - Break
4:00 PM - Hacking & Workshops II
7:00 PM - Performances

Talks

"Interdisciplinary Live Coding"
Kate Sicchio, Assistant Professor, Virginia Commonwealth University
This talk explores how live coding may be used as a methodology for creative practice, ranging from music to visuals to dance.

"The Last Cloud"
Ben Taylor
A net art performance live coding several browser windows to mash-up web content.


"Video Techniques for Live Coding"
John Pasquarello, Creative Technologist, Future Colossal
A talk about using more videographic techniques for live coding. Because generated graphics are SOOO mainstream. Find ways to bring photos, videos, and other "irl" graphics into your visual sets!

"Picnic: Pseudo Live Coding MIDI with Park"
Damon Holzborn, Musician and Artist, Rustle Works
Park is a modular composition and performance system developed for the Web MIDI API. It is an attempt to combine the conceptual simplicity of a modular-style step sequencer with the algorithmic flexibility of a live coding language.
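Park's actual API isn't shown in the description; the combination it describes, a step sequencer whose patterns can be produced algorithmically, can be sketched in plain Python (all names here are hypothetical, not Park's):

```python
# Hypothetical sketch of a modular-style step sequencer whose step
# pattern comes from an algorithm; this is not Park's actual API.

def make_steps(fn, length=8):
    """Generate a step pattern algorithmically: fn(i) -> note or None (rest)."""
    return [fn(i) for i in range(length)]

def sequence(steps, repeats=2):
    """Flatten repeated passes over the pattern into (time, note) events."""
    events = []
    for r in range(repeats):
        for i, note in enumerate(steps):
            if note is not None:
                events.append((r * len(steps) + i, note))
    return events

# Every other step plays MIDI note 60 (middle C):
steps = make_steps(lambda i: 60 if i % 2 == 0 else None, length=4)
events = sequence(steps, repeats=2)
```

In a browser, each event would be sent out via the Web MIDI API at its scheduled time rather than collected in a list.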

"Livecoding a Storm"
Lee Tusman, SUNY Purchase College, and Campbell Watson, IBM Research
In this talk, we will describe a livecoding process that combines improvisation and traditional livecoding tools with audio triggered by analysis of live weather data. We will demonstrate our current system and talk about ways to extend it.
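The talk's exact pipeline isn't described; as a rough illustration of the idea, here is one invented way to map live weather readings onto trigger events in Python (the threshold, the pitch scaling, and the event names are all assumptions, not the speakers' system):

```python
def weather_to_triggers(readings, gust_threshold=30.0):
    """Map (wind_speed, temperature) readings to hypothetical audio events.

    Wind speed above the threshold fires a 'storm' sample; temperature
    is scaled into a MIDI pitch so warmer readings sound higher.
    """
    events = []
    for wind, temp in readings:
        pitch = 48 + int(temp) // 2          # invented scaling for illustration
        events.append(("note", pitch))
        if wind > gust_threshold:
            events.append(("sample", "storm"))
    return events

# A calm, warm reading followed by a cold gust:
events = weather_to_triggers([(10.0, 20.0), (35.0, 4.0)])
```

A real version would poll a weather API on a timer and hand the resulting events to a synth or sampler instead of collecting them in a list.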


"Getting Started with Writing Music with Code"
Jessica Garson, MESSICA ARSON / Twitter / livecode.nyc
Jessica will tell the story of how she got started writing music with code and walk through how to get started. This talk will be a live-coded adventure. She will show the basics of creating songs with the following libraries and languages:
- Sonic Pi and Ruby
- Tidal Cycles and Haskell 
- FoxDot and Python


"The modern web-based Interaction of generative music systems"
Che-Yu Wu, Interaction Designer, Creative Engineer, Graphic Designer
Web-based technologies and tools open new possibilities for generative music. In this presentation, I will show works that combine the Web Audio API, physics simulations, and the Firebase real-time database, and discuss how to design user interaction for generative music systems built on modern web standards.


"Designing DSLs for livecoding in Racket"
Mustafa Khafateh, music & code
I'll go over my experience porting a SuperCollider client from Scheme to Racket, and show how Racket can be used to quickly create domain-specific languages.


"Life in Live Coding"
Char Stiles, Research Fellow, Frank Ratchye STUDIO for Creative Inquiry
Danielle Rager and I created an AV set called eCosystem. Danielle live coded part of the audio while I live coded the visuals. It is based on cellular automata implemented in GLSL. The rules are modified by the music and by feedback from the visuals. I'd like to go over how we developed this set and talk about how live coding can be a catalyst for combining mathematics and (simulated) life in more ways than expected.
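eCosystem's automata run in GLSL, but the underlying idea is easiest to see in a one-dimensional elementary cellular automaton, sketched here in Python (in a set like eCosystem, audio analysis might perturb the rule number or the cells; that coupling is not shown):

```python
def ca_step(cells, rule=90):
    """One step of an elementary cellular automaton with wrapping edges.

    `rule` is the Wolfram rule number: bit `idx` of `rule` gives the new
    cell value for neighborhood `idx`. Rule 90 is left XOR right, which
    grows the Sierpinski-triangle pattern from a single live cell.
    """
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right
        out.append((rule >> idx) & 1)
    return out

row = ca_step([0, 0, 1, 0, 0])  # -> [0, 1, 0, 1, 0]
```

In a fragment shader the same update runs once per pixel, reading the previous frame as a texture, which is what makes visual feedback easy to mix in.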


Workshops

2pm

"An introduction to Live Coding in Sonic Pi"
Liam Baum, Music Teacher, BELL Academy, Bayside Queens
Sonic Pi is a programming environment for live coding music. We will learn the basic commands for making sounds, creating generative music, and applying these concepts to live musical performance.
No previous coding or musical knowledge is required. Please bring headphones and a laptop with Sonic Pi installed (sonic-pi.net).


"Real-Time Interactive Visuals"
Ulysses Popple, Interactive Performer
This workshop will give you a foundation for creating interactive visuals. Starting with basic shapes, we'll work our way up to combining effects, constantly driving everything through live input.

Prerequisites:
- TouchDesigner
- Optional (but recommended): VS Code with the vscode-ldjs extension installed
- Something to make noise that's not too loud or distracting. I'll be clapping and whistling.


"Hexadecimal Beats"
Steven Yi, Visiting Assistant Professor, RIT
Learn to live code rhythmic and melodic patterns using hexadecimal beat notation. The workshop will use the online live.csound.com site for demonstration and practice.
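The core trick of the notation, where each hex digit encodes four on/off steps, can be sketched in a few lines of Python (this is a plain reimplementation of the idea for illustration, not the live.csound.com code):

```python
def hexbeat(pattern):
    """Expand a hex string into a list of 0/1 steps, four per digit.

    Each hex digit's four bits are read most-significant first, so
    "8" (binary 1000) is a hit on the first of four steps.
    """
    steps = []
    for digit in pattern:
        bits = format(int(digit, 16), "04b")
        steps.extend(int(b) for b in bits)
    return steps

hexbeat("8888")  # four-on-the-floor over 16 steps
hexbeat("a")     # -> [1, 0, 1, 0]
```

Compact patterns like "8", "a", and "f" become easy to type and mutate mid-performance, which is the appeal of the notation for live coding.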


4pm

"Getting started with Live Coding in Python with FoxDot"
Dave Stein, Colonel Panix
Discover how to live code music using the FoxDot Python library and the SuperCollider sound synthesis engine. Participants will learn techniques to create a rhythm, add a bass line, a chord progression, and a melody, as well as hints on organizing code to perform successfully in a live environment.
Participants should pre-install Python, FoxDot, and SuperCollider: http://foxdot.org/installation/

"Live coding visuals with GLSL"
Char Stiles, Research Fellow, Frank Ratchye STUDIO for Creative Inquiry
In the first half I will go through a quick excavation of what tools are out there, and walk through setting up a shader that responds to sound and MIDI from a musician. In the second half I will go over how to set up a 3D scene via ray marching.
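Ray marching in a real set is written in GLSL; as a language-neutral illustration of the technique the second half covers, here is a minimal sphere-tracing sketch in Python (scene, names, and step limits are invented):

```python
def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere: negative means inside."""
    d = [a - b for a, b in zip(p, center)]
    return sum(x * x for x in d) ** 0.5 - radius

def raymarch(origin, direction, sdf, max_steps=64, eps=1e-4):
    """Step along the ray by the SDF value until we land on a surface."""
    t = 0.0
    for _ in range(max_steps):
        p = [o + t * d for o, d in zip(origin, direction)]
        dist = sdf(p)
        if dist < eps:
            return t  # hit: distance along the ray to the surface
        t += dist     # safe step: the SDF guarantees no surface is closer
    return None       # no hit within max_steps

# A ray from the origin straight down +z hits the unit sphere at z=2:
t_hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
```

In a fragment shader the same loop runs per pixel, with the ray direction derived from the pixel's screen position and the camera.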

"Coding visuals in Javascript with Hydra"
Zach Krall, Graduate Student, Parsons School of Design (MFA Design + Technology)
An introduction to live coding visual art using Hydra, a JavaScript-based video synthesizer. Workshop goals include using output buffers and chained functions to manipulate visual input.
Please bring headphones and a laptop with Atom.io installed. GitHub repo: github.com/zachkrall/hydra-workshop

"Turning Words into Sound"
Todd Anderson, Assistant Professor of Code + Liberal Arts, Eugene Lang College of The New School
This workshop will focus on turning text into synthesized melodies and rhythms, whether that text is written live, a corpus you have on hand, or even your own code. The workshop will use JavaScript and p5.sound, but the principles can be adapted to other languages and frameworks. All levels. Relevant links: toddwords.com/wordsynths toddwords.com/chiptext
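The workshop itself uses JavaScript and p5.sound; as a language-neutral illustration, here is one invented way to map letters onto notes, sketched in Python (the scale indexing is an assumption, not the workshop's mapping):

```python
MAJOR = (0, 2, 4, 5, 7, 9, 11)  # semitone offsets of a major scale

def word_to_notes(text, base=60, scale=MAJOR):
    """Map each letter to a MIDI note by indexing into a scale.

    'a' is the root, 'b' the second degree, and so on; letters past the
    scale length spill into the next octave. Non-letters are skipped,
    so spaces and punctuation act as rests.
    """
    notes = []
    for ch in text.lower():
        if ch.isalpha():
            i = ord(ch) - ord("a")
            octave, degree = divmod(i % (len(scale) * 2), len(scale))
            notes.append(base + 12 * octave + scale[degree])
    return notes

word_to_notes("ace")  # -> [60, 64, 67], a C major chord
```

Feeding the resulting note list to any synth one note per beat already turns typed words into melodies; rhythm can come from word lengths or punctuation.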
