“Performing the space: integration among the arts”. Final concert

SONIC ARTS PERFORMANCE

Original works by students of the Master in SONIC ARTS

of the University of Rome Tor Vergata

in collaboration with guest dancers

1.  Bangin’ Lighting

Video background and music, Enrico Mereu, Niccolo Chotkowski

Choreographer and dancer, Gunta Liepina

This project is a soulful interaction between a dancer, a play of lights, textural music and bangs. Each point of light is influenced by the intensity and volume of the music and by the energy of the dancer.

2. Endless Freedom

Composer, Gabriel Guerrero Solis

Dancer, Julia DePaoli

Usually a dancer has to adapt to the music, creating an interaction in one direction only: music for dance. Why not make the interaction bidirectional?

This is the objective of our project. The dancer decides when the music begins and when it ends, and, while she is dancing, she is also composing the choreography in relation to the music and its characteristics, changing the dynamics of the music in turn. A video elaboration is also projected on the screen behind the dancer, and this too is interactive, responding to the choices the dancer makes.

3. The Space Between

Live Video: Alessio Sbarzella

Dancer: Gunta Liepina

Music: Reach for the Dead by Boards of Canada

The Space Between is an audio-reactive video performance improvisation. There will be an “interplay” between the player, the dancer and the music: the dancer and the player will improvise while listening to the music, and the video will change based both on the music input and on the player’s choices. The entire project is fully integrated in Max for Live and synchronized with Ableton Live.
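Purely as an illustration of the audio-reactive idea, and not of the authors’ actual Max for Live device, a minimal envelope follower that maps the incoming audio level (plus a manual offset chosen by the player) onto a single video parameter might look like this in Python; all names here are hypothetical:

import math

def rms(frame):
    """Root-mean-square level of one block of audio samples (floats in -1..1)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def smooth(level, previous, attack=0.3, release=0.05):
    """One-pole envelope follower: fast rise, slow fall."""
    coeff = attack if level > previous else release
    return previous + coeff * (level - previous)

def level_to_brightness(envelope, player_bias=0.0):
    """Map the smoothed level (0..1) plus the player's manual offset to a brightness 0..1."""
    return max(0.0, min(1.0, envelope + player_bias))

# Example: three successive blocks of audio
envelope = 0.0
for frame in ([0.0] * 64, [0.5] * 64, [0.1] * 64):
    envelope = smooth(rms(frame), envelope)
    print(round(level_to_brightness(envelope, player_bias=0.1), 3))

In the real performance the same role is played inside Ableton Live, with the player’s choices arriving as device parameters rather than a hard-coded offset.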

4.   Mantra number 2

Sound design, live video, and music, Stefano Borgia, Arash Jooyafar

Dancers, Jie Lin How, Julia DePaoli

Mantra is a Sanskrit word and denotes a set of words and sounds repeated many times in a particular way. Its meaning is revealed in its roots: “man” means mind and “tra” means free.

The concept of this project is to create moving images influenced by the sound of the guitar and the movements of the dancers.

For the video, we designed a patch that generates a Jitter object, which we control with MIDI and which responds to the level of the incoming sound. John Crawford’s Active Space enables live video processing from a webcam, which serves as the background for the Jitter object. All parameters in the Active Space and Max/MSP patches are controlled by MIDI. As audio input we have a live electric guitar performance and a fixed frequency as background sound; all the guitar sounds are created with a digital pedal effect and amplifier.
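As a rough sketch of the kind of mapping described above, and not of the actual Max/MSP patch, the combination of MIDI controls and the measured guitar level into shape parameters could be expressed in Python like this (the parameter names are illustrative):

def cc_to_range(cc_value, lo, hi):
    """Scale a MIDI CC value (0-127) into an arbitrary parameter range."""
    return lo + (cc_value / 127.0) * (hi - lo)

def shape_parameters(cc_rotation, cc_zoom, sound_level):
    """Combine MIDI controls with the measured sound level (0..1).

    The rotation and camera distance come from the MIDI controller,
    while the object's scale 'breathes' with the guitar's level.
    """
    return {
        "rotate_y": cc_to_range(cc_rotation, 0.0, 360.0),
        "camera_z": cc_to_range(cc_zoom, 2.0, 8.0),
        "scale": 0.5 + sound_level,  # louder guitar -> larger object
    }

# Example: two knobs at 64 and 32, guitar level at 0.4
print(shape_parameters(64, 32, 0.4))

In the performance these values drive the Jitter object directly, with Active Space’s processed webcam image behind it.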

5. Borden’s

Michele Mandrelli, Gabriele Pierro, Niccolo Chotkowski

Everything begins as a kind of nonsense and, by the end, reveals all its sense.

6. Light and Shapes

Composer, Daniele Toma

Dancer: Jie Lin How

Concept:

To explore a dimension of expression where sound and image are linked, as materials in the hands of the artist, who has the freedom to follow an idea without the bounds of rules, even the physical ones imposed by an instrument. The only instrument is the body: movement is sound.

Technical Details:

The whole project has been created in Max/MSP and is divided into two main patches. The first generates 3D objects, in particular what I called the Spike: a sound-intensity-reactive shape, created in JavaScript and rendered with Jitter, whose dimensions, the closure of the shape along its length and height, and other parameters such as position and camera angle are all changed via a MIDI controller, giving direct interaction with the shape. The second patch tracks movement from a camera, using the cv.jit library. It can recognize several masses of pixels that move together, called blobs; each blob has a center and a size, which are used to control the parameters of an FM synth (a rough sketch of this mapping follows below).

The result is that sounds are generated in correspondence with the movements of the dancer, causing deformations of the Spike, which in the meantime is moved and modified by the controller.
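As an illustration of how a blob’s center and size could drive the FM synth (a sketch under assumed ranges, not the actual cv.jit/Max implementation), in Python:

import math

SAMPLE_RATE = 44100

def blob_to_fm_params(cx, cy, size, width=640, height=480):
    """Hypothetical mapping: horizontal position -> carrier frequency,
    vertical position -> modulator ratio, blob size -> modulation index."""
    carrier = 110.0 + (cx / width) * 880.0     # 110 Hz .. 990 Hz
    ratio = 0.5 + (cy / height) * 3.5          # 0.5 .. 4.0
    index = (size / (width * height)) * 200.0  # bigger blob -> brighter timbre
    return carrier, ratio, index

def fm_burst(carrier, ratio, index, duration=0.1):
    """A short burst of simple two-operator FM synthesis."""
    n = int(SAMPLE_RATE * duration)
    mod_freq = carrier * ratio
    return [
        math.sin(2 * math.pi * carrier * t / SAMPLE_RATE
                 + index * math.sin(2 * math.pi * mod_freq * t / SAMPLE_RATE))
        for t in range(n)
    ]

# Example: a blob of 5000 pixels centered at (320, 240)
carrier, ratio, index = blob_to_fm_params(320, 240, 5000)
samples = fm_burst(carrier, ratio, index)
print(round(carrier), round(ratio, 2), round(index, 2), len(samples))

The actual patch performs the equivalent mapping continuously, so the dancer’s movement both shapes the sound and deforms the Spike on screen.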

7. Thunderclaps

Sound Design: Valerio Nevi, Alessandro Malcangi, Luca Nave

Video: Letizia Gionfrida

Dancer: Noel Dilworth

The project studies the use of interactive technology with dance.

The video is captured live using a Microsoft Kinect camera, which allows for “hands-free” interaction while detecting gestures.

Two different Max patches have been developed for the video interactions. One is controlled by sound, using a condenser microphone; the other is made of points connected to the Kinect-acquired data, which allow the number of points to increase and decrease and drive the spatial visualization of each plane.

The whole effect is obtained by overlapping these two videos.
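Purely as a toy illustration of the two ideas described above (counting points from Kinect depth data and overlapping the two video layers), and not of the actual Max patches, in Python:

def points_from_depth(depth_values, near=500, far=4000, max_points=2000):
    """Decide how many points to draw from Kinect depth readings (millimetres).

    Hypothetical rule: the more of the scene that falls inside the active depth
    band (e.g. the dancer in front of the background), the more points."""
    inside = sum(1 for d in depth_values if near < d < far)
    return int(max_points * inside / max(1, len(depth_values)))

def overlap(sound_layer, point_layer, mix=0.5):
    """Blend two greyscale frames (lists of rows of 0..1 values) pixel by pixel."""
    return [
        [mix * a + (1.0 - mix) * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(sound_layer, point_layer)
    ]

# Example: tiny 2x2 frames and a fake depth map
mic_frame = [[0.2, 0.8], [0.4, 0.6]]
kinect_frame = [[1.0, 0.0], [0.5, 0.5]]
print(points_from_depth([300, 1200, 2500, 5000]))  # two readings fall in the band -> 1000 points
print(overlap(mic_frame, kinect_frame))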

VIDEO INTERACTIONS – USING SOUND AND KINECT – ABSTRACT

AUDIO

A pre-edited track has been elaborated upon by developing melodies and song arrangements with piano and harp sounds. In addition, a combination of different sound effects evokes a spacey, trippy quality throughout the composition, which aims to pull this work into multiple dimensions.

Sound direction: Anna Terzaroli

Special thanks to Professor Giovanni Costantini,

and Guest Artist Workshop teachers, John Crawford and Lisa Naugle.

Università degli Studi di Roma Tor Vergata

Auditorium E. Morricone

Via Columbia, 1 – Roma

3 July 2015, 4:30 pm

(Free entrance)

