Vapor de Sueño was the first piece created by the educational innovation project LitiumLab. This group was formed at the Fine Arts Faculty of Granada by several teachers and researchers from different fields: dance, computer science, electronics, music… On the video side we were two people, Nökeö and myself: I was in charge of the responsive, real-time part, and Nökeö developed the pre-rendered videos.
The whole script was based on Lewis Carroll's Alice in Wonderland.
For the Blue Caterpillar scene I designed a particle system with the software eMotion. It was projected onto the floor by a beamer mounted in zenithal position. Next to it we placed a computer, IR lighting and a handheld camera with an IR filter. That camera tracked the dancers on stage using CCV by NUI Group. The TUIO messages were sent over the Wi-Fi network to my computer, where the particles were generated. Original music composed by Manuel Palma.
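The pipeline can be sketched in a few lines: CCV delivers tracked blob positions as normalized TUIO coordinates, and the particle system spawns short-lived particles wherever a dancer is detected. This is a minimal illustrative sketch, not the actual eMotion patch; the `Particle`/`Emitter` names and all constants are my own assumptions.

```python
import random

class Particle:
    """A single particle with position, velocity and a finite lifetime."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-0.01, 0.01)   # small random drift
        self.vy = random.uniform(-0.01, 0.01)
        self.life = 1.0                         # fades from 1.0 down to 0.0

    def step(self, dt):
        self.x += self.vx * dt
        self.y += self.vy * dt
        self.life -= 0.5 * dt                   # particle survives ~2 seconds


class Emitter:
    """Spawns particles at the tracked dancer position (normalized 0..1
    coordinates, as TUIO cursor messages report them) and culls dead ones."""
    def __init__(self, rate=5):
        self.rate = rate            # particles spawned per tracked update
        self.particles = []

    def update(self, tracked_xy, dt):
        if tracked_xy is not None:
            x, y = tracked_xy
            for _ in range(self.rate):
                self.particles.append(Particle(x, y))
        for p in self.particles:
            p.step(dt)
        self.particles = [p for p in self.particles if p.life > 0.0]


emitter = Emitter(rate=5)
emitter.update((0.5, 0.5), dt=0.033)   # one 30 fps frame, dancer at centre
print(len(emitter.particles))          # 5
```

In the real setup the `tracked_xy` values would arrive over the network as TUIO cursor events, and each particle would be drawn by the projector instead of printed.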
Another scene was the White Rabbit and the topic was Time.
Patricia had a few sensors on her ankle and wrist. The acceleration data was sent to the receiver connected to Jose López-Montes' computer, where it was decoded and distributed in OSC format over the network. She played with her movement to give the impression that the rabbit was going back and forth in time. The sound patch created for that worked excellently. On my side, the video part, a professional camera recorded the dancer and broadcast the signal to my computer. With that signal I created a composition in QC which places several 'slices' of time in a 3D space. The controls allowed me to set the number of slices, their separation and the moment in time each one represents, ranging from around 20 seconds down to milliseconds.
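The core of that composition is a rolling frame history sampled at configurable intervals. The following is a hedged sketch of the idea behind the QC patch, not the patch itself; `TimeSlicer` and its parameters are hypothetical names, and integers stand in for video frames.

```python
from collections import deque

class TimeSlicer:
    """Keeps a rolling history of video frames and returns N 'slices'
    spaced `separation` seconds apart into the past, newest first."""
    def __init__(self, fps=30, history_seconds=20):
        self.fps = fps
        self.frames = deque(maxlen=fps * history_seconds)  # ~20 s of history

    def push(self, frame):
        self.frames.append(frame)

    def slices(self, count, separation):
        step = max(1, int(separation * self.fps))   # frames between slices
        out = []
        for i in range(count):
            idx = len(self.frames) - 1 - i * step   # counted back from newest
            if idx < 0:
                break                               # not enough history yet
            out.append(self.frames[idx])
        return out


slicer = TimeSlicer(fps=30, history_seconds=20)
for t in range(600):                      # 20 seconds of frames, labelled 0..599
    slicer.push(t)
print(slicer.slices(4, separation=1.0))   # [599, 569, 539, 509]
```

With `separation` near zero the slices collapse to consecutive frames (the millisecond end of the range); with larger values they stretch across the full 20-second buffer, which is what produced the back-and-forth-in-time effect on stage.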
The next scene is my favorite from the book: the tea party.
My role in the choreography was the Mad Hatter. We also described this character as the 'time alchemist'. A table was designed with that concept in mind: it had a transparent surface with several laboratory flasks and glasses on it. Inside the table there was a computer with a camera running the reacTIVision software for fiducial tracking. The fiducials were placed at the bottom of each element on the table, so the x and y position, the rotation and the relative position of these elements to one another were sent over the LAN for sound generation and visual interaction. The particle system this time was made up of 0's and 1's as elements, with a green vector from the origin to each position. The movement was driven by a 3D Perlin noise oscillation. The idea behind it was to show an ordered binary system representing the thoughts of the Mad Hatter; those thoughts descend into chaos with time and his illness.
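reacTIVision reports each fiducial's id, normalized position and rotation over TUIO, but the relative geometry between objects has to be derived on the receiving side. Below is a minimal sketch, under my own assumptions about names and units, of how those pairwise parameters could be computed before being sent on for sound generation; the original patch may have worked differently.

```python
import math

def relative_params(fiducials):
    """Given {id: (x, y, angle)} with normalized coordinates and radians,
    as reacTIVision's TUIO object messages provide them, return for every
    ordered pair (a, b) the distance between the two objects and the
    bearing of b as seen from a, relative to a's own rotation."""
    params = {}
    for a, (ax, ay, a_angle) in fiducials.items():
        for b, (bx, by, _) in fiducials.items():
            if a == b:
                continue
            dx, dy = bx - ax, by - ay
            distance = math.hypot(dx, dy)
            # bearing relative to a's orientation, wrapped into [0, 2*pi)
            bearing = (math.atan2(dy, dx) - a_angle) % (2 * math.pi)
            params[(a, b)] = (distance, bearing)
    return params


# Two flasks side by side on the table, neither rotated.
table = {7: (0.25, 0.5, 0.0), 12: (0.75, 0.5, 0.0)}
print(relative_params(table)[(7, 12)])   # (0.5, 0.0)
```

Each `(distance, bearing)` pair is a natural control signal: moving one flask closer to another, or spinning it in place, changes a continuous parameter that a sound patch can map to pitch, filter cutoff and so on.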
The main scene was the 'Red Queen', where I performed and the video was pre-rendered. Nevertheless, there was interaction on the audio side: sensors on the Queen and a vocoder were used to make her sound like a beehive.
All pictures by Valle Galera.