My music work focuses on creating and performing pieces that I try to make uniquely innovative, yet engaging and appealing to a broad audience, and rooted in today’s sonic environment. While I appreciate and respect styles commonly associated with academia (like jazz and classical), my music is typically embedded in the underground urban world, where new artistic trends often emerge.
After growing up leading bands and developing solo projects with diverse influences (including rock, pop, Latin-American folk and protest songs, punk, jazz, etc.), I have focused on electronic music for the last several years. My productions range from highly experimental to dance/pop tracks, but always with a unique take. The use of electronic instruments opened new creative avenues where I can concentrate on expressiveness and innovation (rather than on the instrumentalists’ abilities) and where highly sophisticated and obscure musical ideas can reach a mass audience if they have the right beat.
I view music as a direct feed to our brains, one that builds abstract, non-verbal virtual universes with their own rules and internal consistency. From this perspective, electronic instruments are just a new set of tools for producing these universes, tools with which I can easily establish an intuitive connection thanks to my scientific background.
1. The Makers of Sense project: A new type of electronic music
As Makers of Sense, we create our own alternative reality, combining unusually warm electronica, hiphop, house, trance, dub, live instrumentation and urban elements into a new style that we call UPED (Urban Post-Electronic Dance music). You will find current information and links to music, videos, and live performances in this Makers of Sense portal, and an older press kit here.
Our work process has always used technology to capture an organic performance. Most of our music starts with a jam, where we loop machines and improvise elements until we saturate the sonic space. These snapshots of musical play are then recorded as independent audio clips, which we can reprocess and rearrange to produce a song or track.
While creating our first album as Makers of Sense, Out of the Box (a review is available here), our combined musical experiences blended into a unique sound, but our production lost spontaneity as we worked with more and more machines. We thus started our current exploration, in which we seek to perform electronic music in a way that is as natural as with traditional instruments, but with the augmented capabilities that a machine-assisted performance can provide.
As we sought to interface naturally with electronic instruments, we rapidly realized that the possibilities expand greatly when any interface (keyboard, percussion, string, knobs, touch-sensitive screen, processed voice, etc.) can produce any sound, since the interface can connect to any sound engine. Furthermore, modern instruments are not bound by the physical properties that constrained their ancestors. A guitar sound no longer requires an acoustic chamber, which once dictated the shape of the instrument and how we hold it. Piano keys no longer need the spacing imposed by the mechanical constraints of their hammers and strings. After much experimenting and performing, we have converged on a setup that gives us the flexibility we require: we can use multiple interfaces to make multiple contributions to the music, always selecting the parts where we can be the most expressive. The kind of energy that this way of performing brings to the stage can be seen in this video of our Makers of Sense show at the Tomato Blast in Chicago.
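The decoupling of interface from sound engine described above can be sketched in code. The following Python toy is only an illustration of the idea (the event, router, and engine names are hypothetical, not our actual rig): a pad hit and a key press become the same kind of abstract event, which any engine can render.

```python
# A minimal sketch of "any interface, any sound engine":
# controller gestures become abstract note events, and a router
# decides which engine renders them. All names are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class NoteEvent:
    pitch: int      # MIDI-style note number (60 = middle C)
    velocity: int   # 0-127

def pad_hit(pad_index: int) -> NoteEvent:
    """A percussion pad produces the same kind of event as a keyboard key."""
    return NoteEvent(pitch=36 + pad_index, velocity=100)

def key_press(key: int) -> NoteEvent:
    return NoteEvent(pitch=key, velocity=80)

class Router:
    """Connects any interface to any sound engine."""
    def __init__(self, engine: Callable[[NoteEvent], str]):
        self.engine = engine

    def send(self, event: NoteEvent) -> str:
        return self.engine(event)

# Two toy engines rendering the same events differently.
def guitar_engine(e: NoteEvent) -> str:
    return f"guitar sample, pitch {e.pitch}"

def bass_synth(e: NoteEvent) -> str:
    return f"bass oscillator at pitch {e.pitch}"

# The same pad hit can become a guitar or a bass sound.
print(Router(guitar_engine).send(pad_hit(0)))  # guitar sample, pitch 36
print(Router(bass_synth).send(pad_hit(0)))     # bass oscillator at pitch 36
```

Because the gesture and the sound are separate objects, swapping the engine never requires changing the interface, which is the freedom the paragraph above describes.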
We have also ventured into other territories. For example, this video clip for our song Po Boy shows the landscapes of the impoverished South Side of Chicago while exploring new musical directions by combining blues and electronic music. Finally, we have embarked on music promotion as a creative performance art in itself, becoming in 2007 the first band to auction a New Year’s Eve live show on eBay.
2. A new type of live electronic music show
Our search for the immediacy and intuitive dynamics of an acoustic performance, combined with the power of electronic instruments, has led us to develop a new live show concept. We view this show as a machine-assisted performance rather than an electronic-music one. The combination of sequences and live playing of instruments and equipment is optimized so that the performed parts can display both commitment to repetitive, predictable actions and unpredictable improvised explorations. While the first may not display any musical virtuosity and could appear replaceable by pre-programmed material, we have learned that it is essential for an engaging performance. Indeed, repetitive human-played parts, such as a regular beat played by a drummer, provide multiple points of connection between the musicians and the audience. Each minuscule variation of an otherwise perfect beat or melody displays what we can best describe as the performers’ “commitment” to the predictability of the next musical phrase. It is through this commitment that the audience’s brains can lock into the same musical path, which is one of the great pleasures of the musical experience. On the other hand, the reassurance of predictability can become boring if not interrupted by surprise, which is why the performers’ interventions in our new Makers of Sense live show also include ample space for improvisation. Interestingly, machine-assisted performances can actually liberate improvisers from the virtuosity required to improvise with traditional instruments. For example, an improvised piano solo requires the dexterity to reach the notes of the chosen scale with a speed and precision that are greatly complicated by the layout of the keys on a piano and their dynamics (weight, width, displacement force, etc.).
By contrast, when playing the iPad-based synthesizer Animoog, which we use in our Makers of Sense shows, the keys that appear on the screen can be restricted to those that match the scale desired for the improvisation at a given moment, and the shape and response of the keys can be set to match any desired properties.
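The idea of scale-constrained keys can be illustrated with a few lines of code. This is a generic sketch of the concept, not Animoog’s actual implementation: given a root note and a set of scale intervals, only the pitches inside the scale are generated, so every key the improviser can touch is a “correct” one.

```python
# A sketch of scale-constrained keys: list only the pitches in a
# chosen scale, so every on-screen key fits the current section.
# (Generic illustration; not Animoog's actual internals.)

MAJOR_PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within an octave

def scale_keys(root: int, intervals, octaves: int = 2):
    """Return the MIDI pitches for `octaves` octaves of a scale."""
    return [root + 12 * o + i for o in range(octaves) for i in intervals]

keys = scale_keys(60, MAJOR_PENTATONIC, octaves=1)
print(keys)  # [60, 62, 64, 67, 69] -- C major pentatonic from middle C
```

Swapping the interval list mid-show re-tiles the keyboard for a new section, which is what frees the improviser from the fixed geometry of a piano.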
Our concept of a machine-assisted, augmented-reality performance also extends to the visual aspects of our show. For example, while in electronic music it is easy to concentrate all the instruments in a limited physical space (e.g., around a laptop or groovebox), we intentionally spread the equipment across the stage to connect physicality more closely to the resulting music. The on-stage screens are also an important part of the performance, and we use them in a unique way. Today, almost all electronic music shows use videos, visualizations, and live VJs as a backdrop for the performance. In our new show, we use videos not to compensate for the lack of visual stimulation that is common in machine-based music, but to complement and augment our actions on stage. Two video screens thus serve as on-stage characters with which we interact. They are connected to two on-stage infrared cameras placed next to each performer. Just as an electric guitar player can walk center-stage to show the audience what he or she is playing, we can show our current manipulations of the machines, when relevant, by pointing the cameras ourselves and displaying the video feed on the screens. The same screens display algorithmic art by Martin Zumaya, a Mexican digital artist and graduate student in Biology who recently started working with us. The screens also respond to impact with a sampled gong-like sound and can display animated faces that follow the live raps by Brother El. They thus become characters on stage that augment the show rather than a backdrop to the performance.
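The impact-triggered gong described above amounts to a simple threshold trigger with debouncing. The sketch below is a hypothetical illustration of that logic (sensor values, threshold, and function names are invented, not our actual patch): a reading above the threshold fires the sample once, and further readings are ignored until the signal settles.

```python
# A hedged sketch of impact-triggered sampling: a sensor reading
# above a threshold fires the gong sample once, then the trigger
# re-arms only after the ringing decays. All values are hypothetical.

def detect_impacts(readings, threshold=0.8):
    """Return the indices at which a sample would be triggered."""
    hits = []
    armed = True
    for t, value in enumerate(readings):
        if armed and value >= threshold:
            hits.append(t)       # trigger gong playback here
            armed = False        # ignore the ringing after the hit
        elif value < threshold / 2:
            armed = True         # re-arm once the signal settles
    return hits

print(detect_impacts([0.1, 0.9, 0.95, 0.3, 0.1, 0.85]))  # [1, 5]
```

The re-arm condition is what keeps one physical hit from firing the sample several times, a standard debouncing choice for any impact sensor.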
Some initial videos of our live performances can be found in this video of Makers of Sense performing Night Ride at Push Beats and in this video of Makers of Sense performing Ferris Wheel at Dragon Fly.
An as-yet-unreleased video displaying the full new show production can be seen here.
3. Artistic collaborations
In addition to the creative efforts described above, I enjoy working in collaborative art projects. Some of my recent and current collaborations are described below.
I worked with choreographer and film director Sandrine Rouxel, making the soundtrack for the French experimental film Sur le Rocher. This 10-minute film displays slow-moving human silhouettes in a natural environment of rocks on a riverbed. The shapes of the bodies and human sounds blend with the natural setting. For this soundtrack, I used only natural sounds, combined and processed electronically, to create a soundscape that is both organic and artificial. This project was funded by the French foundation G.R.E.C. (Research Group for Cinematographic Essays) and can be found here.
I also made the soundtrack for a series of videos that report on operations within the Ingress game. Ingress is an augmented-reality massively multiplayer online game created by Niantic Labs (a startup within Google) that has on the order of a million players worldwide. Top players create videos for the Ingress Report online video channel, where major operations are described. I was asked to do the music for several of these videos, some of which can be seen here: Operation Redemption and Operation Flatland.
Finally, last year a fellow Chilean musician, Igor Ledermann, raised crowdfunding to make a record with the participation of world musicians from over 21 countries. I was invited to participate in this project, contributing tracks as the only electronic musician. My challenge was to create electronic beats and textures that could blend seamlessly with the world music and acoustic elements that characterize this project. The resulting album, “Bridges, a Musical Journey,” can be heard here, and the participating musicians are listed here.