Here begins the first ‘meaty’ blog post on the MadMapper website since the launch date. I will enjoy this moment now, a short break from all the tweets, Facebook posts and emails, in order to write out an email-based interview I conducted with the artist Kit Webster. His work was included in the exhibition of the Mapping Festival this year. We saw his work and the workflow he was using and decided to make his life easier by granting him a Beta license of MadMapper. After a 5-minute tutorial he was on his way, wondering how he had ever done without it.
I had actually seen Kit’s work prior to our meeting, and it was interesting to see how it had evolved in the past year. From looking at his work, it was not surprising to me that we shared a similar viewpoint about projected light and surfaces.
The following interview with Kit was conducted in the days following the opening of the Mapping Festival.
ilan: First, can you give a brief introduction of who you are and where you are from?
Kit: I’m Kit Webster, born in 1981 and raised in Melbourne, Australia.
ilan: What is your background: photography? Animation? Video compositing?
Kit: I’m originally an electronic producer and sound designer, but I eventually defected to the visual. I studied sound art at RMIT in Melbourne, and during this degree I had the opportunity to do an elective on Max/MSP. This was taught by Robin Fox (Google him; he has a killer laser show that synchronizes lasers with live sound from Max/MSP). I was blown away by the capability of visual programming interfaces and started researching new media art around the world. Legoman’s 3Destruct installation was a very inspiring piece for me, and it’s an honor to have a work alongside Legoman and Thomas Vaquié at the 2011 Mapping Festival.
ilan: When did you first get into making art with video projections?
Kit: I started playing with projection a couple of years ago. I was experimenting with Max/MSP and vvvv and with different concepts regarding the illumination of geometric forms. I approached the RMIT gallery in Melbourne and asked if I could use the gallery during their closing period in order to prototype and film a light sculpture idea. The curator liked the work, so he put me in the schedule for a solo show. That is how I produced Dataflux, a live installation syncing sound, DMX servos, strobe and projection, running out of vvvv. After I put it online, things went nuts.
ilan: What interests you about the medium?
Kit: I am obsessed with the idea of the digital being merged with the physical, and with the illusory and sensory qualities of audiovisual installation. I like the way mapping generates a kind of hyper-realistic interpretation of space, and the ways we can produce works that not only connect digitally with space, but also take advantage of human senses and perception. I am currently developing a work that investigates the synesthetic properties of visual and sonic spectrums and their geometric relationship with space. What does this mean? Strobes, lasers, mapping, rhythm and phat subbies taken out of da club and moved to an intricately controlled gallery environment.
ilan: What was the workflow you were using for mapping projections prior to using MadMapper?
Kit: Animations are produced in Flash, arranged in After Effects, then edited and comped in Final Cut to sync with sound out of Logic, and finally mapped with After Effects. Ten layers and ten masks are produced, one for each element of the Enigmatica sculpture, which would mean an overnight render before I could see the result mapped in real time.
ilan: How has MadMapper been helpful to you and made that workflow easier?
Kit: MadMapper allows me to instantly test a visual concept over sculptural forms without having to render in After Effects. Because the visual sequences appear very differently over the sculpture and surrounding space than on the screen, I can make tighter, more sculpturally appropriate visual concepts that are based on evidence of their effectiveness. Since the sculptures I work with are large-scale, the only time I can test concepts is when I am invited to present a work, and with such short testing time available, it’s essential that my workflow is rapid.
When I travel my sculptures around to different events, I can now keep the same base sequences, fire up MadMapper and easily sync to the specifics of each site. This means less time maniacally huddled over a computer and more Kit chillaxin’. Without question, MadMapper is the equivalent of hyperdrive. It should be called Sanemapper IMO.