Two new addons for openFrameworks. Actually one is an update and major refactor, so much so that I've changed its name: ofxCocoa (was ofxMacOSX) is a GLUT-replacement addon for openFrameworks that allows native integration with OpenGL and the Cocoa windowing system, removing the dependency on GLUT. It has a bunch of features to control window and OpenGL view creation, either programmatically or via Interface Builder. http://github.com/memo/msalibs/tree/master/ofxCocoa/
The second, ofxQuartzComposition, is an addon for interfacing Quartz Composer compositions with openFrameworks. Current features include:
loading multiple QTZ files inside an openFrameworks application
rendering to screen (use an FBO to render offscreen)
passing input parameters (float, int, string, bool etc.) to the QTZ input ports
reading ports (input and output) from the QTZ (float, int, string, bool etc.)
passing images as ofTextures to and from the composition (you can currently pass images as QC Images, but you would have to manually convert them to ofTexture to interface with openFrameworks)
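The parameter/port interface described above can be pictured as a map from port names to typed values. Here is a minimal conceptual sketch in plain C++ purely to illustrate the idea; the type and method names are hypothetical and are not ofxQuartzComposition's actual API:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <variant>

// Conceptual sketch only: names here are hypothetical, not the addon's API.
// A QTZ input port accepts a handful of plain types, keyed by port name.
using PortValue = std::variant<float, int, std::string, bool>;

struct CompositionSketch {
    std::map<std::string, PortValue> inputs;   // values pushed to the QTZ
    std::map<std::string, PortValue> outputs;  // values read back each frame

    void setInput(const std::string& port, PortValue v) { inputs[port] = v; }
    PortValue getInput(const std::string& port) const { return inputs.at(port); }
};
```

From the caller's side, setting a slider value or reading a result back then amounts to a lookup by port name, whatever the marshalling to Quartz Composer looks like underneath.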
How is this different from Vade's ofxQCPlugin (http://code.google.com/p/ofxqcplugin/)?
ofxQuartzComposition is the opposite of ofxQCPlugin. ofxQCPlugin allows you to build your openframeworks application as a QCPlugin to run inside QC. ofxQuartzComposition allows you to run and control your Quartz Composition (.qtz) inside an openframeworks application.
Here, two Quartz Compositions are being loaded and mixed with openFrameworks graphics in an openFrameworks app. The slider at the bottom adjusts the width of the rectangle drawn by openFrameworks (ofRect); the 6 sliders on the floating panel send their values directly to the composition while it's running in openFrameworks.
VDMX unfortunately doesn't have this feature built-in, but fortunately has beautiful integration with Quartz Composer - allowing me to build a quad warper in QC using a GLSL vertex shader, which should be super fast.
Also, around the 4:30 mark you'll see me masking the video on the box in the back. This is also using a custom Quartz Composition which allows 4-point mask creation. Usage is almost identical to the QuadWarper, but instead of warping the image it just applies a mask, or you can invert the mask and it cuts a chunk out. You could do the same by creating new layers, grouping, using a layer mask etc., but it's a bit more hassle I think. Using the QuadMask is a lot quicker, and you can put multiple QuadMasks on the same layer to draw more complex masks.
I'm very much into creating intuitive interactivity with minimum dependency on a controlled environment - so the experience can easily be recreated elsewhere with minimal hardware & setup (which is why I generally prefer optical flow analysis over blob tracking if I can, for vision related projects). So a conversation in the vidvox forums about painting in Quartz Composer using the Wiimote, but without using the IR sensor, really sparked my interest.
My Secret Heart is a music and film installation & performance commissioned by Streetwise Opera with music composed by Mira Calix and sound design by David Sheppard. Working with video artists Flat-e, we created a film to accompany the 48 minute performance, as well as versions for an installation and short film.
Streetwise Opera are a charity who use music as a tool to help people who have experienced homelessness move forward in their lives. They run a weekly music programme, resident in 10 homeless centres around the country, and also stage an annual production which gives their performers the chance to star in quality shows where there are high expectations, no compromise and no patronising. The voices you hear in the music, and the people you see in the film, are from Streetwise workshops around the UK. 100+ Streetwise performers also sang at the My Secret Heart premiere at the Royal Festival Hall in December 2008. My Secret Heart is about their story.
The film has an abstract narrative derived from individual conversations with each of the Streetwise performers. It is a direct emotional response to their stories combined with the haunting beauty of Mira Calix's composition. Instead of focusing on a specific plot, the film embarks on a complex journey through various states of emotion, starting from pre-birth through birth, curiosity, exploration, excitement, playfulness; through to fear, anxiety and isolation. While it maintains a relatively dark and eerie mood overall, intertwined with the feelings of desperation are strong elements of hope.
The visuals were designed and created primarily with custom software written in C++/openFrameworks, with some Quartz Composer elements, rendered After Effects sequences and live action footage. The custom C++ app is audio-reactive and user-interactive, allowing the visuals to be 'performed' live with full control over the behaviour of the virtual inhabitants of the cylindrical aquarium-like rig.
Over the course of a few months, and after many conversations with Mira Calix and listening to the soundtrack over and over and over again, we decided roughly what the visuals should do and what kind of behaviours we wanted them to perform at specific points in the song. After a lengthy coding period, I had an application that, when you ran it, did... nothing - but it had the potential to do everything I wanted. The application was a live performance tool with full control over its environment as well as audio playback and control, and an input recording / playback system.
Once the application was complete, I sat down with Robin from flat-e and pressed 'play' on the app - this started the music playback and the physics recorder. While the music was playing we could control the inhabitants of the virtual world with many sliders, knobs, touchpads, the mouse etc., responding in realtime by sending messages to make them move gracefully or erratically, flock together, swim apart, get excited, slow down, speed up; telling them to die, slowly start twitching, come alive, swim to the surface, sink to the bottom etc. Because our actions were being recorded, we could later go back, scrub to certain positions in the song, and overdub and mix in new behaviours we might have missed in the first round. In the end we found that we actually had to do little to no editing - the best overall performance was the one we recorded in a single 50-minute take.
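The record/scrub/overdub system described above can be sketched as a timestamped event track: store every control change as a (time, parameter, value) event, and reconstruct the state at any scrub position by replaying events up to that time. A rough illustrative sketch with hypothetical names, not the actual app's code:

```cpp
#include <algorithm>
#include <cassert>
#include <map>
#include <string>
#include <vector>

// One recorded control change: which parameter was set, to what, and when.
struct Event { double time; std::string param; float value; };

struct InputRecorder {
    std::vector<Event> track;  // kept sorted by time

    // Record a change; out-of-order times (overdubs) merge into the timeline.
    void record(double t, const std::string& param, float value) {
        Event e{t, param, value};
        auto pos = std::upper_bound(track.begin(), track.end(), e,
            [](const Event& a, const Event& b) { return a.time < b.time; });
        track.insert(pos, e);
    }

    // State of every recorded parameter at time t (i.e. scrubbing).
    std::map<std::string, float> stateAt(double t) const {
        std::map<std::string, float> state;
        for (const Event& e : track) {
            if (e.time > t) break;       // track is sorted, nothing later matters
            state[e.param] = e.value;    // later events override earlier ones
        }
        return state;
    }
};
```

Because the track is just data, an overdub pass is simply recording more events into the same timeline, which is what makes the "go back and mix in new behaviours" workflow possible.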
The sensation of performing and recording the visuals was that of actually directing a film with thousands of virtual actors, commanding an army, digital puppetry - an approach I'm sure I will be revisiting in the very near future.
Early tests of visuals on the 'aquarium' (rig built by Gaianova):
This is an 'early current state of app' demo for a multi-discipline event I'm working on with Streetwise Opera, Mira Calix and fellow visualists Flat-e, to be showcased at the Royal Festival Hall later this year with quite a few more venues lined up.
The app was written in Processing 0135 and runs in realtime at 60fps, though if I add another couple of hundred eels it does drop, so I may switch to openFrameworks if performance becomes an issue (which it probably will). The occasional freezes in the video happened while capturing the screen, which is a bit annoying.
I'm controlling the eels using the mouse, keyboard and Quartz Composer (just simple sliders sending OSC to vary some parameters - similar to the 'magnetic force fields' video - I'm quite into this technique now, very quick and easy to setup, and you can have loads of sliders with descriptive names at your disposal to play with, and adjust your internal variables in realtime for tweaking heaven).
The final show will have many many more features, both in the digital realm, and physical... more info coming soon...
This is a demo of creating and visualizing magnetic (kind of) fields in Processing and controlling with a tangible multitouch table and Quartz Composer. It gets more interesting after the 1 minute mark :P
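For the curious, the "magnetic (kind of) fields" behave roughly like point attractors whose pull falls off with the square of distance. A toy sketch of that force calculation - constants and names are illustrative, not the demo's actual code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// A point source in the field; negative strength repels instead of attracts.
struct Magnet { Vec2 pos; float strength; };

// Sum the pull of every magnet on a point p, falling off as 1/d^2.
Vec2 fieldAt(const std::vector<Magnet>& magnets, Vec2 p) {
    Vec2 f{0.0f, 0.0f};
    for (const Magnet& m : magnets) {
        float dx = m.pos.x - p.x, dy = m.pos.y - p.y;
        float d2 = dx * dx + dy * dy + 1e-4f;          // soften to avoid blow-up
        float inv = m.strength / (d2 * std::sqrt(d2)); // 1/d^2 along unit direction
        f.x += dx * inv;
        f.y += dy * inv;
    }
    return f;
}
```

Each particle then just integrates this force every frame; the sliders (sent over OSC) would map onto things like each magnet's strength and position.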
I was sitting minding my own business, exploring accumulation buffers in Quartz Composer, when all of a sudden I just zoned out, and next thing I know, I found myself staring at Him on my screen. His Noodly Appendages came down and touched me, and guided my hands, connecting Quartz Composer's very own noodles in His Image.
His Noodly Screen Saver runs on Mac OS X 10.5 (Leopard) and you can download it below.
So I created the attached test composition and found some surprising results (at least I found them surprising, though in retrospect I can understand why :P).
First of all, all 3 methods are pretty quick and are unlikely to be a bottleneck. Unless you are using the operations in an Iterator patch with a lot of iterations, you won't notice any difference.
The figures below are for a 2nd-gen 2.33GHz MacBook Pro:
Quartz Composer is a great piece of software for many things. It has a lot of features which really allow you to create amazing things very quickly. It also has some 'features' which allow you to lose your hair very quickly. One of these 'features' is loading images from within an iterator.
You'd think it would be quite straightforward: just send a different string (either generated within the iterator or loaded from XML etc.) to the Image Downloader, but alas QC has other plans. It always loads the same image, whatever string you send it!
Aldeburgh Music is an organization based in Suffolk, UK working with musicians - both professional and just starting out - to help them reach their full potential by providing them with the time and space to discover, create and explore - as well as providing inspirational scenery and a rich musical heritage.
The New Music New Media / Britten–Pears Programme offers advanced performance experience to young professional musicians in the inspiring surroundings of Snape Maltings, home of the Aldeburgh Festival founded by Benjamin Britten in 1948.
A test in motion detection in Quartz Composer 3.0.
The music is all generated in realtime by me waving my fingers, hands and arms around (or in fact any motion) in front of a standard webcam. No post-processing was done on the audio or the video.
The concept is by no means new, but still fun - and I'm quite happy with this implementation. I'm using a very simple frame-difference technique and generating MIDI notes based on wherever there is movement (actually, as QC3 cannot send MIDI notes, I had to send the data as OSC and use OSCulator to forward it as MIDI).
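If you want to try the same idea, the frame-difference core really is tiny: compare the current frame against the previous one and trigger wherever the change passes a threshold. A minimal sketch on grayscale pixel arrays - names are illustrative, and mapping a cell index to a note number (or OSC message) is up to you:

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Return the indices of pixels (or grid cells) where the brightness change
// between the previous and current frame exceeds the threshold.
std::vector<int> motionCells(const std::vector<int>& prev,
                             const std::vector<int>& curr,
                             int threshold) {
    std::vector<int> cells;
    for (size_t i = 0; i < curr.size(); ++i) {
        if (std::abs(curr[i] - prev[i]) > threshold)
            cells.push_back(static_cast<int>(i));  // e.g. note = base + cell
    }
    return cells;
}
```

In practice you'd downsample the camera image to a coarse grid first, so each cell spans enough pixels to be robust against noise.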
I was at the Adobe AIR Tour 2008 in London today, and while most of it was very interesting, there were moments that were rather dry... so in my boredom I created this little Quartz Composition.
Last fall I was at Flash on the Beach in Brighton, and Craig Swann was demonstrating some amazing things he'd been doing in Flash and Max MSP/Jitter. I remembered one of them today and decided to try and create something similar in Quartz Composer, the results are quite fun.
Basically it captures a very narrow slice of video input (in this case my webcam) every frame and creates a picture out of lots of slices from different times, essentially capturing a segment of time in one picture. The composition here can do it horizontally or vertically. You could try radial and other methods too.
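The core of the technique is simple to sketch: keep an output image and, for every incoming frame, copy one column of that frame into the next column of the output, so column x of the result comes from frame x. A minimal grayscale version - illustrative only, since the composition itself does this with QC image patches:

```cpp
#include <cassert>
#include <vector>

// A grayscale image stored row-major: px[y * w + x].
struct Image {
    int w, h;
    std::vector<int> px;
};

// Copy column srcCol of the incoming frame into column outCol of the output.
// Calling this once per frame with an advancing outCol builds the slit-scan.
void addSlice(Image& out, const Image& frame, int outCol, int srcCol) {
    for (int y = 0; y < out.h; ++y)
        out.px[y * out.w + outCol] = frame.px[y * frame.w + srcCol];
}
```

The vertical variant is the same idea with rows instead of columns, and a radial version would copy the slice along a rotating line through the centre.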
Leopard only I'm afraid...
** Update **
Barth informs me that this is called slit-scanning, and lots of people have done amazing things with this technique! More info at Golan Levin's Slit-scanning page.
In this previous post I mention creating data-source plugins or processing existing data-sources for VDMX using Quartz Composer. The specific example I give in that post isn't that useful though; it simply computes 1 - x^2.
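For reference, that example mapping really is a one-liner - shown here in C++ rather than as QC patches:

```cpp
#include <cassert>

// Map an input level x (nominally 0..1) to 1 - x^2, as in the example
// data-source processor mentioned above.
float process(float x) { return 1.0f - x * x; }
```

The point of the post isn't this particular curve, of course - it's that any function of an existing data source can be wired up the same way.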
This is a little test using GLSL in Quartz Composer 3.0, and controlling via VDMX. All happening in realtime and completely audio-reactive with no post production or timeline animations etc. The potential is humongous and very exciting!!
Soundtrack "Caliper Remote" by Autechre (from LP5 - 1998)
Who needs autechre when you have a bunch of mad girls!
P.S. I have hours of footage of this if anyone is interested :P