I would love to say that everything we worked on for the Immerse project went 100% as planned on the first try. We had several problems. Here are a few of them and how they were "solved."
- Short Throw Projectors
Our original plan was to place the projectors on the ground in front of the wall. I knew we would have to use short throw projectors because of the image size we needed to cover the wall. A projector with a standard lens would have needed to sit 15 to 20 feet (4.6m to 6.1m) from the wall. That would not have left much room on the stage for the band, and it would have greatly increased the chance of someone walking between the wall and the projectors. After doing some research, I settled on the NEC UM351W.
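The distance math above comes down to throw ratio (distance divided by image width). Here's a quick Python sketch of it; the specific ratios and the 10-foot image width are illustrative assumptions, not the NEC UM351W's published specs:

```python
# Throw ratio = throw distance / image width. Given a target image width
# and a lens's throw ratio, this returns how far back the projector sits.
def throw_distance(image_width_ft: float, throw_ratio: float) -> float:
    """Distance (ft) the projector must sit from the wall for a given image width."""
    return image_width_ft * throw_ratio

# A standard lens often lands around a 1.5-2.0 throw ratio (assumed here),
# so a 10 ft wide image needs roughly 15-20 ft of floor -- the range above.
for ratio in (1.5, 2.0):
    print(f"ratio {ratio}: {throw_distance(10, ratio):.0f} ft")

# An ultra-short-throw lens (ratio ~0.35, also assumed) makes the same
# image from well under 4 ft.
print(f"ratio 0.35: {throw_distance(10, 0.35):.1f} ft")
```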
The problem with this particular projector is that the lens is on the back of the unit and the image bounces off a mirror there. That means the image has to shoot across the top of the projector, and it is offset: for every little bit you back the projector away from the wall, the image rises higher off the floor.
We really needed the image to go all the way to the floor, so we tried tilting the projector down until it did. But that distorted the image so much that the bottom edge was extremely narrow, and to compensate we had to move the projectors 10 feet (3.05m) or so back out onto the floor. That defeated the purpose of using short throw projectors in the first place.
Calvary has 1.5" steel pipe running across the top of their stage wall. We thought about cantilevering the projectors from there, but that would have blocked the church's main projection screens, which they use for song lyrics and message points. We ended up building a "goalpost" out of truss, hanging the projectors from it angled downward, and using MadMapper to correct the keystone. That got us through the event. Since then, we've decided to mount two higher-lumen, standard lens projectors from the overhead catwalk. We'll still use MadMapper, but there won't be any sight line issues.
- Software Issues
After a few days of figuring out all of those logistics, it was time to load video into VDMX and go to town building scenes. The Friday before the Sunday event, projectors were up, test scenes worked beautifully. It was a great day. I came in Saturday morning to start building the show.
Late Saturday afternoon, I was running some parts of the show to test it. That's when I noticed the video clips were lagging and freezing. I was pretty bummed. I tried several videos. Same thing. I re-built the user interface in VDMX. The video clips were still freezing. I decided to run the Activity Monitor in Mac OS X to see what was going on. That's when I noticed the computer's CPU usage was spiking close to 150%. (I feel like this might have been the only time in my life the coaching analogy of giving 110% might have actually been possible.)
I have done some research and forum crawling since I initially started writing this post. Apple's Activity Monitor actually reports CPU usage relative to a single core, so the maximum reading is 100 x (number of cores). If you have a dual-core machine, Activity Monitor (or, more accurately, your computer) has up to 200% CPU to use. A quad-core machine could use up to 400%.
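As a sanity check on that math, here's a small Python sketch. The core counts and the 150% reading are just the numbers from this story; the helper names are my own:

```python
# Activity Monitor's ceiling is 100% per core, so a reading can exceed 100%
# on any multi-core machine. These helpers convert between the two views.
def max_cpu_percent(cores: int) -> int:
    """Highest figure Activity Monitor can show: 100% times the core count."""
    return 100 * cores

def whole_machine_percent(reading: float, cores: int) -> float:
    """Convert a per-core-relative reading to a 0-100% whole-machine figure."""
    return reading / cores

# A 150% spike sounds alarming, but on a quad-core machine (assumed here)
# it is only 37.5% of the machine's total capacity.
print(max_cpu_percent(2))                  # dual-core ceiling
print(max_cpu_percent(4))                  # quad-core ceiling
print(whole_machine_percent(150, 4))       # the spike, normalized
```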
In an earlier post in this series I mentioned that VDMX uses the computer's graphics card to handle all of the video rendering. Or it typically tries to. After switching to some different file types, mainly ISF (Interactive Shader Format) and Quartz Composer (these file types use code to generate video content or effects on the graphics card), I noticed the CPU usage dropped well below 50%.
Our original plan was to trigger all of the video effects from Ableton Live, the sequencer the band was using to run its backing tracks. It took some time to figure out all of the video issues, so we didn't have time to link VDMX to Ableton Live and build those sequences. For the show I ended up wearing my own set of in-ear monitors so I could hear the click track, and I triggered all of the clips and effects in real time. Not the most ideal situation, but it worked.