One of the things we’re most interested in exploring at WonderTek Labs is the space where art and audience intersect, and nowhere is that more powerful than at the intersection of music, immersive theater technique, and emerging technology. Since we started WonderTek Labs, we’ve been thinking deeply about how these things might fit together in a future where technology, and audiences’ ability to interact with the music artists create, begins to shift what music even means.
All art exists in three spaces: first within the mind of the artist; second within its physical manifestation as the work the artist creates; and third in the space where that work intersects with each individual who experiences it. As emerging technologies in 360 video, virtual reality, augmented reality, extended reality, and artificial intelligence continue to expand and evolve, we are going to bear witness to a complete reimagining of what music and art mean. We at WonderTek Labs are working with highly creative music clients at the forefront of cutting-edge (in some cases bleeding-edge) technology to help shape this future.
A Case Study: In 2016 we shot one of the first 360 3D stereoscopic music videos, for Seattle indie-rock legends The Posies. We shot it with a Google Jump rig from Two-Bit Circus, with their excellent and experienced crew, in a quick five-hour shoot in a warehouse in downtown Los Angeles.
While I wouldn’t recommend relying on “fix it in post” as a storytelling method in 360, in this case that’s what we ended up doing. We had zero time to storyboard anything out, we were working with artists who had never even seen a VR or 360 demo before that day, and even our highly skilled tech crew was inexperienced with this particular camera, which had arrived only the day before. We were also severely crunched for time. Not what I would call an ideal shooting situation, if I were planning one, but we managed to get out of it with 32 takes of the complete song, shot with the camera stationary for each take and placed in four different locations to capture different perspectives.
We coached the band through the shoot and shot each take blind, with Nathaniel hiding under a table on set. We didn’t have a monitor connected to the camera or any way to review takes on the fly, so we simply got as many full takes of the song as we could, directing the band and making adjustments after each one. On the last couple of takes we broke “the rules” and directed the band to move close to the camera and look straight into it, as if looking directly into someone’s face.
Two-Bit sent all the footage off to Google, and a week or so later we had nine .mp4 files with 32 takes of the entire song and the band moving around the set. It was cool, it looked great … but it wasn’t telling a story. Two-Bit wasn’t available to help with the edit, so I decided to use the project to push my own Adobe Premiere editing skills as far as I could. At the time, my Photoshop skills were what I’d consider high-intermediate, and I’d been intimately involved with many film edits in Premiere from behind my editors’ shoulders, but I’d never actually worked in Premiere myself.
I was able to get up to speed on the basics very quickly. Since the song ruminates on loss and the memories you keep of someone you’re missing, I decided to approach the edit by weaving a more universal visual tale about feeling lost and missing someone, while taking advantage of the 360 space to tell it. I ended up layering and compositing many takes together, along with some stock 360 footage I pulled from VideoBlocks, and worked primarily with Premiere’s native filters and tools to give the video a dreamy, ethereal, ghostlike quality.
Tim Dashwood of Dashwood 360 swooped in to help us out by adding a few key elements: a secondary 2D video loop, placed on a large white screen in our shooting location, that conveys the protagonist’s inner thoughts; smoke-like wisps of words floating across the space in front of you; a tweak to the stock footage that makes it appear embedded in the walls rather than layered on top of them; and our logo added at the zenith and nadir. We’re working with the band to figure out their release strategy for this video, and we hope to make a second 360 video with them later this year.
WonderTek Labs has several creative 360 and VR projects lined up for 2017, from straight-up 360 music videos for indie bands, to live performance captures, to some killer, groundbreaking projects where we’re integrating technology and immersive theater technique into live music performance in ways that will push hard against the limits of current tech and, we hope, inspire others working in music and VR. As we’re able to talk about these projects, we’ll post updates about them here.