

I recently invested in an HTC Vive for the purposes of exploring Virtual Reality. I started working on stereoscopic compositing with the Car Dynamics project and, honestly, it was amazing and fun. Being able to hear people ooh and ah over it and ask to watch it again and again was a real treat.

I haven’t posted lately because we moved back into our house, after house-surfing for a few months, post halfBound PCT. Now that we’re here, we’re looking to spruce the place up a bit and make it more homey.

That means hanging indirect lighting and doing some painting and tearing up carpet, that kind of thing. It’s been a challenge because my allergies are on fire in this space right now and I’m not sure why.

Anyhow, Brandie and I decided to start with the bathroom. Initially, I suggested doing a composite or 3D rendering of the room and seeing what different shades might do to the light in there.

Yes, I am that kind of geek.

As I started work, I got more excited and expanded the scope, so I changed my overall intent and decided to see if I could recreate the room convincingly in VR. So here it is: my first proper project of 2017.


Capture images and video of a single small space (in this case, my bathroom) and recreate it in Stereoscopic Virtual Reality.

If that is a success, I’d like to move on and begin fleshing out the room in full 3D, possibly moving it to the Unity Engine to see if I can bake textures in and start creating a virtual experience.



I recently applied for a job as a stereo compositor. What was cool is that the shop sent me a couple of tests to work on, including some raw footage from a 6-camera GoPro Hero setup. I used that footage and the CaraVR toolset (trial edition, clock’s ticking, since I missed the education license window while on the trail – sad, so sad!) to create my first 360 stitch with stereoscopy. It was really amazing to be able to do the work and I learned a metric ton doing it.

After some major cogitation, I figured that I could do the same with my Ricoh, as long as I didn’t have any movement in my scene, by turning the camera a couple of times to generate the stereoscopy, since the camera will be fixed. That should allow me to solve the scene and work with the virtual cameras in Nuke to create stereoscopic depth right out of the chute.
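For intuition on how much depth that two-position trick can buy, here’s a back-of-the-envelope sketch (my numbers, not from the original workflow) of the angular disparity a stereo pair produces, assuming a human-like ~65 mm baseline between the two camera positions:

```python
import math

def stereo_disparity_deg(baseline_m: float, depth_m: float) -> float:
    """Angular disparity (degrees) between the left and right views of a
    point at depth_m, for two camera positions baseline_m apart."""
    return math.degrees(2 * math.atan(baseline_m / (2 * depth_m)))

# Assumed numbers: ~65 mm interaxial, small-bathroom depths of 1 m and 3 m.
near = stereo_disparity_deg(0.065, 1.0)  # near wall
far = stereo_disparity_deg(0.065, 3.0)   # far wall
print(f"near: {near:.2f} deg, far: {far:.2f} deg")
# → near: 3.72 deg, far: 1.24 deg
```

The takeaway: nearby surfaces produce several times the disparity of distant ones, which is part of why a small room is a forgiving first subject for this kind of capture — almost everything is close enough to register real depth.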

The stretch goal of moving to Unity is likely too big for this project and will be part of the next phase. However, I would like to see if I can use Maya and Nuke to create some geometry and project onto it in the virtual environment, adding as much parallax as I can.

At the end of the day, I want the room to be as immersive as possible, given the constraint that the point of view will be somewhat fixed. If slight movement doesn’t break the spell or feel “unreal,” then I’m on to something.


A VR movie that I can post on YouTube and allow folks to look around with their 3D setup, be it a full Head Mounted Display (HMD) or something like Gear VR or Google Cardboard. Other ways to view would be with any of the various 360 video viewers for the Vive, like Vive Cinema or Whirligig.

If the process is successful, I should be able to generate a number of these videos, each utilizing a different lighting scheme in the room.

Stretch Goal: I’d also like to be able to get the environment into Unity somehow and experience it there, but again, that’s likely the next step in the process.


I’m giving myself a week to work on the project, from video capture to camera solve, to composite and possible 3D modeling & projecting, to exported video(s). Wish me luck. This’ll be fun!


