Creating a virtual reality demo video usually means just capturing what the HMD displays, and from that footage alone it's hard to tell what's going on or what the project actually does. Then I read an article from Dario Laverde about “Mixed Reality” capture, and it gave me an idea.
Here is an excellent resource from Kert Gartner, who produced the Mixed Reality trailers for Fantastic Contraption and Job Simulator. What if I could make this kind of video too, but using WebVR? So I embarked on another journey into Mixed Reality capture using an HTC Vive, a green screen, and a regular webcam, with the goal of finding the cheapest way to do it without special rigs.
Chroma Key/Green Screen
From Wikipedia: Chroma key compositing, or chroma keying, is a special effects/post-production technique for compositing (layering) two images or video streams together based on color hues (chroma range). The technique has been used heavily in many fields to remove a background from the subject of a photo or video – particularly the newscasting, motion picture and video game industries. A color range in the foreground footage is made transparent, allowing separately filmed background footage or a static image to be inserted into the scene. The chroma keying technique is commonly used in video production and post-production.
https://en.wikipedia.org/wiki/Chroma_key
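To make the idea concrete, here is a minimal sketch of chroma keying in JavaScript, operating on a flat RGBA pixel array like the one returned by a canvas `getImageData().data`. The key color and distance threshold are illustrative values, not taken from this project:

```javascript
// Pixels whose RGB values are close to the key color (by Euclidean
// distance) have their alpha set to 0, so the background shows through.
function chromaKey(pixels, key = [0, 255, 0], threshold = 100) {
  const out = pixels.slice(); // don't mutate the input frame
  for (let i = 0; i < out.length; i += 4) {
    const dr = out[i] - key[0];
    const dg = out[i + 1] - key[1];
    const db = out[i + 2] - key[2];
    if (Math.sqrt(dr * dr + dg * dg + db * db) < threshold) {
      out[i + 3] = 0; // make this pixel fully transparent
    }
  }
  return out;
}
```

Real compositing tools do this in a perceptual color space with soft edges and spill suppression, but the principle is the same: classify each pixel by how close it is to the key color.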
According to https://aframe.io/:
"A-Frame is a web framework for building virtual reality experiences. It was started by Mozilla VR to make WebVR content creation easier, faster, and more accessible. A-Frame lets you build scenes with just HTML while having unlimited access to JavaScript, three.js, and all existing Web APIs. A-Frame uses an entity-component-system pattern that promotes composition and extensibility. It is free and open source with a welcoming community and a thriving ecosystem of tools and components."Chrome Experimental Build
I recently found out that A-Frame supports the HTC Vive through this Chrome experimental build, so I downloaded it and followed the instructions to set it up. Then I tried this sample app from aframe.io:
And it was awesome. It's open source, so you can read the code, and it's very easy to follow. The key is adding these two entities to the scene:
<a-entity hand-controls="left" aabb-collider="objects: .cube;" grab></a-entity>
<a-entity hand-controls="right" aabb-collider="objects: .cube;" grab></a-entity>
Inside the Vive, you can actually see your controllers, grab boxes, or paint. I was hooked. Time to learn what I could do with this technology.
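For context, the controller entities above sit inside an `<a-scene>` roughly like this. This is a minimal sketch: the `.cube` box is my assumption, added so the `aabb-collider` selector has something to match:

```html
<a-scene>
  <!-- Tracked Vive controllers; the grab component picks up
       anything the aabb-collider reports as intersected. -->
  <a-entity hand-controls="left" aabb-collider="objects: .cube;" grab></a-entity>
  <a-entity hand-controls="right" aabb-collider="objects: .cube;" grab></a-entity>
  <!-- A grabbable box matching the "objects: .cube" selector. -->
  <a-box class="cube" position="0 1 -2" color="#4CC3D9"></a-box>
</a-scene>
```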
I also found these projects: https://github.com/derickson/aframe-play
This project adds a spectator component: a third-person camera that renders the scene from a separate viewpoint. I also found another A-Frame component that enables multi-user experiences.
It comes with a Firebase broadcast component for multi-user experiences out of the box, syncing entities' component data to a Firebase realtime database. The parent-child relationships between entities are maintained as well, as long as all entities in the hierarchy have the broadcast component attached.
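Based on that description, wiring it up might look something like this. This is a hypothetical sketch: the `firebase` scene attribute, the `broadcast` component name, and the placeholder config values are illustrative, so check the component's README for the actual attribute names:

```html
<!-- Sketch: sync the head and hands to Firebase so another tab
     (the spectator camera) can mirror their positions. -->
<a-scene firebase="apiKey: YOUR_API_KEY; databaseURL: YOUR_DATABASE_URL">
  <a-entity id="head" camera broadcast></a-entity>
  <a-entity hand-controls="left" broadcast></a-entity>
  <a-entity hand-controls="right" broadcast></a-entity>
</a-scene>
```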
Another project I found is a snowman with a particle system: a virtual reality snowman with falling snow. Then I looked around and found yet another project where you can throw a ball inside the Vive using a physics system.
So I combined all three projects and came up with this:
Download the Chrome experimental build that has the HTC Vive bindings, then create two tabs:
First Person Controller
The first-person controller uses the HTC Vive HMD and controllers.
3rd Person Camera
The third-person camera connects to your webcam, detects the green-screen background, and replaces it with the snow globe scene. The tricky part of this process is aligning the webcam view with the 3D objects that correspond to the first-person controller (the first-person HMD head is represented as a cube).
The magic happens in this code https://github.com/rondagdag/aframe-snow-play/blob/master/camera.html
<a-entity position="-1.48 -0.6 0.90">
<a-camera id="head" mixin="avatar-head" wasd-controls fov="68.5" look-controls="hmdEnabled:true;" active="true" spectator="fps:30;
specDiv:#spectatorDiv;
camVideo:#videoElement;
compDiv:#composite;
gaussBlur:true">
</a-camera>
</a-entity>
Make sure that the fov attribute matches the field of view of your physical camera so the two views align properly. I looked up my camera's specifications, and its horizontal field of view is 68.5 degrees.
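If a camera's datasheet lists only the sensor size and focal length rather than the FOV itself, the horizontal FOV can be derived with a little trigonometry. The numbers below are illustrative examples, not my camera's actual specs:

```javascript
// Horizontal field of view (degrees) from sensor width and focal
// length, both in the same unit (typically mm):
//   fov = 2 * atan(sensorWidth / (2 * focalLength))
function horizontalFov(sensorWidthMm, focalLengthMm) {
  const radians = 2 * Math.atan(sensorWidthMm / (2 * focalLengthMm));
  return radians * (180 / Math.PI); // degrees, as A-Frame's fov expects
}

// Example: a 36mm-wide sensor with an 18mm lens gives exactly 90 degrees.
horizontalFov(36, 18);
```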
I also specified the position of the camera, which is "-1.48 -0.6 0.90". I got these numbers by placing one of the HTC Vive controllers right next to the webcam and then reading the controller's stored position from the Firebase database.
It won't be perfect, but you can get it close enough; I still had to rotate the view to align it. The lighting isn't great either, since this is only a proof of concept. To capture the video, I used OBS Studio.
OBS Studio
OBS Studio is free and open source software for video recording and live streaming. You can download it and start streaming quickly and easily on Windows, Mac, or Linux. All I did was capture the Chrome window.
Future Improvements
So far this is the cheapest way I've found to capture Mixed Reality videos on the Vive. The project could be improved by replacing Firebase as the communication channel between the two Chrome windows with a shared worker or a local server communicating via socket.io, which would reduce the latency of sending first-person movements to the capturing camera.
If this project made you interested in developing with virtual reality, WebVR, or A-Frame, please click the "respect project" button and follow me. Thanks! Contact me if you have questions.