Project Hindsight was a semester-long project I worked on at Carnegie Mellon University's Entertainment Technology Center in Pittsburgh during the Spring of 2017. It blended live-action 360° film with 3D assets in VR to tell an emotional story about bad driving habits and the themes of choice, consequence, guilt, and responsibility. It was developed for the Oculus Rift with the Oculus Touch controllers.
Finalist at CHI PLAY 2017 (Amsterdam)
Published at World Design Summit 2017 (Montreal) https://www.worlddesignsummit.com/talk/?id=1162
Developed Using: Unity, C#, Autodesk Maya, Python, PyQt5
Team size: 6
Role: Programmer / Technical Artist / Sound Designer
Apart from the important task of integrating the Oculus Touch to behave the way the main experience required, I worked on several other tools and shaders for Hindsight. The most noteworthy of them are described below.
We needed a variety of tools to simplify the pipeline and work around other Unity-related problems, ranging from Maya tools to simple batch scripts that made video conversion easy.
The sphere cut tool was developed for a prototype version of the experience that required spherical-section-shaped colliders.
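The core geometric test behind a spherical-section collider can be sketched as follows. This is a minimal illustration, not the project's actual tool: it assumes the section is defined by a center, a radius, an axis direction, and a half-angle, all of which are hypothetical parameter choices.

```python
import math

def in_spherical_section(point, center, radius, axis, half_angle_deg):
    """Test whether a 3D point lies inside a spherical section:
    within `radius` of `center`, and within `half_angle_deg` of `axis`."""
    # Vector from the section's center to the point, and its length.
    dx = [p - c for p, c in zip(point, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist > radius:
        return False          # outside the sphere entirely
    if dist == 0.0:
        return True           # the center is always inside
    # Angle between the point's direction and the section's axis.
    alen = math.sqrt(sum(a * a for a in axis))
    cos_theta = sum(d * a for d, a in zip(dx, axis)) / (dist * alen)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # guard acos against rounding
    return math.degrees(math.acos(cos_theta)) <= half_angle_deg
```

A mesh-generation tool like the sphere cut tool would use the same radius-plus-angle description to decide which vertices belong to the section.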
Looking at our video conversion process, I realized we desperately needed to automate parts of it, so I built the VideoTester tool. On the surface it was very simple: video conversion and VR viewing happened with a double click. All that was needed was to fill a text file with the necessary file names and run a .bat script. The script told the Unity project which videos were new, and on the next run of the build the project used the newly converted videos. This at least doubled our iteration speed.
The tool had two key components: a Python script that used ffmpeg to convert the videos, and a .bat script that ran the Python script so non-programmers would not have to open a terminal to execute anything.
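A minimal sketch of what such a Python script could look like. The list-file format, the `Converted` output folder, and the specific ffmpeg flags here are illustrative assumptions, not the project's actual values:

```python
import subprocess
from pathlib import Path

def read_video_list(list_file):
    """Read the names of videos to convert, one per line, skipping blank lines."""
    text = Path(list_file).read_text()
    return [line.strip() for line in text.splitlines() if line.strip()]

def build_ffmpeg_command(src, dst_dir):
    """Build an ffmpeg invocation that re-encodes src into dst_dir as H.264 MP4.
    (Hypothetical flags; the real tool's encoding settings are not documented.)"""
    dst = str(Path(dst_dir) / (Path(src).stem + ".mp4"))
    return ["ffmpeg", "-y", "-i", src,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", dst]

def convert_all(list_file, dst_dir):
    """Convert every video named in list_file, stopping on the first failure."""
    for name in read_video_list(list_file):
        subprocess.run(build_ffmpeg_command(name, dst_dir), check=True)
```

The .bat wrapper then only needs a single line such as `python convert_videos.py`, which is what let non-programmers run the pipeline with a double click.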
A few custom shaders were developed for specific purposes in the project: an unlit material that supported shadows and transparency at the same time (required for the sphere onto which the 4K footage was projected, to simulate the inside of a car), and a skybox blender needed for our night-to-day transition scene.
The Skybox Blender is the one seen in the video below. The “sun” and the lens flares are separate from the skybox blender itself. A simple function drives the transition; it was passed to our event manager to ensure it was called at the right time.
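The transition function's job is just to map elapsed time to a 0-1 blend factor between the two skyboxes. A minimal sketch of that mapping (the smoothstep easing is an assumption; the source only says "a simple function"):

```python
def skybox_blend(elapsed, duration):
    """Map elapsed seconds to a 0-1 factor for blending night into day.
    Clamps outside the transition window, eases with smoothstep inside it."""
    t = max(0.0, min(1.0, elapsed / duration))
    return t * t * (3.0 - 2.0 * t)  # smoothstep: gentle start and finish
```

In Unity this factor would typically be fed each frame into a shader property that lerps between the two skybox textures.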
The unlit shader with shadows and hard-threshold transparency gave our 360° car video “windows” to look out of into the CG world we made.
View more information here: http://www.etc.cmu.edu/projects/hindsight/