Making Virtual Reality

Jaehee Cho & Ralph Vituccio – VR for Good: Unpacking the social effects agenda in VR

Daniel Cross – Can 2D images work in 3D VR environments?

Robin Mudge – OneShot VR storytelling techniques for documentary production

VR for Good: Unpacking the social effects agenda in VR

Injustice is a five- to seven-minute interactive virtual reality experience themed around racially
motivated police brutality. In Injustice, guests witness an act of racial discrimination happening in
front of them, forcing them to make moral and ethical decisions on the spot. The guest comes
face to face with the characters of the story, filmed in live action, and interacts with them directly
in the space using gaze interaction and voice recognition. Injustice is an experience aimed at
exploring the emotional impact of VR space versus traditional film.
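Gaze interaction of the kind described is commonly implemented as a dwell timer on the headset's forward direction: the viewer "selects" a character by looking at them continuously for a few seconds. The following is a hypothetical minimal sketch of that general pattern, not Kalpana's actual implementation; the threshold values and class names are illustrative assumptions.

```python
import math

DWELL_SECONDS = 2.0   # how long the gaze must rest on a target to select it
ANGLE_DEGREES = 10.0  # cone half-angle that counts as "looking at" a target

def angle_between(a, b):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

class GazeTarget:
    def __init__(self, direction):
        self.direction = direction  # unit vector from viewer to target
        self.dwell = 0.0
        self.selected = False

    def update(self, gaze_direction, dt):
        """Call once per frame with the headset's forward vector."""
        if angle_between(gaze_direction, self.direction) <= ANGLE_DEGREES:
            self.dwell += dt
            if self.dwell >= DWELL_SECONDS:
                self.selected = True
        else:
            self.dwell = 0.0  # gaze moved away; reset the timer

# A character straight ahead; the viewer stares for just over 2 s at 60 fps.
target = GazeTarget((0.0, 0.0, 1.0))
for _ in range(121):
    target.update((0.0, 0.0, 1.0), 1.0 / 60.0)
```

Resetting the dwell timer whenever the gaze leaves the cone is the usual guard against accidental selections as the viewer's head sweeps past a character.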

Winner of a CHI PLAY 2016 award, Injustice was created by Kalpana, a project team of graduate
students from Carnegie Mellon's Entertainment Technology Center. Over the past year, Injustice
has been invited to a number of conferences and festivals, including the Tribeca Film Festival,
Games for Change, and SIGGRAPH. Jaehee and Ralph will present
lessons they learned while working on the project. Since interactive live-action storytelling in VR
is a unique medium, attendees will gain a new perspective and see the potential of
virtual reality technology to drive social change.

More specifically, attendees will learn how we created an interactive live-action VR
experience. Topics covered will include playtesting, user interface design, solving design and
technical challenges during production, the post-production workflow, and maximizing emotional
moments in VR. We will also discuss how impactful VR experiences such as Injustice can be
employed to address some of the more serious social problems facing our society today.

Can 2D images work in 3D VR environments?

This case study reflects on my documentary I AM THE BLUES and the various interactive
documentary pieces created afterwards. Specifically, it researches the question: can 2D images
be effective in 3D immersive environments?

Initially I received funding from the Canada Media Fund, which, in addition to supporting the
documentary, provided 75k with the mandate to deliver an interactive companion piece. I had
collected an exhaustive image/sound catalogue of research-creation material. The film was
about the last remaining blues musicians who learned the blues while working in the cotton fields.
Mainly in their 80s and still living in Louisiana and Mississippi, these musicians have a
world of experience etched on their beautiful faces; the sparkle in their eyes is as rich and
beautiful as the music they play. Additionally, the film was recorded in original locations such
as the Blue Front Cafe, Mississippi's oldest juke joint. This is the juke joint that we
modelled as a 3D environment for a WebGL interactive project.

As the Research Chair in Interactive Documentary at Concordia University, I made this my
initial research-creation project. During this period I have led several experiments incorporating
2D images in VR environments.

Various approaches are being experimented with, and these results will hopefully be
presented at i-docs. Experimentation includes 2D hologram images floating ghostlike as
memories in a location, characters on translucent green screens that you can pass through,
image projection within 3D objects (Harry Potter style), and 2D spatial mapping to build
point-cloud 3D approximations. The point of these experiments is to find ways to incorporate
images of these amazing characters and their faces as originally photographed.

Needless to say, implementing 2D images into VR is not readily accepted, and I was
immediately challenged. However, it is important to consider that most VR technologists
come from gaming and are not really concerned with, or aware of, vérité documentary traditions.
For me, the “gaming”-influenced character models and rotoscoping results were not
acceptable. What I experienced in the modelling attempts was a “Luigi/Mario” transformation
of these musicians: lost was the sparkle in their eyes and the lived experience etched on
their beautifully aged faces. So I am trying to find ways to salvage and incorporate these
character qualities in the images used to create immersive environments.
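One of the experiments mentioned above, building point-cloud 3D approximations from 2D images, can be sketched in its most basic form as back-projecting each photographed pixel into camera space. This is a minimal illustration under stated assumptions, not the project's actual pipeline: it assumes a pinhole camera model and a per-pixel depth map (here a flat placeholder; in practice depth would come from stereo, photogrammetry, or estimation), and the function and parameter names are my own.

```python
# Sketch: back-project a 2D image into a coloured point cloud,
# assuming a pinhole camera model and a known per-pixel depth map.

def backproject(image, depth, focal, cx, cy):
    """image: rows of (r, g, b) pixels; depth: per-pixel depth in metres.
    Returns a list of (x, y, z, r, g, b) points in camera space."""
    points = []
    for v, row in enumerate(image):
        for u, (r, g, b) in enumerate(row):
            z = depth[v][u]
            # Invert the pinhole projection u = focal * x / z + cx:
            x = (u - cx) * z / focal
            y = (v - cy) * z / focal
            points.append((x, y, z, r, g, b))
    return points

# A tiny 2x2 "photograph" with a flat depth of 2 metres:
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
depth = [[2.0, 2.0], [2.0, 2.0]]
cloud = backproject(image, depth, focal=1.0, cx=0.5, cy=0.5)
```

Because every point keeps its original pixel colour, the approximation preserves the photographed faces rather than replacing them with modelled geometry, which is the appeal of this approach for vérité material.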

OneShot VR storytelling techniques for documentary production

With spherical capture systems seeing front, back, left, right, up, and down all at once (allowing
viewers to look at a scene in any direction they want at any time), it is not possible to focus
attention using traditional film grammar, and many ‘scenes’ necessarily have to be staged. For me,
as an ex-BBC documentary producer and veteran panoramic photographer, the question is: how
can VR become usable in the more uncontrolled and unpredictable storytelling world of
mainstream journalism and documentary production? I have been working with a single-camera VR
technique that allows some of the more standard documentary production approaches to be used,
while providing enough of VR's immersive properties to enhance the storytelling and keep the
audience enthusiastic about the VR experience.

In the real world, not a lot of time is spent looking behind us. Our attention is focused on what is
happening across the 190- to 270-degree field of view in front of us. Relevant to this is an early VR
photographic technique that shot still pictures with a single fisheye lens, producing a circular image
that captured a 180-degree hemispherical angle of view. The flat image on film exhibited severe
geometric distortion, but when projected back through another fisheye lens onto a hemispherical
screen (rather like the dome of a planetarium), all the geometric distortion was canceled out. When
standing in the dome, a viewer had the experience of being immersed in the scene. Today’s
version of this technique replaces the physical projection of the image with real-time computer
mapping of spherical video in VR applications, running in VR headsets and on smart tablets and
phones.
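The real-time mapping described above can be sketched with the standard equidistant fisheye model, in which a point's distance from the image centre is proportional to its angle off the optical axis (r = f·θ). The function below is an illustrative sketch, not production VR code: given normalized fisheye image coordinates, it recovers the 3D viewing direction a renderer would use to texture the inside of a virtual hemisphere, which is what cancels the apparent distortion.

```python
import math

def fisheye_to_direction(u, v, fov_degrees=180.0):
    """Map normalized fisheye image coords (u, v) in [-1, 1]
    (u = v = 0 at the image centre, radius 1 at the edge of the
    image circle) to a unit 3D viewing direction, assuming the
    equidistant projection model r = f * theta."""
    r = math.hypot(u, v)
    if r > 1.0:
        raise ValueError("outside the fisheye image circle")
    theta = r * math.radians(fov_degrees) / 2.0  # angle off the optical axis
    phi = math.atan2(v, u)                       # angle around the axis
    # Optical axis along +z; x right and y up in the image plane.
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# The centre of the fisheye image looks straight ahead:
assert fisheye_to_direction(0.0, 0.0) == (0.0, 0.0, 1.0)
```

With a 180-degree lens, the edge of the image circle maps to directions 90 degrees off-axis, reproducing the hemispherical view of the planetarium-dome projection in software.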

The fisheye technique requires only one camera with one lens (‘OneShot VR’), as opposed to the
multitude of cameras needed to produce a full spherical image. Apart from simpler post-production,
a huge advantage of this is that, because the single lens only looks forward, the camera can be
operated very much as it would be in traditional documentary production. It can be
handheld, track action, and be moved around freely, although such moves need to be mindful of
potentially nausea-inducing effects for viewers wearing headsets!
