VR work

This is a quick reading list to start working on 360 Videos

It will expand in particular on the directing, storyboarding and editing processes.

Also check out this playlist of tutorial videos I've collected.

Some of my VR videos are available on my YouTube channel.

The more polished productions are instead hosted on our Facebook page: 360 Video and Audio - Concerts in Parma

Minimal workflow

Now also available on Medium.

The simplest approach is to employ an LG 360 camera in Spatial Audio mode.

This produces a decent image and a first-order Ambisonics recording that can be uploaded directly to YouTube.
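First-order Ambisonics packs the whole soundfield into four channels (W, Y, Z and X in AmbiX ordering, which is what YouTube expects). As an illustration, here is a minimal sketch of encoding a mono source at a given direction, assuming the ACN/SN3D conventions; it is not taken from any camera's firmware:

```python
import math

def encode_foa(sample, azimuth_deg, elevation_deg):
    """Encode a mono sample into first-order AmbiX (ACN/SN3D) channels.

    Returns [W, Y, Z, X]. Azimuth is counter-clockwise from the front,
    elevation is upward from the horizontal plane.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample                                 # omnidirectional component
    y = sample * math.sin(az) * math.cos(el)   # left-right
    z = sample * math.sin(el)                  # up-down
    x = sample * math.cos(az) * math.cos(el)   # front-back
    return [w, y, z, x]

# A source straight ahead puts all directional energy in X:
print(encode_foa(1.0, 0, 0))  # → [1.0, 0.0, 0.0, 1.0]
```

Decoding for headphones or a speaker array is the camera's or platform's job; the point here is just how little data four channels are compared to what they represent.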

Its image quality is roughly on par with that of the Samsung Gear 360 or the Nikon KeyMission 360. That is to say, it's perfectly acceptable in good lighting conditions, but it degrades very quickly in the dark.

Most annoyingly, they all have thermal envelope issues. The LG, in particular, has a 20 minute maximum recording time, which it often fails to reach after sustained use, presumably due to overheating.

For more involved productions, it's best to employ Facebook's 360 Spatial Workstation, which can either create a soundscape from scratch or convert an existing one into their TBE format, essentially a slightly modified second-order Ambisonics that fits into Pro Tools' eight-channel bus.
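The reason TBE has to modify second order at all is simple arithmetic: a full-sphere Ambisonics signal of order N needs (N+1)² channels, so second order needs nine, one more than an eight-channel bus can carry. TBE drops one channel to make it fit. A one-liner makes the counts explicit:

```python
def ambisonic_channels(order):
    """Number of full-sphere ambisonic channels for a given order."""
    return (order + 1) ** 2

for order in (1, 2, 3):
    print(order, ambisonic_channels(order))  # 1→4, 2→9, 3→16
```

Which of the nine channels gets dropped, and how the rest are remixed, is a detail of the TBE specification rather than something implied by the math above.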

It generates videos for both Facebook and YouTube, thus covering most use cases in one fell swoop.

Also, to obtain good image quality it is currently necessary to resort to an array of cameras. My current solution is described in the next section.

Stereoscopic (3D) video

For 3D video, we use 8 GoPro Session cameras arranged in a ring.

The mount I designed for 3D printing should be compatible with any GoPro, and is available here.

The images we obtain are around 5700 pixels wide, but have rather sizeable holes at the top and bottom.
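The holes follow directly from the ring geometry: eight cameras spaced 45° apart cover the horizontal band with plenty of overlap for stitching, but nothing points at the poles. A quick sketch of the overlap budget, assuming a nominal ~120° horizontal field of view per camera (an assumption for illustration, not a measured figure for the Session):

```python
def ring_overlap(num_cameras, fov_deg):
    """Horizontal overlap in degrees between adjacent cameras in a ring.

    Each camera is spaced 360/num_cameras degrees from its neighbours;
    whatever field of view exceeds that spacing is available for stitching.
    """
    spacing = 360.0 / num_cameras
    return fov_deg - spacing

# 8 cameras at 45° spacing, assumed ~120° horizontal FOV each:
print(ring_overlap(8, 120.0))  # → 75.0
```

The vertical direction has no such redundancy: with a vertical field of view well under 180°, the zenith and nadir are simply never photographed, hence the holes.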

Higher-order Ambisonics

I've collaborated on the recording of a series of music performances using arrays of GoPros for the video and either an Eigenmike or a modified cylindrical version for the audio.

In particular, we have some pieces from the 2016 Verdi Festival, a Handel concert in the Duomo of Parma and a glee chorus in the Casa del Suono.

They're available for download in Third Order Ambisonics for use with Jump Inspector on any Cardboard-capable device here, or in TBE (basically 2nd order Ambisonics) on our Facebook page.

Room scale playback

Of course this isn't strictly speaking VR, but we've set up the "white room" of the Casa del Suono to play back third order Ambisonics in sync with our videos.

We're using the pre-existing audio system and a computer with an Nvidia Quadro driving four 1080p projectors.

The sense of presence isn't comparable to that of a dedicated HMD, but it still provides an immersive, yet social, audiovisual experience.

Coming soon

I'm working on a native sound-object based player for audio and video recordings. At present it supports playback of monoscopic 360 videos with 32-channel 3DVMS audio. It's being tested on the HTC Vive, but it should be completely platform agnostic when it's ready. I'm still trying to figure out how to release the audiovisual content separately from the playback software.

An interesting aspect of this playback approach is that it somewhat approximates six degrees of freedom interpolation, especially with the audio. A photogrammetric approach would be better suited for the visual part, probably.