The Lake Buena Vista Historical Society proudly presents…
Horizons: Revisited 3D
From physics.org: If you look at an object near you and close your left and right eyes in turn, you’ll see that each has a slightly different view of the world. Your left eye sees a bit more of the left side of the object, and your right eye sees a bit more of its right side. Your brain fuses the two images together allowing you to see in three dimensions. This is known as stereoscopic vision. To create a similar effect, 3D films are captured using two lenses placed side by side, just like your eyes (or by producing computer-generated images to replicate the same effect).
But you only had one camera; how did you do this?
As we were restoring Horizons: Revisited, we got to thinking: the camera was mounted on a tripod in the ride vehicle, and (for the most part) it moved only from left to right at a known speed of 1.53 ft/s, or about 47 cm/s. That means that over a specific span of time the camera travels the same distance that separates the average pair of human eyes (5.4–7.4 cm). Some simple math tells us how long it takes the ride to cover that distance: dividing 6.4 cm by 47 cm/s gives 0.136 seconds. The video plays at 29.97 frames per second, so again some simple math: 29.97 fps multiplied by 0.136 seconds yields approximately 4.08 video frames for pupil separation. Now, what if we offset the left and right eyes by 4–5 frames from our one-camera video? Would it work and create a simulated stereoscopic effect? To our amazement, it did!
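For anyone who wants to follow along, the arithmetic above can be sketched in a few lines of Python. The variable names are our own; the values (ride speed, 6.4 cm average eye separation, NTSC frame rate) come from the calculation described above.

```python
# Frame-offset calculation for simulating stereo from a single moving camera.
ride_speed_cm_s = 47.0    # ride vehicle speed: 1.53 ft/s, about 47 cm/s
eye_separation_cm = 6.4   # average human interpupillary distance (5.4-7.4 cm range)
fps = 29.97               # NTSC video frame rate

# Time for the camera to travel one eye-separation's worth of distance.
travel_time_s = eye_separation_cm / ride_speed_cm_s

# Number of video frames elapsed in that time = frame offset between "eyes".
frame_offset = fps * travel_time_s

print(round(travel_time_s, 3))  # 0.136
print(round(frame_offset, 2))   # 4.08
```

Rounding to the nearest whole frame gives the 4-frame offset used as the starting point for the conversion.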
The next step was to see how far we could push the boundaries and create the best 3D experience possible from the one-camera source. Our calculations pointed to a 4-frame offset between the eyes, but would a larger offset create a deeper visual effect? As we increased the offset, the depth increased and the experience became more immersive…with one catch. Regular 3D is captured with two cameras, so each camera captures the action at the same exact moment. Imagine for a moment we are filming someone tossing a ball in the air. With a two-camera setup, when the ball reaches the peak of the toss, the frame for each eye records the ball in the same exact position. In a single-camera setup, that same scenario results in one eye seeing the ball at the apex while the other eye is 4 frames behind, still in motion from the toss. The larger this frame separation, the more blurred and "off" the ball would look; however, anything static or not moving quickly would remain in proper 3D.
When you think about the content of the attraction, other than the audio-animatronics, movies, background projections, and lighting effects, the majority of the ride is static, which is exactly what we want. We targeted a frame offset that delivers a rich 3D experience while still allowing for movement that doesn't distract the eye, and we think we've found a common number that works for most of the ride. Where needed, we seamlessly blend to a 2D view for scenes that do not work well, specifically parts of the Omnimax film and some transition scenes.
There are three viewing options for Horizons: Revisited 3D. Please see our 3D information page for what you will need to view each.
Red/Blue Glasses – Put on your glasses and enjoy!
VR Headset/Google Cardboard – Play full screen via the YouTube app; this version has been enhanced to 60 fps for a smoother experience.
3D Televisions – Play full screen from YouTube, and set your TV to play 50%/50% or Side-by-Side (SBS) 3D content.
For details on how this video was initially restored, check out our “Behind the Edit”.