Other effects work needs a more directed step for integration: a way to move from the latlong (equirectangular projection) to a traditional perspective view (rectilinear projection), or vice versa. In After Effects this is simple enough to accomplish one way (with the environment effect and a camera) but much harder to accomplish in reverse without extra plugins. The process also gets convoluted very quickly, rendering times grow, and troubleshooting becomes very difficult. Node-based compositors like Nuke, Fusion or Flame are far better geared to dealing with these problems.
Masking, or roto work, where an area needs to be isolated for effects work or compositing, yields inconsistent results in VR. Masks close to the equator of the latlong will appear approximately as expected, but areas closer to the poles will be overly sharp, which, depending on the use of the mask, can be a problem.
Masks for separating an actor, or object, from a background become very difficult, as movements don't appear the same way in a panorama. An actor walking past the camera would become distorted as they move closer to the camera and cross into a different region of the panorama. Under some circumstances this may be passable, as with some color corrections or other areas with soft masks. For other cases, a temporary rectilinear view of the scene needs to be created, the compositing work done there, and the result recombined with the original.
To accomplish any more serious effects work, you'll need some way to move between the latlong image (the equirectangular projection) and a normal image (rectilinear projection). This process needs to be easily reversible.
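The math behind that move is compact enough to sketch. The Python functions below (the names are my own, not from any compositing package) map normalized latlong coordinates to a unit view direction and back; every projection change discussed here is some variation of this round trip:

```python
import math

def latlong_to_direction(u, v):
    """Map normalized latlong coords (u, v in [0, 1]) to a unit view
    direction. u spans longitude (-pi..pi), v spans latitude
    (-pi/2 at the bottom pole to +pi/2 at the top)."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return x, y, z

def direction_to_latlong(x, y, z):
    """Inverse mapping: a unit direction back to normalized latlong coords."""
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5
```

Going one way and back recovers the original coordinate exactly; it is the pixel resampling between grids, not this mapping, that costs image quality.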
Be warned though: every time the image is resampled to change it from one projection to another, the image quality suffers, even in very high fidelity packages like Nuke. Sometimes a single operation is enough to have a clearly negative effect on the quality of the image. As with all visual effects work, aim to isolate effects to the affected areas rather than the whole image.
Many compositing packages offer a way of moving to and from a number of different projection types. Nuke's SphericalTransform node, for example, allows moving between mirror ball, angular map, latlong, fisheye or cubic projections with ease, and has a very useful rotation parameter for both the input and output.
While most of these projections are of limited use for effects work, the cubic projection splits the latlong into six equal 90-degree square views. These are very well suited to paint, mask and effects work.
Another possible method is to use a virtual camera to render out a section of the panorama as a new piece of footage. This method is incredibly useful for performing camera-related work, like 3D tracking and stabilization. It is also very good for following action that spans a wider area than fits inside the neat slices of the cubic projection.
To do this, assign the panorama to a sphere and place a camera at its center, looking out. Any conventional camera view can be recreated using this method. It is very important to flip the image horizontally before assigning it; otherwise the camera view will be mirrored and any calculations performed on it will be inverted.
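As a rough sketch of what the sphere-and-camera setup computes, the hypothetical helper below (plain Python math, not the Nuke API, with a y-up convention assumed) maps a pixel of the virtual camera's render to the normalized latlong coordinate it samples, given a horizontal field of view and a yaw rotation:

```python
import math

def camera_sample(px, py, width, height, fov_deg, yaw_deg=0.0):
    """For pixel (px, py) of a width x height perspective render with
    the given horizontal field of view, return the normalized latlong
    (u, v) coordinate to sample. yaw_deg rotates the virtual camera
    about the vertical axis. Hypothetical helper for illustration."""
    focal = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    x = px - width / 2.0
    y = py - height / 2.0
    z = focal
    # rotate the view ray about the vertical (y) axis by the yaw angle
    yaw = math.radians(yaw_deg)
    xr = x * math.cos(yaw) + z * math.sin(yaw)
    zr = -x * math.sin(yaw) + z * math.cos(yaw)
    length = math.sqrt(xr * xr + y * y + zr * zr)
    x, y, z = xr / length, y / length, zr / length
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5
```

The center pixel with zero yaw samples the center of the latlong; animating the yaw is equivalent to panning the virtual camera around the sphere.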
The Nuke ScanlineRender node often produces artifacts at the poles of the sphere, so if the poles need work it may be more useful to use, or patch with, the SphericalTransform node.
To send the results back to the panorama, duplicate the original sphere and project the results onto it with a Project3D node. Render the sphere with a new ScanlineRender node with its projection mode set to "spherical". You can use the same camera or a new one; when rendering spherically, all that matters is the camera's position.
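The reprojection step can be sketched as the inverse mapping: for each latlong coordinate, find the pixel of the rectilinear patch that covers it, or nothing if the patch doesn't reach that part of the panorama. Again a hypothetical Python helper, assuming the patch camera looks straight ahead with no rotation:

```python
import math

def patch_sample(u, v, width, height, fov_deg):
    """For a normalized latlong coordinate (u, v), return the pixel in
    the width x height rectilinear patch that covers it, or None if the
    patch doesn't. Illustrative helper, assuming an unrotated camera
    looking down +z."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    if z <= 0.0:
        return None  # behind the patch camera
    focal = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    px = x / z * focal + width / 2.0
    py = y / z * focal + height / 2.0
    if 0.0 <= px < width and 0.0 <= py < height:
        return px, py
    return None
```

The None cases are where the original panorama shows through unchanged, which is why the patch only needs to be merged over the affected region.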
Once in the rectilinear view, all your effects, tools, and methods work as usual, and the workflow is the same as on a regular project.
Here is the workflow used to paint out the tripod from a 360 shot: