360° cameras are not only used to produce VR. As they capture their entire environment, we will use the technique of “overcapture” or “reframe” to create unique movements in traditional edits. The results are spectacular.

Imagine a camera operator who can see their entire environment: in front, behind, and below; whilst being able to move at the speed of light and follow a fighter plane without error. This is exactly what 360° cameras do when their use is diverted to 2D.
As 360° cameras film simultaneously in absolutely all directions, the idea is to select in post-production the most interesting frame of the image and to animate its position, following what interests us. To better understand the possibilities, here is a short example of what can be done:

As we can see in this video, the principle is the same as that of “cropping” into a 4K image when we deliver in HD, except it’s in all directions. When the affordable 360° cameras first appeared, they were not designed to be used in the traditional way. Instead, they were made for social media and not necessarily for advanced post-production.
Nevertheless, professionals soon took advantage of the possibilities offered by these cameras, despite the much lower image quality: a single, wonderful shot justifies a lesser image quality for a short sequence. There are limits, but we are seeing them shrink over time thanks to advances in the technology, including stabilisers so powerful that they don't need gimbals.
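To make the "cropping in all directions" idea concrete, here is a minimal NumPy sketch (our own illustration, not part of any editing software) of the 2D version of the trick: extracting an HD window from a 4K frame and animating its position to simulate a pan. The array sizes and the `crop_hd` helper are hypothetical.

```python
import numpy as np

# A hypothetical 4K frame (2160 x 3840 pixels, RGB) to "crop" HD windows from.
frame_4k = np.zeros((2160, 3840, 3), dtype=np.uint8)

def crop_hd(frame, x, y, w=1920, h=1080):
    """Extract an HD window whose top-left corner is at (x, y)."""
    return frame[y:y + h, x:x + w]

# Animating the window's x position over successive frames simulates a pan.
pan = [crop_hd(frame_4k, x, 540) for x in range(0, 1920, 480)]
```

Reframing a 360° clip is the same principle, except the "window" can point in any direction around the camera.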

360° in 2D – how does it work?

The principle is quite simple. The 360° cameras are equipped with fisheye lenses that each film the environment over 180° (in the case of 2 lenses; 120° for 3 lenses, etc.).


From there, a dedicated chipset or software stitches the 2 images (or as many images as the camera has lenses) into an equirectangular projection, producing a result similar to a distorted panorama, as seen here:

It is this image that will be projected onto a virtual sphere, allowing the viewer to look around in 360° as if they were at the centre of this sphere. The idea is to recover a portion of the image in order to insert it into a classic edit. We will thus be able to animate the framing and change the point of view, as seen here.
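For the curious, the maths behind "recovering a portion of the image" can be sketched in a few lines of NumPy. This is our own simplified illustration of how a reframing tool might work, not the actual code of any plugin: for each pixel of the flat output, we cast a ray, rotate it by the chosen yaw and pitch, and look up the corresponding point in the equirectangular frame.

```python
import numpy as np

def reframe(equi, fov_deg, yaw_deg, pitch_deg, out_w=640, out_h=360):
    """Extract a flat, perspective view from an equirectangular frame.

    equi: H x W x 3 array (the distorted 'panorama' the camera produces).
    fov/yaw/pitch mirror the parameters exposed by reframing tools.
    """
    H, W = equi.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels

    # Build a viewing ray for every output pixel (camera looking down +z).
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2,
                         np.arange(out_h) - out_h / 2)
    rays = np.stack([xs, ys, np.full_like(xs, f, dtype=float)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by pitch (around x), then yaw (around y).
    p, yw = np.radians(pitch_deg), np.radians(yaw_deg)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p), np.cos(p)]])
    Ry = np.array([[np.cos(yw), 0, np.sin(yw)],
                   [0, 1, 0],
                   [-np.sin(yw), 0, np.cos(yw)]])
    rays = rays @ (Ry @ Rx).T

    # Convert each ray to longitude/latitude, then to equirectangular pixels.
    lon = np.arctan2(rays[..., 0], rays[..., 2])   # -pi .. pi
    lat = np.arcsin(np.clip(rays[..., 1], -1, 1))  # -pi/2 .. pi/2
    u = ((lon / np.pi + 1) / 2 * (W - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (H - 1)).astype(int)
    return equi[v, u]
```

Animating the framing then simply means changing `yaw_deg`, `pitch_deg` or `fov_deg` from one frame to the next.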


The limitations

Obviously, all this sounds magical, but there are several limitations that must be taken into account:

  • Most of the time, the cameras produce images in 5.7K maximum (5760 x 2880 pixels) and the sensors are tiny. In other words, when we reframe (or overcapture), we only use a small part of this resolution, so the image quality is quite low. Low light should also be avoided.
  • Some cameras produce “ready-to-use” files in equirectangular format, but others do not. It is then necessary to use the manufacturer’s additional software to perform the stitching (the assembly of the images). The process, although simple and automated, is sometimes long. The aim is to produce this “flat” image without any faults.
  • Some people already have trouble working in 4K because of a lack of processing power. It is even worse in 5.7K, especially since more and more cameras record in H265 (more efficient coding, but huge resources needed for decompression and playback; see our article on the subject). It is therefore often recommended to work with proxies (see again our tutorial).
The software from Insta360 (Insta360 Studio 2019) allows you to stitch the images, but also to reframe them.

What do you need for Reframe/Overcapture?

Although all editing software now supports VR (i.e. native 360°), not all of it is able to reframe and animate the parameters (point of view, field of view, etc.). That’s why you have two options:

  • Use Insta360 Studio 2019, which supports the majority of 360° cameras free of charge, including those from other brands. It can be downloaded here, and its (very simple) tutorial is here.
  • Install the GoPro plugins for Premiere Pro, which add the “VR Reframe” effect. The download for Windows is here and the download for Mac is here. Note that GoPro also offers plugin installation via its GoPro Fusion Studio software, but the latter is only compatible with the brand’s own camera (the Fusion).

Reframing in a few steps

Once the plugins are installed, simply launch Premiere Pro and import your 360° files in flat (equirectangular) format. Then create a sequence (a timeline) with the dimensions of your source rushes (5.7K, 4K) that respects the camera’s native frame rate (25/30/50p).


Simply drag the GoPro VR Reframe effect from the effect library onto the clip of your choice. The image is immediately cropped. Place the playback head at the beginning of the clip and open the Effects Options. Activate the stopwatches to indicate that you are going to animate this “reframing”. The parameters are as follows:

  • FOV (Field of View) allows you to animate the field of view of the frame, in other words, by increasing the value, you extend the field of view, and by reducing it, you zoom into the image.
  • YAW/Pitch/Roll correspond to the axes used when flying a drone or controlling a gimbal. For us, the YAW corresponds to the pan (from left to right), the Pitch controls the tilt (vertical inclination), and the Roll controls the lateral inclination (to correct the horizon line, for example).
  • The Smooth transition parameter allows you to smooth the keyframes (Ease In & Ease Out), because the plugin does not take into account the interpolation applied by Premiere Pro.
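The difference between a raw keyframe and a smoothed one is easy to see in code. Here is a small illustrative sketch (our own, with a hypothetical `animate_yaw` helper) comparing linear interpolation with a classic ease-in/ease-out curve of the kind a “smooth transition” option applies:

```python
def smoothstep(t):
    """Ease-in/ease-out curve: gentle slope at both ends, steeper in the middle."""
    return t * t * (3 - 2 * t)

def animate_yaw(start, end, frames, smooth=True):
    """Interpolate a yaw value between two keyframes, optionally eased."""
    out = []
    for i in range(frames):
        t = i / (frames - 1)
        if smooth:
            t = smoothstep(t)
        out.append(start + (end - start) * t)
    return out
```

With `smooth=True`, a pan from 0° to 90° starts and ends gently instead of snapping, which is exactly the effect you want on camera-movement keyframes.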

Correct image distortions

As with sports cameras that have a very wide field of view, the edges of the image are often curved. To correct this defect, look for the “Lens Distortion” effect. Slide it over your clip. In the Effect Options, simply drag the Curve parameter to a negative value to reduce distortion.

The distortion is corrected, on the right, compared to the native file.
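What the negative Curvature value does under the hood is, in essence, a radial remapping of the pixels. Here is an illustrative NumPy sketch of a simple radial-distortion correction (our own simplified model, not the actual Lens Distortion effect), where a negative `k` flattens curved edges:

```python
import numpy as np

def correct_distortion(img, k):
    """Simple radial distortion correction; a negative k reduces barrel
    distortion, much like dragging the Curvature parameter negative."""
    H, W = img.shape[:2]
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    # Normalised coordinates centred on the image.
    x = (xs - W / 2) / (W / 2)
    y = (ys - H / 2) / (H / 2)
    r2 = x * x + y * y
    # Where to sample in the source image for each corrected pixel.
    xs_src = np.clip(((x * (1 + k * r2)) * (W / 2) + W / 2).astype(int), 0, W - 1)
    ys_src = np.clip(((y * (1 + k * r2)) * (H / 2) + H / 2).astype(int), 0, H - 1)
    return img[ys_src, xs_src]
```

Note how the centre of the image is left untouched (`r2` is 0 there) while the correction grows towards the edges, which is where fisheye curvature is worst.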

Tiny Planet effect

The Tiny Planet effect is ultra-popular, and very easy to achieve with the GoPro VR Reframe plugin. To create it:

  • Tilt the Pitch 90° (to look straight down) and extend the field of view to nearly 100, which will “take off” the camera and capture the whole image as a “small planet”. Of course, if the camera was not perfectly vertical when shooting, you will probably have to correct the angle with the Roll or the YAW.
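Geometrically, the “small planet” look is a stereographic projection of the sphere. As an illustration (our own sketch, assuming the usual equirectangular convention of sky at the top of the frame), it can be computed directly from the flat 360° image:

```python
import numpy as np

def tiny_planet(equi, out_size=512, zoom=0.5):
    """Stereographic ('little planet') projection of an equirectangular frame.
    Equivalent in spirit to pitching the view 90° down with a very wide FOV:
    the ground (nadir) ends up at the centre, the sky wraps around the edges."""
    H, W = equi.shape[:2]
    c = out_size / 2
    ys, xs = np.mgrid[0:out_size, 0:out_size].astype(float)
    x = (xs - c) / c
    y = (ys - c) / c
    r = np.sqrt(x * x + y * y) + 1e-9          # distance from the centre
    lat = 2 * np.arctan(r / zoom) - np.pi / 2  # nadir at r = 0
    lon = np.arctan2(y, x)
    u = ((lon / np.pi + 1) / 2 * (W - 1)).astype(int)
    v = ((1 - (lat / (np.pi / 2) + 1) / 2) * (H - 1)).astype(int)
    return equi[v, u]
```

The `zoom` value plays the same role as the very wide field of view in the plugin: the larger it is, the more of the sphere is pulled into the little planet.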

The hyperlapse effect with motion-blur

Most 360° cameras offer a hyperlapse feature because they are ultra-stabilized. Unfortunately, the effect is often available only in the phone application and not in computer post-production – very frustrating! However, as a workaround:

  • Shoot your footage at normal speed (as in the end of the article’s introductory video).
  • Once the file has been transferred to Premiere Pro and reframed as you wish, increase the playback speed to between 2000 and 3000%. As it stands, you will see the hyperlapse effect, but there is no motion blur.
  • Right-click on the clip / Time Interpolation / and select Frame Blending. This adds motion blur.
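What Frame Blending does can be illustrated with a few lines of NumPy (our own sketch, treating the clip as a list of frame arrays): speeding up by a factor means keeping one frame per group, and blending means averaging each group instead, which produces the motion-blur trails.

```python
import numpy as np

def hyperlapse(frames, speed):
    """Speed a clip up by `speed`x, averaging each group of frames to
    approximate the motion blur that frame blending adds."""
    out = []
    for i in range(0, len(frames) - speed + 1, speed):
        group = np.stack(frames[i:i + speed]).astype(float)
        out.append(group.mean(axis=0).astype(np.uint8))
    return out
```

A 10-frame clip at 5x becomes 2 frames, each a blend of 5 originals – fast-moving objects smear across the blend exactly as they do in the finished hyperlapse.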

Finally, to go further in the possibilities offered by 360° cameras used in 2D, we recommend this Cinecom.net video which shows the range of transitions that can be achieved in a few camera movements: