Last week and the week prior I wrote a couple of posts dealing with the topic of anamorphic filmmaking and exhibition. Both articles dealt with anamorphic’s inherent cinematic qualities and less with its raw technical aspects. It’s not that I didn’t understand the technicalities behind what makes an anamorphic image an anamorphic image - I did, and still do - however there was one facet to the story that I failed to address: today’s (largely) digital-dominated culture, both at capture and exhibition. You see, anamorphic came about as a result of filmmakers wanting to extract more from the day’s 35mm film format. They wanted scope, scale and breadth, but above all they wanted to differentiate the cinematic experience from the dreaded menace known as TV. The way to get essentially “more” image onto a 35mm frame of film was to quite literally “squeeze” it in there. Using special lenses known as anamorphic lenses, filmmakers did just that; they squeezed a wider field of view into a standard 35mm frame. This meant they needed to use the same anamorphic glass in order to “stretch” the image back out again so the image would look natural once more. This is anamorphic film in a nutshell. Because of the unique shape of the lenses, there were other traits that quickly became associated with anamorphic filmmaking - traits such as horizontal lens flares and oval bokeh.

It all sounds great and magical, but then again we’re living in an increasingly non-analog world.

The way in which digital “destroys” anamorphic filmmaking is due in no small part to its reliance upon pixels. I’ve covered this topic in previous posts too, and here comes yet another example of how digital makes a once-analog cinema experience obsolete. While I’ll be predominantly focusing the conversation on HD capture and exhibition, the same “rules” hold true when applied to 4K or UltraHD. The best HD has to offer at present is what is known as 1080p, or 1,920 pixels horizontally by 1,080 pixels vertically for a total of over 2 million pixels. 1080p can be enjoyed everywhere HD is sold, from streaming to Blu-ray. Where HD kills the format known as anamorphic is in its fixed resolution. Allow me to explain. Because there is no such thing as an anamorphic mode in digital delivery formats like Blu-ray, the space not occupied by the captured image - i.e. the black bars top and bottom - is still rendered using pixels. Advocates for anamorphic-anything will tell you this is precisely why filming and/or projecting through an anamorphic lens is vital, for doing so will render the black bars non-existent. In theory this sounds true; in reality it is not.

For example, the film “2012” was captured and is presented in the 2.35:1 “scope” aspect ratio commonly associated with anamorphic. When viewed on Blu-ray on a 16:9 HDTV, one is treated to black bars top and bottom. This is normal. It also means that of Blu-ray’s 1,080 vertical pixels, only about 800 are being used for the image; the rest are relegated to being black bars - hardly exciting work.
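The arithmetic behind those bars is simple enough to sketch. Here’s a quick back-of-the-napkin calculation (purely illustrative - actual disc masters typically settle on an even 800 rows, which is why “2012” sits at 1,920 x 800):

```python
# Letterboxing a 2.35:1 "scope" image inside a 1920x1080 HD frame.
FRAME_W, FRAME_H = 1920, 1080
SCOPE = 2.35

active_rows = round(FRAME_W / SCOPE)  # rows that actually carry picture
bar_rows = FRAME_H - active_rows      # rows spent on the black bars

print(active_rows, bar_rows)          # roughly 817 picture rows, 263 of bars
```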

A still from “2012” as it exists on Blu-ray disc. The “usable” image is 1,920 x 800. Note the black bars top and bottom.

Now, with front projectors you can add an anamorphic lens attachment and set the projector’s internal video processor to an appropriate aspect ratio mode, most commonly referred to as “Anamorphic” or in some cases “Letterbox”. This effectively recreates the “effect” of an anamorphically captured image upon a frame of 35mm film by squeezing “more” into the frame. However, in the digital realm it isn’t “squeezing” anything, but rather “stretching” it so that the 800 or so pixels that were used in the original image now occupy all 1,080 available on the projector’s imaging chip. The problem with this is that you don’t suddenly have 1,080 vertical pixels for your image; you merely have 800 stretched to appear like 1,080 pixels. This is bad. Real bad.
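To see why the stretch buys you nothing, here’s a toy model (assuming a simple nearest-neighbor scaler - real projectors use fancier interpolation, but the information argument is the same):

```python
# Stretch 800 picture rows to fill 1080: no new rows are invented;
# existing ones are simply repeated (or, with better scalers, blended).
def stretch_rows(rows, target):
    # nearest-neighbor: each output row points back at some source row
    return [rows[int(i * len(rows) / target)] for i in range(target)]

source = list(range(800))             # stand-ins for 800 unique picture rows
stretched = stretch_rows(source, 1080)

print(len(stretched))                 # 1080 rows on the chip...
print(len(set(stretched)))            # ...but still only 800 distinct ones
```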

The same shot from “2012”, only this time stretched vertically in order to span all of HD’s 1,080 vertical pixels.

For not only are you altering the image, you’re effectively destroying 1-to-1 pixel mapping while potentially introducing, or making worse, digital anomalies such as “jaggies”. These issues destroy an image’s natural clarity and sharpness while simultaneously re-applying an optical manipulation that has already been accounted for at the mastering stage. How else do you explain the black bars on your Blu-ray disc? Now that we’ve simulated the effect of a natively captured anamorphic image upon 35mm film by digitally stretching the image across the entire HD chip, we must then use an anamorphic lens to stretch it back out horizontally so that it looks normal once more. These are two added steps that didn’t need to exist. Moreover, we’re back where we started, for the presence of an anamorphic lens doesn’t equal more horizontal pixels; we’ve simply re-arrived at our original 1,920. This is what I mean when I suggest that digital filmmaking and exhibition has rendered the concept of anamorphic somewhat moot.
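Modeled crudely in code, the round trip looks like this (again a nearest-neighbor toy, not any real scaler or lens - the optical stretch happens in glass, but informationally it amounts to undoing the digital step):

```python
# Two added steps that didn't need to exist: digitally stretch the 800
# picture rows to 1080, then "undo" it to get back to normal geometry.
def resample(rows, target):
    # nearest-neighbor resampling, a crude stand-in for scaler behavior
    return [rows[int(i * len(rows) / target)] for i in range(target)]

source = list(range(800))            # the disc's 800 picture rows
stretched = resample(source, 1080)   # step 1: projector's digital stretch
restored = resample(stretched, 800)  # step 2: what the lens effectively undoes

# We're back at 800 rows, and the resampling has shuffled some of
# them along the way - nothing gained, fidelity potentially lost:
print(len(restored), restored == source)
```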

While no bars are present now, the image isn’t any wider than its original 1,920 pixels, nor is it truly 1,080 pixels tall, for the 800 picture rows have simply been stretched and then optically corrected back to their original proportions - we’ve merely enlarged them. You can’t magically obtain more pixels by simply adding a lens.

Here is a close-up view of the same image at each stage in the “digital” anamorphic chain. Note how similar - identical, really - the first and last frames look.

Here is a close up of a single pixel (red) as displayed via a Blu-ray (aka with black bars).

That same pixel (red) stretched by the viewer’s front projector via its “anamorphic” aspect ratio mode.

That same pixel (red) post a digital anamorphic stretch and then shown through an anamorphic lens. Note it’s back to being square.

That same pixel (red) amidst the same grid, only adjusted for anamorphic. Note that anamorphic neither added nor took away any pixels; it merely enlarged the ones already present, first digitally and then optically.

But what about that wider field of view? That’s easy to solve too: have your director of photography use a wider-angle lens or take several steps back. Now, there is no denying that the 2.35:1 aspect ratio has a “cinematic” feel to it and that some of the intangibles associated with anamorphic filmmaking are worth preserving. But, as I’ve shown and described above, you’re as well off, if not better off, filming in 16:9 via today’s modern digital cameras and simply cropping the image to achieve that desirable 2.35:1 aspect ratio as you are filming natively in it. The results will be the same because of our dependence upon pixels. Moreover, you won’t have to suffer some of anamorphic’s “quirks” when filming, not to mention its associated costs. And as for the visual traits such as horizontal flares and oval bokeh? Well, as dead set as I was last week against them, they’re easy to cheat and, when combined with cropping, make for one hell of an anamorphic-looking image. Hell, almost as soon as anamorphic became a thing with 35mm film, Hollywood looked to cheat it by developing Techniscope. Ironically, many of the films anamorphic fans tout as shining examples of the format were captured partially or entirely in cropped, non-anamorphic formats such as Techniscope’s modern descendant, Super 35 - meaning they’re not anamorphic at all. Films such as “Titanic”, “Panic Room”, “The Fighter” and even the recent hit “Silver Linings Playbook” have all been at least partially, if not entirely, filmed using such cropped formats rather than true anamorphic. But again, once converted to digital, none of it really matters. That’s pixels for ya; ain’t they a bitch?
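In pixel terms, the two pipelines really do land in the same place once mastered for disc. A sketch (hypothetical numbers for illustration - the point is that the 1,920-wide HD container dictates the final picture box either way):

```python
# Whether you crop a 16:9 capture or unsqueeze an anamorphic one,
# the scope deliverable inside a 1920-wide HD master is the same box.
def scope_deliverable(width=1920, ratio=2.35):
    # final picture area once mastered for a 1920-wide HD container
    return width, round(width / ratio)

crop_pipeline = scope_deliverable()        # 16:9 capture, cropped in post
anamorphic_pipeline = scope_deliverable()  # anamorphic capture, unsqueezed
                                           # and mastered into the same container
print(crop_pipeline, anamorphic_pipeline)  # (1920, 817) both times
```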

While I don’t want to make a habit of contradicting myself so quickly, I feel it’s important, if for no other reason than that it shows there is always room to grow and new things to learn. I want to thank Ray Jr. of SoCalHT, Michael Chen of The Laser Video Experience and readers like you for inspiring me to push the boundaries and dig deeper. So, with all that said, I thank you all very much for reading and until next time, take care and stay tuned…


  • Mark Coxon

    Andrew, what do you think of products like Projection Design’s 2.35:1 native projectors that use the cut-down version of the WQXGA chip? That seems to eliminate your mapping and jaggie concerns, at least on the surface…
    Is it the projector/display side or the BluRay content side you feel is the major limiting factor there?
    I also love the argument being placed under your rotating picture header on this blog, which is much closer to 2.35 than 16:9… :)
    Love your take.
    Mark C

  • Andrew Robinson

    Here’s the problem with 2.35:1 native displays - be it front projectors or even flat panels: the incoming signal is still only 1,920 pixels wide. So while the display may possess 2,000-plus horizontal pixels, it just means that you’re upscaling and/or simply enlarging the pixels to get there, which introduces a host of possible problems and/or concerns - such as, again, no one-to-one pixel mapping.
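Assuming a 2,560-pixel-wide panel (a common width for the kind of cut-down WQXGA chips Mark mentions - exact specs vary by model), the mismatch looks like this:

```python
# Fitting a 1920-wide signal onto a hypothetical 2560-wide native panel.
panel_w, signal_w = 2560, 1920
scale = panel_w / signal_w   # 4/3: every source pixel gets smeared
                             # across one-and-a-third panel pixels
print(scale)                 # a non-integer factor, so 1:1 mapping is impossible
```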

    It’s not that Blu-ray is a limiting factor; it is pretty damn good. The issue is folks not understanding what any of this stuff truly means. We’re being sold goods and/or services much in the same way scissors that cut cans are sold on late-night TV - there must be a better way! Sticking with anamorphic (though it’s not the ONLY example): black bars are a nuisance, therefore we must rid ourselves of them. Seems simple, only in ridding ourselves of the bars we deliberately alter and thus (potentially) damage the image. Scissors that cut cans sound cool, but when you’re done having fun you’re left with a bunch of cut cans, cans with sharp edges that take up the same amount of space in the trash as they would have had you left them intact. Only now you might risk cutting yourself disposing of them. In other words, it’s a solution to a problem that need not exist. At least in the digital realm.

    As for the 2.35:1 image at the top of my site; I never said I hated anamorphic or its associated aspect ratio(s). Quite the opposite. Thanks for reading!

  • Mark Coxon


    I am deducing from your response that if the source material were 2350/1000 you would have a different take on this, correct?
    It’s not the format, it’s the manipulation of the data from its native form to create the format that you take issue with?
    Is this correct?
    I was just razzing you on the header, stills and video are quite different as are cinema and web content anyway :)
    Mark C

  • Andrew Robinson

    If there were a true anamorphic digital format then yes, my take would be different, but there isn’t, nor do I think there ever will be going forward. That isn’t to say that films will no longer be shot using anamorphic lenses; they will. But once the image is captured and assigned to pixels, the need for costly and/or cumbersome attachments, workflows, etc. becomes somewhat moot.

  • Mike Guidotti

    Personally, I enjoy the look of the wider 2.35:1 aspect ratio when watching movies, and how it is accomplished does not really matter to me, whether it is with an anamorphic lens or by cropping a digital sensor. As you pointed out, once it is digital it does not matter.

    I am not a fan of breaking 1:1 pixel mapping or introducing an additional lens in front of my projector which is why the JVC projector I am getting has a memory zoom lens which will “zoom the black bars” off the edge of a 2.35:1 screen.

    As long as movies are made in non-16:9 formats you are going to run into this problem to some extent, and I don’t see it ever changing. If the main issue is a lack of resolution, 4K should improve things, giving you almost 1,600 pixels instead of 800.
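Mike’s 4K math checks out (assuming the same 2.35:1 letterboxing carried up to a 3,840-pixel-wide UHD frame):

```python
# Picture rows available to a 2.35:1 image at HD vs. UHD widths.
hd_rows = round(1920 / 2.35)    # ~817 rows at HD (discs round to 800)
uhd_rows = round(3840 / 2.35)   # ~1634 rows at UHD - roughly double
print(hd_rows, uhd_rows)
```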