Don’t get me wrong: I’m a HUGE supporter of 4K. Recently, however, I’ve been somewhat critical of the format and its impending release into our homes. I’m not being critical because I’ve changed my tune on 4K, but because I know the value it brings to the viewer and I don’t want to see it watered down just so the consumer electronics industry has something new to sell. Unfortunately, that is exactly what appears to be happening.

4K, for those of you who may be scratching your heads, refers to an audio/video standard set by a group called Digital Cinema Initiatives (DCI). The standard was adopted to create a level playing field for all the studios and theatrical exhibitors to adhere to, in order to avoid the kind of “format wars” we consumers often deal with: think Blu-ray versus HD DVD, or Betamax versus VHS. DCI 4K, or D-Cinema 4K as it’s more commonly known, sets a number of standards for a digitally projected image: standards for pixel count, compression, color space and more. These standards were adopted because they most closely resemble, match or even best the perceived quality of analog 35mm film. While there are a number of other aspects to the D-Cinema standard, the four most important concern resolution, color bit depth, color space and compression.

D-Cinema 4K calls for a resolution of around 4,000 pixels across by 2,000 pixels tall, specifically 4,096 x 2,160. I use the word “around” because not all aspect ratios will fit exactly into the 4,096 x 2,160 spec, so DCI does allow for some leeway. In terms of color, D-Cinema 4K calls for two things: 12-bit color, and the ability to encode the CIE 1931 XYZ color space, which covers the entire range of human-perceivable color. Lastly, D-Cinema mandates that the image be compressed using the JPEG2000 codec, which is robust and not widely (if ever) used at the consumer level. So that’s the professional, or D-Cinema, standard in a nutshell. But what does it have to do with you the viewer, or me as an indie filmmaker?
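
To put those numbers in perspective, here’s a rough back-of-the-envelope sketch. The 24 fps frame rate and 250 Mbit/s bitrate ceiling are the usual DCI figures; treat the output as illustrative, not authoritative:

```python
# Rough math on the D-Cinema 4K spec described above: what an uncompressed
# stream would weigh, and roughly how hard JPEG2000 has to work.
# Assumes 24 fps and DCI's 250 Mbit/s maximum packaged bitrate.

WIDTH, HEIGHT = 4096, 2160   # DCI 4K container
COMPONENTS = 3               # X, Y, Z color components
BIT_DEPTH = 12               # bits per component
FPS = 24                     # standard theatrical frame rate
DCI_MAX_MBPS = 250           # DCI's maximum packaged bitrate

bits_per_frame = WIDTH * HEIGHT * COMPONENTS * BIT_DEPTH
uncompressed_mbps = bits_per_frame * FPS / 1_000_000

print(f"Uncompressed stream: {uncompressed_mbps:,.0f} Mbit/s")            # ~7,644
print(f"Compression needed:  ~{uncompressed_mbps / DCI_MAX_MBPS:.0f}:1")  # ~31:1
```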

For starters, 4K is coming home, and rapidly: in the next 18 to 24 months you can expect to see more affordable 4K displays and 4K-compatible devices hitting store shelves. Right now there are several 4K displays available; however, many (okay, all) cost more than a mid-level automobile. So while 4K is technically available to consumers, most of us, present company included, are going to have to wait. But it’s not simply a matter of waiting until prices drop; there are some discrepancies between consumer 4K and D-Cinema 4K, discrepancies that weigh heavily on me and how I choose to move forward with Love In Training. For example, consumer 4K is being brought to market at the resolution known as Quad Full HD, or QFHD. While still technically 4K, its pixel count isn’t quite up to the D-Cinema standard: it dishes out 3,840 x 2,160, or four times the pixel count of HD (1,920 x 2 = 3,840; 1,080 x 2 = 2,160). The reason for QFHD rather than true D-Cinema 4K has to do with our acceptance of (or dependence on) the 16:9 aspect ratio. QFHD not only fits, but scales proportionately to, 16:9, whereas D-Cinema 4K does not. So QFHD will be our consumer 4K resolution. Next, because 4K images are larger, a more aggressive compression scheme is needed to “shrink” file sizes down and make them practical for the home market; this means H.264 (or, later, H.265) compression, which is far more invasive than JPEG2000. Lastly, in terms of color, our current HD standard, including Blu-ray, calls for 8-bit color as opposed to D-Cinema’s 12-bit, and it also mandates the color space known as Rec. 709. Both 8-bit color and the Rec. 709 color space are drastically smaller and thus easier to “move” in a consumer market. In other words, our consumer 4K future is looking a lot like our current HD one, save maybe resolution, which will be higher.
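
If it helps to see the aspect-ratio arithmetic behind that choice, here’s a minimal sketch; nothing in it goes beyond the numbers already quoted above:

```python
# Why QFHD, not DCI 4K, maps cleanly onto consumer 16:9 panels.

full_hd = (1920, 1080)
qfhd = (3840, 2160)
dci_4k = (4096, 2160)

print(qfhd[0] / qfhd[1])      # 1.777... -> exactly 16:9, same shape as Full HD
print(dci_4k[0] / dci_4k[1])  # ~1.896   -> wider than 16:9, so it can't fill
                              #             a 16:9 panel without bars or cropping

# QFHD is an exact 2x scale of Full HD in each dimension, so every HD pixel
# maps onto a clean 2x2 block of QFHD pixels:
print(qfhd[0] / full_hd[0], qfhd[1] / full_hd[1])     # 2.0 2.0
print(qfhd[0] * qfhd[1] / (full_hd[0] * full_hd[1]))  # 4.0x the pixels
```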

How that affects Love In Training is simple: filming and working in 4K is very, very expensive. Even with the advent of camera systems such as the RED and Sony’s soon-to-be-updated 4K camcorders, the cost of doing business in 4K is dramatic. Now, Love In Training is a theatrical film, in that it WILL be shown in a theater. So logic would insist that I film in a D-Cinema-compliant format, aka 4K. However, the majority of viewers will no doubt watch Love In Training at home, in which case, even with 4K right around the corner, it doesn’t make sense to work in true 4K. If our consumer 4K standard is to be nothing more than a higher-resolution version of our current HD world, then why adopt the added expense of working in a cinema format if no one outside of cinema will ever see it? This is the question many filmmakers have to ask themselves, and one I take very seriously. On one hand I like pushing the envelope; on the other I want to maximize my budget and bring you the best story I can without going broke trying to film it.

I’ve been fortunate enough in my film career to work with true, RAW 4K images and workflows, and let me tell you: they suck. Not to mention how disheartening it can be seeing that brilliant 4K footage knowing that you’re going to essentially “squeeze” the life out of it so it will fit on a DVD or, worse, be streamable via Netflix or the like. This is why April Showers looks like, well, shit compared to the theatrical transfer. Even with consumers being able to buy 4K displays, there’s still too much carryover from our existing HD world for me to justify working in 4K. Why work in 10- or 12-bit color if you’re just going to compress it down to 8-bit at the final stages of post-production? Do you know how much hard disk space you can save by just capturing in 8-bit in the first place? The answer is: a lot. There are DSLRs out there that, when properly implemented, can capture HD with virtually no compression, meaning full 1,920 x 1,080 resolution in 8-bit, Rec. 709 color, for a couple grand versus the tens of thousands you’ll spend, just at the camera level, to film in 4K. 8-bit, Rec. 709: that’s two of the standards consumer 4K calls for. As for compression, it’s been rumored that consumer 4K will use H.264 or H.265 but bump the bitrate up to 50 Mbits per second, as opposed to the 24 to 30 we enjoy now on Blu-ray discs. 50 is good, better than 24 to 30, but many of these less expensive cameras, and even the expensive 4K ones, capture in excess of 100 to 150 Mbits per second; some go higher than that, even in HD. So regardless, I’ll still have to compress the image dramatically.
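
Here’s the rough storage math behind that “a lot” (a hedged sketch: the frame sizes, bit depths and bitrates are the ones discussed above, while the one-hour duration is an arbitrary illustration):

```python
# Uncompressed capture: how much an hour of footage weighs at each spec.
def raw_gb_per_hour(width, height, bit_depth, fps=24, components=3):
    bits = width * height * components * bit_depth * fps * 3600
    return bits / 8 / 1e9  # decimal gigabytes

print(raw_gb_per_hour(1920, 1080, 8))    # ~537 GB/hr   (8-bit HD)
print(raw_gb_per_hour(4096, 2160, 12))   # ~3,440 GB/hr (12-bit D-Cinema 4K)

# Compressed delivery: the bitrates mentioned above, for comparison.
def delivery_gb_per_hour(mbps):
    return mbps * 3600 / 8 / 1000

print(delivery_gb_per_hour(30))    # ~13.5 GB/hr (Blu-ray-class 30 Mbit/s)
print(delivery_gb_per_hour(50))    # ~22.5 GB/hr (rumored consumer 4K rate)
print(delivery_gb_per_hour(150))   # ~67.5 GB/hr (high-end acquisition rate)
```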

Which leaves us with only resolution. Native 4K displays are automatically going to have to scale any lesser resolution up to 4K in order to fill the entire field of view. If they didn’t, you’d get a small image surrounded by a lot of dead space (aka black bars) on all four sides of your TV. Since there isn’t a lot of consumer-available 4K content out there, most of what people have seen from early consumer 4K demos has actually been scaled HD content. Their reaction? AMAZING! Now, true 4K displayed on a 4K monitor will look even better, but at certain distances and on certain-sized screens the difference isn’t perceivable. Is that going to stop 4K from being available on 42- or 55-inch LED displays? No. But I’m being honest with you when I say that, from a distance of, say, six feet, you won’t be able to tell the difference on a screen that small. I’ve done tests in the past few months on screens in excess of 100 inches, using both HD and 4K projectors and viewing both HD and 4K content, and I’m telling you: at reasonable distances there is virtually no difference. Once you get to the HD/2K level, each step up in resolution becomes harder and harder to discern from the last. In truth, what people are reacting to most is actually the image’s color and light output, two things that the eye DOES perceive as making something better or worse. The problem with that truth is, consumer 4K (at this time) is not going to bring anything new to the color equation beyond what we already have.
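
That viewing-distance claim can be sanity-checked with the common rule of thumb that a 20/20 eye resolves detail down to about one arcminute. The sketch below is only as good as that assumption, and the 55-inch screen size is just an example:

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance beyond which individual pixels can no longer be resolved,
    assuming ~1 arcminute of visual acuity."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_pixels
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin_rad) / 12

# On a 55-inch 16:9 screen:
print(max_useful_distance_ft(55, 1920))  # ~7.2 ft: beyond this you can't even
                                         # resolve individual 1080p pixels
print(max_useful_distance_ft(55, 3840))  # ~3.6 ft: you must sit closer than
                                         # this for 4K's extra pixels to matter
```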

So, if 4K displays will scale HD to 4K and the rest of the HD standard is left intact, what’s the argument for filming in 4K? At this juncture, knowing that many of you will no doubt watch Love In Training at home or on a portable device, 4K doesn’t make a lot of sense. Furthermore, if you’re well-heeled enough to afford a 4K display at this time, I’d argue that by filming Love In Training in the best HD possible I’ll be giving you three-quarters of the image you need, with the last quarter taken care of on someone else’s dime (aka Sony, LG, JVC, etc.). Besides, if you’re like me and sit more than, say, five or six feet from your TV, I doubt you’d even notice that I didn’t film in 4K had I not told you ahead of time.

I still have a lot of tests to do, but I wanted to share my thoughts thus far. I know I got a little technical, but it’s important that we understand new and emerging technologies, if for no other reason than to keep a level head about it all so we don’t go broke trying to keep pace with things we maybe don’t need. As always, I thank you for your generosity, time and support! Until next time, take care and stay tuned…

Andrew

 

UPDATE: The romantic comedy, Love In Training, referenced in this post has been put on hold indefinitely. I apologize for the confusion. For more information, please read my announcement detailing the change.

  • chrisheinonen

    I agree with a lot of what you said, but one thing you overlooked is the draft standard for 4K and 8K in the home. Every 4K display coming out now is going to be outdated soon: HDMI doesn’t yet support 3D or frame rates above 30p at 4K resolution, so no current display will be able to do those once a standard is finalized. They will be stuck upconverting 1080p content, or showing 4K at 24p, which is OK for film but bad for sports/video content.

    The main benefits that are outlined in the home 4K and 8K standard are: a colorspace that is absolutely massive compared to Rec. 709 or any consumer format, 10-bit or 12-bit luma and chroma detail instead of 8-bit, and support for 4:4:4 or 4:2:2 chroma subsampling, instead of standardizing on 4:2:0. Now all of these will require more space to store them, which might mean more advanced compression is required (though that isn’t specified in the document at this point), but we’ll have to see what they use.
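
    For reference, the relative data rates of those subsampling schemes work out roughly as follows; a quick sketch using the standard J:a:b notation over a 4x2-pixel block:

    ```python
    # Relative data per 4x2-pixel block: 8 luma samples plus however many
    # chroma samples each scheme keeps (two chroma channels, Cb and Cr).
    def relative_to_444(a, b, j=4):
        luma = j * 2                  # one luma sample per pixel
        chroma = 2 * (a + b)          # Cb + Cr samples kept per block
        full = luma + 2 * (j * 2)     # 4:4:4 keeps chroma for every pixel
        return (luma + chroma) / full

    for name, (a, b) in [("4:4:4", (4, 4)), ("4:2:2", (2, 2)), ("4:2:0", (2, 0))]:
        print(name, f"-> {relative_to_444(a, b):.0%} of the 4:4:4 data")
    # 4:4:4 -> 100%, 4:2:2 -> 67%, 4:2:0 -> 50%
    ```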

    I fully agree about the resolution not being a big deal, but if a 4K standard pushes these other changes into the home, then I’m OK with it as those improvements are far more substantial and people will actually see them from where they sit. They just won’t see them if they go buy a current 4K display, though.

  • AndrewRobinsonOnline

    I didn’t bring up the 8K issue or higher frame rates because, well, I don’t want to scare folks, just give ‘em a solid overview to help enlighten them as to why I’m making the creative decisions I am with regard to the film. Honestly, there is so much still up in the air regarding 4K that it makes me cringe every time I read about a consumer 4K product. SIM2, JVC and even Sony CineAlta have it right by embracing D-Cinema specs over rumored QFHD ones. It’s easy to down-res; not always easy (or possible) to up-res. Just my 2 cents.

  • AndrewRobinsonOnline

    BTW, thank you so much for stopping by and for your comment! Appreciate it!

  • http://www.facebook.com/arbeck Andrew Beck

    You’re saying exactly what I’ve been trying to tell people for years, just in a lot better fashion. I’d much rather see lower-compression, higher-bitrate 2K files than 4K files that merely have increased resolution.

  • AndrewRobinsonOnline

    Feel free to share the post with whomever you feel needs convincing. :) I appreciate the comment and support! Thanks for dropping by.

  • rob3dwillox

    You have given me a lot to think about. However, HD now is being broadcast at very low Mbps compared to 1.5 Gbps. We are stuffing 4-6x the resolution of SD into the old SD bandwidth. By your argument, should we have moved from SD to HD? The compression isn’t always great, but I’d take it every time over SD.

  • AndrewRobinsonOnline

    Now you’re getting to the root of my bigger issue, which is that we (consumers) have yet to exploit HD to its full potential, and now we’re moving to 4K? It makes no sense to me. We compress the living hell out of our imagery, choke color and limit bandwidth just to make HD “work,” only to do the same to 4K? No sense. Truth is, no one has even seen true HD in the home. Did you know that most (if not all) HD displays are 10-bit color capable? 10-bit, that’s a lot, and yet we only feed ‘em (if we’re lucky) compressed 8-bit color. That’s a lot left on the table. Instead we suffer banding and other bad color anomalies. Ugh. Why do we continue to use the Rec. 709 color space? We could expand it in the HD realm to CIE XYZ, but we don’t. So the same watered-down HD we’ve come to enjoy will now become the same watered-down 4K, only now we’ll have more resolution, which is the least noticeable improvement of the 4K format. Just my opinion. Thanks for reading and for your comment. I appreciate it!
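
    To put numbers on the banding complaint: the sketch below assumes a subtle sky gradient spanning about 10% of the luminance range across a 1,000-pixel-wide region (an invented but typical example):

    ```python
    # More bits per channel = more gradations = narrower, less visible bands.
    GRADIENT_SPAN = 0.10   # assumed: gradient covers 10% of the channel range
    REGION_PX = 1000       # assumed: gradient is 1,000 pixels wide on screen

    for bits in (8, 10, 12):
        levels = 2 ** bits
        steps = int(levels * GRADIENT_SPAN)   # distinct values in the gradient
        print(f"{bits}-bit: {levels:>5} levels/channel, "
              f"bands ~{REGION_PX / steps:.0f} px wide")
    # 8-bit:   256 levels, bands ~40 px wide (clearly visible)
    # 10-bit: 1024 levels, bands ~10 px wide
    # 12-bit: 4096 levels, bands ~2 px wide
    ```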

  • Brian Poster

    Let’s take apart your rant piece by piece. First and foremost, one of the biggest advantages of 4K coming into the home is the image control the consumer will have with multiple viewing environments. Imagine sports as an example, where the viewer at home will be able to zoom into a particular detail on the field and maintain high image clarity when doing so. Imagine viewing a large public event such as a parade, with the viewer able to push into close-ups revealing costume detail not possible otherwise. Imagine the ability to jump online and check the news or e-mail and be able to see the information crystal clear, or when the viewer wants to split the screen with internet plus TV… the examples go on and on.

    Secondly, you mention 4K was created by the DCI? That is not true. The DCI did not invent the “standard”; in fact 4K came along well before DCI even existed, and it has nothing to do with “format wars,” which were hardware-based and not specific to any technical standard. Perhaps you are not aware that 4K can exist in many different flavors, including uncompressed, ProRes, CineForm, RAW, and the list goes on and on. Again, 4K has nothing to do with being developed by the DCI; they simply created a set of delivery and projection requirements using the JPEG2000 codec, which is yet another flavor 4K can exist in.

    Consumer 4K is being brought to market in QFHD for the very reason you are arguing against, which is broad compatibility with the consumer-level and professional camcorders that will soon be available. That is the point: to be able to shoot at that aspect ratio and screen the content without penalty, with image detail, dynamic range and color reproduction that are not possible in HD. That is a benefit, not a disadvantage. And for those shooting in 4096 4K, just like productions shooting widescreen HD and SD that still shoot for 4:3 aspect-ratio safety, the knowledgeable professionals will shoot with QFHD safety in mind. And if not, it is very easy to push into this aspect ratio in post-production from the 4096 canvas.

    “H.264 is far more invasive than JPEG2000”: actually, no it isn’t, as H.264 will incorporate long-GOP (Group of Pictures) encoding using CABAC, which will outperform JPEG2000.

    And Rec. 709’s implementation had nothing to do with “moving” the imagery; it was because of the limitations of NTSC display technology, not the broadcast signal.

    “It doesn’t make sense to work in true 4K”… Bit depth and resolution are about preserving, as best as possible, what the camera is seeing. When you start higher and go down, you are preserving the image clarity and information far better than starting much lower to begin with. By the way, what do you mean by “true 4K”? Bayer-pattern cameras such as the RED must debayer the RAW data into a usable picture, which is a very lossy process in which the image must be rebuilt to match 4096 4K, and their image sensor and color space were never producing a “true 4K” image to begin with.

    …”fortunate enough in my film career to work with true, RAW, 4K images and workflows and let me tell you they suck.” Really? You mean the one indie film you have done? And again, what do you mean by “true RAW 4K images”? RAW is not an image; it is simply raw data from the image sensor that must be processed into something before it can be viewed.

    “Why work in 10 or 12 bit color if you’re just going to compress it down to 8-bit at the final stages of post production?” The very question you are posing to yourself shows you have no clue what you are talking about. Again, when you start higher you will have a better image when you go lower; this is not only about resolution, but quantization in color reproduction and other areas plays a huge role in preserving image fidelity. If you go out and shoot a blue sky on an 8-bit codec using a DSLR you will see banding: quantization loss due to not enough information to preserve the redundancy across pixels when a high number of them comprise the same information (such as blue sky).

    “There are DSLRs out there that when properly implemented can capture HD with virtually no compression…” Really? Have you ever used a DSLR that has the ability to NOT use 8-bit, heavily compressed H.264 for video recording? And have you panned the DSLR quickly and looked at the footage? Have you heard of the term “jello effect,” where the readout of the pixels on the imager and the processing cannot handle the movement?

    And, very importantly, you completely miss the point about shooting with 4K cameras. Their image sensors are far more sophisticated than HD image sensors, with dynamic range, signal-to-noise, exposure latitude, color reproduction, etc., that HD cameras cannot come close to. Which means that when shooting with these cameras you will get a far better picture than when shooting with an HD camera, even when shooting in HD mode.

    The same principle applies to 4K televisions: aside from the native resolution, they will have far superior color reproduction, better blacks, dynamic range, etc., than HD televisions. And this ability will more faithfully display the filmmaker’s original intention in all aspects of his/her vision. Again, your arguments are weak, based largely on resolution, which is only part of the story of 4K.

  • AndrewRobinsonOnline

    Brian,

    First and foremost, thank you for reading and for your lengthy comment. Now, if I may clarify and/or retort…

    You said: “…one of the biggest advantages of 4K coming into the home is the image control the consumer will have with multiple viewing environments. Imagine sports as an example, where the viewer at home will be able to zoom into a particular detail on the field and maintain high image clarity when doing so. Imagine when viewing a large public event such as a parade, and the viewer able to push into close-ups revealing costume detail not possible otherwise? Imagine the ability to jump online and check the news, or e-mail and be able to see the information crystal clear, or when the viewer wants to split the screen with internet plus TV…the examples go on and on…”

    MY ANSWER: That would be a clever use, though no one that I have heard is discussing 4K in this way. And I’d hesitate to zoom into broadcast material, as the compression is going to suck. Think of now, only 4x worse.

    Secondly, you mention 4K was created by the DCI? That is not true. The DCI did not invent the “standard”; in fact 4K came along well before DCI even existed, and has nothing to do with “format wars,” which were hardware-based and not specific to any technical standard.

    MY ANSWER: DCI set about adopting a series of standards for the exhibition of 4K (and 2K) material. I never said they invented 4K. I used the term “format wars” to give readers a point of reference they may be familiar with.

    Perhaps you are not aware that 4K can exist in many different flavors, including uncompressed, ProRes, CineForm, RAW, and the list goes on and on. Again, 4K has nothing to do with being developed by the DCI; they simply created a set of delivery and projection requirements using the JPEG2000 codec, which is yet another flavor 4K can exist in.

    MY ANSWER: I am aware that 4K can exist in many different flavors; however, the “flavor” I was discussing in this article is the one the audience will see, either in a theater or at home. As a filmmaker, the final result is what I focus on, since that’s the only result that will matter. It’s great that I may be able to have ProRes 4K or CineForm 4K, but that doesn’t mean the audience is going to see that. In a cinema they get JPEG2000; at home they get H.264.

    Consumer 4K is being brought to market in QFHD for the very reason you are arguing against, which is broad compatibility with the consumer-level and professional camcorders that will soon be available. That is the point: to be able to shoot at that aspect ratio and screen the content without penalty, with image detail, dynamic range and color reproduction that are not possible in HD. That is a benefit, not a disadvantage. And for those shooting in 4096 4K, just like productions shooting widescreen HD and SD that still shoot for 4:3 aspect-ratio safety, the knowledgeable professionals will shoot with QFHD safety in mind. And if not, it is very easy to push into this aspect ratio in post-production from the 4096 canvas.

    MY ANSWER: I’m not arguing against QFHD; I explain clearly why QFHD is the clear and smart choice. I think you misread or misunderstood me there.

    “H.264 is far more invasive than JPEG2000”: actually no it isn’t, as H.264 will incorporate long-GOP encoding using CABAC, which will outperform JPEG2000. And Rec. 709’s implementation had nothing to do with “moving” the imagery; it was because of the limitations of NTSC display technology, not the broadcast signal.

    MY ANSWER: I’m going to respectfully disagree with you there, based on my personal experience. You are more than welcome to feel otherwise. I’m not saying H.264 is a bad compression codec; hell, it may be ideal for the majority’s needs, but that doesn’t necessarily mean I have to love it as much as you do. Rec. 709 does have to do with the limitations of NTSC, but if we can enjoy a larger color space in the pro realm, doesn’t it make sense to try and improve on that in the consumer one?

    “It doesn’t make sense to work in true 4K”… Bit depth and resolution are about preserving as best as possible what the camera is seeing. When you start higher and go down you are preserving the image clarity and information far better than starting much lower to begin with.

    MY ANSWER: Working in full 4K from start to finish is expensive in many regards, and this article was about an indie filmmaker’s perspective. If my last name were Fincher I might sing a different tune.

    By the way, what do you mean by “true 4K”? Bayer-pattern cameras such as the RED must debayer the RAW data into a usable picture, which is a very lossy process in which the image must be rebuilt to match 4096 4K, and their image sensor and color space were never producing a “true 4K” image to begin with.

    MY ANSWER: I’ve never shot on the RED. I used the Dalsa Origin II camera system on April Showers, and it filmed in a RAW file format, much like the Blackmagic Cinema Camera does today.

    …”fortunate enough in my film career to work with true, RAW, 4K images and workflows and let me tell you they suck.” Really? You mean the one indie film you have done? And again, what do you mean by “true RAW 4K images”? RAW is not an image; it is simply raw data from the image sensor that must be processed into something before it can be viewed.

    MY ANSWER: And where can I see your film? (I’ll happily watch it if you’ll point me to it.) Here you’re just being unjustly nasty. I’m not saying you cannot have an opinion, but there’s no need to make it personal. I understand from your writing above that you feel strongly about 4K, but stooping to the level of poking fun at my film career (not that you would call it that) really does take a lot of wind out of your proverbial sail. You’re better than that. I know RAW is but a format. I’m trying to be technical without getting so far into the nitty-gritty that I lose the reader’s interest.

    “Why work in 10 or 12 bit color if you’re just going to compress it down to 8-bit at the final stages of post production?” The very question you are posing to yourself shows you have no clue what you are talking about. Again, when you start higher you will have a better image when you go lower; this is not only about resolution, but quantization in color reproduction and other areas plays a huge role in preserving image fidelity. If you go out and shoot a blue sky on an 8-bit codec using a DSLR you will see banding: quantization loss due to not enough information to preserve the redundancy across pixels when a high number of them comprise the same information (such as blue sky).

    MY ANSWER: I will agree with you. In an ideal world I’d love to always work in 10- or 12-bit color; the better you start, the better you’ll end. I was trying to make a point, however (maybe I was too subtle), and let the readers know that they’re missing out due to our current standards, standards which will be carried over to consumer 4K, minus resolution of course.

    “There are DSLRs out there that when properly implemented can capture HD with virtually no compression…” Really? Have you ever used a DSLR that has the ability to NOT use 8-bit, heavily compressed H.264 for video recording? And have you panned the DSLR quickly and looked at the footage? Have you heard of the term “jello effect,” where the readout of the pixels on the imager and the processing cannot handle the movement?

    MY ANSWER: Yes, I have. Many modern DSLRs, like the Panasonic GH2/GH3 and Nikon D800/D600, have what are known as clean HDMI feeds, whereby they can capture at extremely high bit rates (near-lossless in some instances) in ProRes 422 or DNxHD via an outboard recorder. If you’re shooting to the camera’s internal card(s), then yes, you’ll have to make do with heavily compressed H.264 QuickTime files or the like. As for rolling shutter, it’s still a factor, but one that is being minimized with each passing firmware update. There are 4K cameras out there with rolling shutter issues too, so don’t misrepresent it as a DSLR-only problem.

    And, very importantly, you completely miss the point about shooting with 4K cameras. Their image sensors are far more sophisticated than HD image sensors, with dynamic range, signal-to-noise, exposure latitude, color reproduction, etc., that HD cameras cannot come close to. Which means, when shooting with these cameras you will get a far better picture than shooting with an HD camera, even when shooting in HD mode.

    MY ANSWER: Great. But again, I’ve done the tests and I’ve seen other people do them too. At certain distances there is no visible difference between HD and 4K at the consumer level, so all that brilliant capture in 4K can arguably be replicated, sans the heightened resolution, via a good HD camera. I’m not saying a DSLR, but there are a number of HD/2K D-Cinema-style cameras with very good sensors that do many of the things you describe above.

    The same principle applies to 4K televisions: aside from the native resolution they will have far superior color reproduction, better blacks, dynamic range, etc., than HD televisions. And this ability will more faithfully display the filmmaker’s original intention in all aspects of his/her vision. Again, your arguments are weak, based largely on resolution, which is only part of the story of 4K.

    MY ANSWER: My argument is that there IS IN FACT MORE to 4K than resolution; however, resolution is all the consumer is going to get, as nothing about our modern HD standard is going to change when 4K comes home. We’re still going to have H.264 compression, the Rec. 709 color space and 8-bit color. So, once again, you can capture all the brilliance you want in 4K, but at the end of the day a brilliantly captured HD image will look largely, if not exactly, the same from reasonable viewing distances. I’ve seen it. The new crop of 4K displays do not have magic powers: they don’t have better blacks (they’re all edge-lit LEDs, which suck anyway) or higher contrast (most reported contrast ratios are bullshit anyway), and as for superior color, what are you basing this on? The incoming signal dictates what the display displays, and I’m merely saying the incoming signal that is consumer 4K is not the full 4K experience. We’re better off expanding upon HD and making it even more amazing before we even try to tackle 4K.

    Thanks for reading again and thanks for your comment. I do appreciate it. Going forward I would urge you to keep the personal barbs to a minimum, as they take away from your credibility. A wise man needn’t mock nor demean his opponent. Thanks again.

  • AndreK

    You make a strong economic case, but then what about the true die-hards who would feel let down because they don’t get the true story you wish to portray in the cinema, simply because you feel that most of the viewings after its initial cinema run will be in AVI format downloaded from Vuze, and therefore you don’t film in 4K? It would be like listening to poetry being read by someone with a lisp. It would make a person think twice about going to listen to another reading.

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    I see your point; however, you’d be amazed how many so-called 4K movies are actually filmed in HD and then simply shown via a 4K cinema projector. I honestly don’t feel as if I’m cheating the audience out of anything. In fact, I endeavor to give them an HD experience that surpasses Blu-ray. You may have to download the file, or I may have to deliver it via a solid-state drive, but I want to see if I can.

    On the flip side, I shot my first film, April Showers, in 4K, and it was shown theatrically in 2K through 4K projectors. Why? The 4K workflow at that time was cumbersome, cost-prohibitive and, well, not really sussed out. So we converted the film to 2K (1920×1080) ProRes 422 and finished it that way. That 2K file was then shown digitally in theaters. The color grades for the 2K film were then applied to the 4K files and stored.
    Thanks so much for reading and for your comments.

  • heavystarch

    Very interesting article. Quite a bit of info, and very helpful in describing the coming 4K gear/films/etc. I was really looking forward to the benefits, but I hate hearing that shit might get watered down.

    Andrew – got a novice question. When I look at the settings on my PC I notice that the color is described as “True Color (32 Bit).” Is that the same color you’re describing in your article, where we currently view 8-bit with Blu-ray and the newer standards could do 10-12 bit?

    When I am watching Blu-rays I’m always disappointed by the fact that I see so much color banding in big solid-color scenes.

    I know my Samsung has “Deep Color,” which is supposedly able to display a far greater color range, but I guess Blu-ray just doesn’t do that yet.

    Thanks!

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    That “True Color (32 Bit)” setting is 8 bits each for red, green and blue, plus an 8-bit alpha channel, so it’s the same 8-bit-per-channel color I’m describing in the article. The banding you’re experiencing is due to HD’s (or your HDTV’s) lower bit-depth color, plus the compression needed/used to fit a film onto a Blu-ray disc. Deep Color is a lot like upconversion, in the sense that your display or player is attempting to take an 8-bit signal and play it back, or upscale it, as a 10/12-bit one. The problem with that is the incoming signal is still only 8-bit, so you’re not viewing a true 10/12-bit signal. Does it work? Maybe. In my experience I still see banding on Deep Color-enabled displays. Thanks for reading and for your question.

  • Abul Kalaam

    “… why adopt the added expense of working in a cinema format if no one outside of cinema will ever see it?”

    I guess history will repeat itself… because back in the 1950s and ’60s, film studios were going gaga over the 70mm format regardless of what television was then capable of delivering. As a result, the only way I can enjoy Ben-Hur or Cleopatra today is on good old VHS, with its decent 300 lines of resolution, which is actually a fraction of the quality of the 70mm format.

    But in the digital age, chances are that, if not today then in the near future, the consumer will have the same 4K digital copy as the theatrically exhibited movie, just without the large silver screen. But again, you never know!
    While the step to 4K may seem deliberate and complex, it’s the only way you can force the industry to upgrade; otherwise, 15 years from now, studios and television productions will still be stuck in 2K (like DI was for the past 15 years) while consumers may have mobile phones capable of shooting 8K.
    The industry should have upgraded to 4K prior to the advent of digital cine cameras, as it would have made the transition easy.
    And the funny thing is that although 35mm 4-perf is actually capable of more than 6K, the post-production infrastructure never exceeded 2K, except for some specific VFX work, and the release print was still 2K.
    And let’s not talk about VistaVision or IMAX!

    So the bottom line is: filmmakers will always shoot at superior resolutions for quality, regardless of post-production/exhibition/distribution capabilities!
    Now imagine if there were an original print of Ben-Hur lying around somewhere in the archives that could be scanned to 8K and released for 8K home video. I hope I can own that someday!
    So it is good to be optimistic; things will eventually come the consumer’s way.

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    All good points, but as an indie filmmaker (i.e. someone who pays for his own films) I have to find a happy medium that provides the viewer with a great visual experience while at the same time being practical for me and my crew. 4K isn’t that solution…yet. Thanks for reading and for taking the time to share your thoughts. I appreciate it.