The impending release of UltraHD or UHD (aka consumer 4K) has many enthusiasts both excited and confused. Those who are excited believe UltraHD will solve whatever problem they think they’re having at the display level, while others look at it and ask: what’s it all about, or perhaps what’s it all for? More people probably ask me about UltraHD and 4K these days than about any other topic facing the specialty and even pro video space. Sadly, I don’t have all the answers, and those who should (i.e. the manufacturers and technology creators) are being purposely vague. Why? Because there is still so much left to determine. So what do we know?

We know this: UltraHD/UHD/4K is coming to our homes, with displays capable of such resolutions supposedly available as early as summer, though more realistically late fall or early winter. First-generation sets are bound to be cost prohibitive for the masses, with prices ranging from just over $10,000 to over $30,000. Many of the displays will be large, 70-plus inches diagonally, though most will likely not exceed 85 inches at first. When 40-60 inch displays begin to hit the market the prices should drop dramatically, as the average TV size in the US is NOT 70-plus inches but rather somewhere in the vicinity of 47. But do you need it?

To answer that question you must understand the difference between HD and UltraHD/UHD/4K. First, UltraHD, UHD and 4K are all basically the same thing. Keep in mind we’re discussing UltraHD/4K ONLY as it pertains to the consumer marketplace right now, so yes, UltraHD, UHD and 4K are, in truth, going to be the same. And that is not where the similarities stop. For starters, let’s look at HD, but rather than break down all of HD’s many variations, let’s focus on its best, which is Blu-ray.

Blu-ray is a high-definition disc format capable of holding between 25 and 50GB. There are larger-capacity Blu-ray discs coming to market; however, they’re predominantly found in the computer markets, not the specialty AV ones. So for the sake of this discussion we’re going to stick with the consumer form of Blu-ray. Blu-ray is an HD format, arguably HD’s best, and as such has a maximum resolution of 1,920 x 1,080 pixels. Frame rates can reach up to 60 frames per second (59.94 to be exact), though the common frame rate for movies is 24p, or 24 frames per second progressive. Bit rates can reach as high as 48Mbit/second, though they’re most commonly limited to 36Mbit/second, which is sufficient, not to mention stable. Audio plays a role in this equation too, as Blu-ray can encapsulate lossless audio codecs such as Dolby TrueHD and DTS-HD Master Audio, but assuming the UltraHD/UHD/4K format will too, let’s set audio aside for the moment.

Getting back to the Blu-ray standard, the compression codecs most commonly used are H.262/MPEG-2, H.264/MPEG-4 and VC-1. You don’t have to know what these mean, though know that H.264/MPEG-4 is arguably the most common and will play a role later in this discussion. Lastly, the format itself is an 8-bit format, with a color space known as Rec. 709. So, in summation, Blu-ray is: HD (1,920 x 1,080) at 24 frames per second, with up to 36Mbit/second data rates, using H.264/MPEG-4 encoding with Rec. 709 color sampled at 8-bit 4:2:0, all on a 50GB disc. That’s Blu-ray as it stands today.
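
If you want to sanity-check those numbers, a quick back-of-the-envelope calculation (a Python sketch of my own, assuming a 90-minute film running at the common 36Mbit/second ceiling) shows why such a stream fits comfortably on a 50GB disc:

```python
# Rough file-size estimate for a Blu-ray video stream.
# Assumes a 90-minute film at the common 36 Mbit/s cap (illustrative only).
bitrate_mbit = 36            # Mbit per second
runtime_s = 90 * 60          # 90 minutes, in seconds

total_bits = bitrate_mbit * 1_000_000 * runtime_s
total_gb = total_bits / 8 / 1_000_000_000   # bits -> bytes -> gigabytes

print(f"{total_gb:.1f} GB")  # ≈ 24.3 GB, well under a 50GB disc
```

Real discs also carry audio, subtitles and menus, so the actual budget is tighter, but the headroom is clearly there.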

Now, it is important to understand that many of the above-referenced variables are independent of one another. For example, resolution, i.e. pixel count, has ZERO influence over color. However, compression, i.e. H.264/MPEG-4, can affect transfer rates, capacity, etc., making compression a far more important variable than, say, resolution. I mention this for two reasons: one, current UltraHD marketing is hammering home the importance of resolution, and two, everything moving forward in the video realm, from capture to exhibition, is going to be about compression. With that being said, let’s look at UltraHD.

UltraHD is the consumer format of D-Cinema or Cinema 4K. Since the consumer format is but a vague facsimile (at present) of true Cinema 4K, let’s not waste too much time discussing Cinema 4K and instead focus upon UltraHD/UHD/consumer 4K. Here’s what we know so far: UltraHD is four times the resolution (not quality) of HD, meaning it has a native resolution of 3,840 x 2,160 pixels. The reason for UltraHD’s perfect four-times pixel count has to do with the 16:9 aspect ratio, which allows HD (1,920 x 1,080) to scale proportionately to UltraHD’s 3,840 x 2,160. It’s clean. It’s simple. UltraHD can also encompass the even higher resolution known as 8K, but let’s ignore that for now as it is even further away than consumer 4K. From here things become a bit more vague, at least at this stage in the game. There is the possibility that UltraHD/4K, in the consumer realm, could include a broader color space, greater frame rates and higher bit depths. I say it could; however, at present, all signs are pointing to it not doing those things for one reason and one reason alone: file size.
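
For the skeptical, that “perfect four times” claim is easy to verify yourself; a small Python sketch:

```python
hd = (1920, 1080)
uhd = (3840, 2160)

# Each dimension exactly doubles, so the pixel count quadruples...
assert uhd[0] == hd[0] * 2 and uhd[1] == hd[1] * 2
assert uhd[0] * uhd[1] == 4 * (hd[0] * hd[1])

# ...and the 16:9 aspect ratio is preserved, so HD scales cleanly to UltraHD.
assert hd[0] / hd[1] == uhd[0] / uhd[1] == 16 / 9
```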

Increasing the number of pixels, say four times as many, does increase your file size, but increasing the number of pixels while also improving color space, bit depth and sample rates will dramatically increase file sizes. A 90-minute HD film encoded onto a Blu-ray disc may be but 20GB, but that same film in UltraHD with more color, higher bit depth and sample rates can easily approach 250GB to even a terabyte or more, depending upon compression. See, compression is everything.
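
To see why the numbers balloon like that, compare the raw, uncompressed data per frame. The little helper below is purely illustrative (the function name is mine, not any standard’s), but the ratio it spits out is real:

```python
def raw_frame_bits(width, height, bit_depth, samples_per_pixel):
    """Uncompressed bits in one video frame.
    samples_per_pixel: 1.5 for 4:2:0, 2 for 4:2:2, 3 for 4:4:4."""
    return width * height * bit_depth * samples_per_pixel

hd_frame  = raw_frame_bits(1920, 1080, 8, 1.5)   # HD as on Blu-ray: 8-bit, 4:2:0
uhd_frame = raw_frame_bits(3840, 2160, 10, 3)    # UltraHD "dream spec": 10-bit, 4:4:4

print(uhd_frame / hd_frame)  # 10.0 -> ten times the raw data per frame, not four
```

Four times the pixels plus deeper color plus full chroma sampling compounds quickly, which is exactly why compression carries the load.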

Compression like H.264/MPEG-4 makes it so that even an HD film can fit onto a reasonably sized Blu-ray disc. It is also the reason (or culprit) behind why we’ve never seen HD in its full glory. In truth, HD, at least at the capture stage, can also encompass a larger color space, higher bit depth and better sampling, not to mention higher data rates. The reason the HD “standard” is what it is, or is what is listed above, has to do with file size and what we can reasonably store and/or play back. Throw in the concept of streaming and things become even crazier, despite so-called HD streaming coming by way of 2 to 5GB files. In truth, any image can be called HD (or even 4K) so long as it measures 1,920 x 1,080 pixels, regardless of what those pixels actually contain. See how resolution is less of a factor than everything else that theoretically should accompany it?

Getting back to UltraHD for a moment: while the UltraHD “standard” can encompass more, it doesn’t mean we’ll necessarily get it. For example, early UltraHD displays, while having 10-bit capable panels, are still, at present, restricted to receiving an 8-bit signal a la Blu-ray, meaning all color will have to be upsampled to 10-bit versus arriving natively that way. Now, before you go believing that upsampled 10-bit color is better than straight 8-bit, know that many modern or current HD displays are also 10-bit panels; therefore you’re already seeing and/or living with the “effect.” So even if you buy an UltraHD set tomorrow (or months from now), what with all its “more color” claims, it’s not going to be any different from what you’re seeing now, as the incoming signals will be but 8-bit. Chroma sampling is also likely to remain the same, though it could improve from 4:2:0 to 4:2:2; I doubt we’ll ever get to 4:4:4 at the consumer exhibition level. If what I just said is Greek, don’t worry about it, as 4:2:0 and even 4:2:2 are, in truth, sufficient.
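
If you’re curious what those 4:2:0/4:2:2/4:4:4 figures actually cost in storage, here’s a rough sketch (the per-pixel sample counts are the standard luma-plus-shared-chroma averages; the savings math is mine):

```python
# Average samples stored per pixel (one luma sample plus shared chroma samples).
SAMPLES_PER_PIXEL = {
    "4:4:4": 3.0,   # full chroma: one Cb and one Cr per pixel
    "4:2:2": 2.0,   # chroma halved horizontally
    "4:2:0": 1.5,   # chroma halved horizontally and vertically
}

for scheme, spp in SAMPLES_PER_PIXEL.items():
    savings = (1 - spp / 3.0) * 100
    print(f"{scheme}: {savings:.0f}% smaller than full 4:4:4")
```

In other words, 4:2:0 halves the raw color data relative to 4:4:4, and in practice most viewers never notice, which is why it has survived this long.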

We know compression is changing, but not necessarily for the better, at least not for those holding out hope for a video equivalent of, say, Dolby TrueHD or DTS-HD Master Audio. Compression is evolving and becoming more complex in order to allow larger files to fit into smaller spaces. While compression can allow for increased resolution and also greater bit depths, something has to be sacrificed in order to ensure that file sizes are manageable enough to play back smoothly via disc or, better yet, streamed via the Internet. One of the victims of compression is often the bit rate. Remember, Blu-ray has a bit rate of up to 36Mbit/second. Via streaming we see those figures fall to as low as 5 to 7Mbit/second. That’s a lot of compression, though there is a lot of HD streaming that looks brilliant, so it can be done. And it is being done, and will be done to UltraHD, for with the ratification of H.265 just this past week we’re going to be seeing some very complex compression coming our way very soon. Now, to keep things flowing smoothly, it is my opinion that more than just bit rates are going to get chopped. You see, it is a give and take, and while it’s great that manufacturers are promising the Earth, Moon and stars, reality is often very different.
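
To put those streaming figures in perspective, compare them against the raw data rate of an uncompressed 1080p/24, 8-bit, 4:2:0 signal (an illustrative sketch of my own, ignoring audio and container overhead):

```python
# Raw bit rate of uncompressed 1080p24, 8-bit, 4:2:0 video.
# 1.5 = average samples per pixel under 4:2:0 chroma subsampling.
raw_mbit = 1920 * 1080 * 1.5 * 8 * 24 / 1_000_000   # ≈ 597 Mbit/s

for label, mbit in [("Blu-ray at 36 Mbit/s", 36), ("HD stream at 5 Mbit/s", 5)]:
    print(f"{label}: roughly {raw_mbit / mbit:.0f}:1 compression")
```

A 5Mbit/second stream is squeezing the raw signal by roughly two orders of magnitude, which is why codec efficiency, not pixel count, is where the real battle is fought.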

With streaming becoming more and more prevalent, the need for smaller file sizes will rule the day. So back to my earlier question: do you need UltraHD/4K? Using the argument of only four times the resolution of HD, you do not need UltraHD or 4K. No one is seeing pixels with HD viewing; if anything, they’re reacting poorly to compression artifacts, not pixels. For those who say you need it for big-screen viewing, I still say no, for how big a screen do you need when many of us are still seeing 2K (1080p) movies projected digitally in our local cinemas? That’s a BIG screen, and yet I don’t hear many complaining. Is more resolution better? Sure, but so are more light output and higher contrast, two things our eyes perceive more easily than resolution and two items that have nothing to do with either the HD or UltraHD/4K standard, meaning they’re the same across the board. The jury is still out regarding whether or not we’ll ultimately get more color with UltraHD/4K, but for now (and even the foreseeable future) it’s no different than HD. And if H.265 is to become the prevalent compression scheme, then we can expect images to look (largely) the same but take up less space, i.e. be brought to us with even greater compression.

I hope this helps shed some light upon our impending UltraHD/4K future as it pertains to a consumer format. Obviously there are still a great many things that need to be worked out, though it’s easy to see how compression and streaming are going to play a huge role in what’s to come. The important thing to remember is not to get caught up in the more-is-more argument, for much of what is being touted as “fact” is anything but. Manufacturers are often doing apples-to-oranges comparisons and saying the differences are the direct result of UltraHD/UHD/4K’s superiority, which simply isn’t true. It’s not that UltraHD/UHD/4K can’t actually be better, it can; it’s just not that simple, as more and more of our video content needs to be compatible with and/or available in far more complex and compact formats. That’s the real truth.

As always I thank you so much for reading and for your continued support. Until next time, take care and stay tuned…

Andrew

  • BigPines

    If people want to stream, they are always going to have lower quality. It isn’t UltraHD’s fault, it is a bandwidth/infrastructure problem. UltraHD on disc is going to look better than Blu-ray. We will have a larger capacity disc for UltraHD and we will see an improvement in the subjective quality of compression. The result will be improved picture – I *CAN* see pixels on my front projection set-up BTW and I am looking forward to them going away. I’m not saying 10-bit color wouldn’t be nice – I haven’t given up on that yet. High Frame Rate will likely be part of the spec. There is a lot to look forward to. It is still very early. We’ll see what happens.

  • AZ-J

    4K is here, because they want to upsell 4K before 8K

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    First and foremost I thank you for reading and for taking the time to comment. As per your comment we will have to wait and see as there is A LOT left up in the air. Stay tuned I guess. Thanks again!

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    And what of 16K? It will never really end but I thank you so much for reading and for leaving a comment!

  • Electric_Haggis

    Great article!!
    However, I must disagree about a few things.
    Jaggies on sharp graphics are often pretty easy to spot even on a 40-inch TV from just a few feet away with 1080p. My 110-inch screen with BenQ W6000 projector is mercilessly revealing, and the pixel grid is often there to see if you look for it, albeit VERY subtle.
    But far more “bothersome” is 4:2:0 colour compression and 8-bit artifacting, and this is THE significant difference I now see when comparing to commercial theatres.
    Like you say, 4K just isn’t there unless it’s at least 4:2:2 and/or 10-bit. Let’s hope that happens!
    But then….. where is the source material? 95% of films are finished in 2K Digital Intermediate, even if they’re shot in 5K, 4K or 3.2K (the best the Arri Alexa can do in RAW). There are, what?… 20 or 30 films at most that can be called “true 4K” as they were finished that way? If all the IMAX and 70mm films are re-transferred, then there’s some more.
    If you release all the existing 2K films on UltraHD, then 4:2:2 and 10-bit suddenly become more crucial, as that’s where the only significant improvement will be!

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    Thanks Haggis. I agree with most everything you stated, though I argue we likely won’t see true 10/12-bit, 4:2:2, for even at the manufacturer demo level, with thousands if not millions of dollars spent on one-off materials, they’re still not showing 4K, 10-bit, 4:2:2 but rather QFHD, 8-bit, 4:2:0. This after telling everyone the footage they’re experiencing was shot using a state-of-the-art 4K cinema camera capturing in RAW, 12-bit, 4:4:4 or the like.

    I don’t slight the manufacturers or industry for sticking with 8-bit, 4:2:0, for, while not ideal, it represents the best balance between quality and convenience. And let’s not split hairs: it looks pretty damn good. I just finished watching Trouble With the Curve on my 120-inch screen using a SIM2 DLP and my experience was nothing if not completely cinematic. The reason I’m hard on the manufacturers is their insistence that what they’re selling is something revolutionary and different when it truly isn’t. If they’d just be honest with people and let things progress at a rate that is more realistic and manageable, we might actually get somewhere that isn’t just a stopgap on our way to a so-called promised land. 4K isn’t even out and already the industry is telling us “just wait until 8K.”

    Thanks so much for reading!

  • Hogues

    Great article. thanks for taking the time to explain this. As someone who likes audio as much, if not a bit more, than video, I’ve been worried about the future of media. I hope that audio quality does not get sacrificed for video.

  • http://twitter.com/ARobinsonOnline Andrew Robinson

    I don’t think it will too much (if at all), for if you’re happy with what you have on Blu-ray, then I don’t expect that to move backwards, though it may stay the same. As for streaming audio, it too will get better (I hope) and eventually we’ll find a happy medium. I don’t think downloads, streaming and/or music services are going away anytime soon. Thanks for reading, I appreciate it!