I have a confession to make – I’m a high definition snob. I only watch movies in 1080p. I couldn’t care less if a movie is available – shudder – in standard definition (SD). As far as I’m concerned, the movie doesn’t exist until it’s in 1080p (excluding cinema releases, of course). Many friends think that I’m a hopeless geek to impose such ridiculous restrictions on a movie experience. For some colleagues it’s more important to watch the release as quickly as possible than to worry about quality. Am I narrow-minded? Is my quest for a cinema experience overrated? Who even notices those compression artifacts, mosquito noise, or frame tearing, anyway?
When Blu-ray first came out I recall friends telling me how disappointed they were by HD video. They couldn’t tell the difference from the DVD version. I anticipate the same response as we transition to Ultra HD (UHD). History is destined to repeat itself as we venture into more pixels and bigger TVs.
Consumers face a perception versus reality battle regarding image quality. It’s all about how we perceive new technology. In other words, the eyes may see better quality, but the brain is not recognizing the higher resolution. I attribute this to any combination of factors. Anything along this supply chain from creation to consumption can adversely affect video quality:
- Creation • The cameras used in production may not have been the analogue equivalent of HD. Possibly the lenses were poor quality or old film stock was used. Maybe the content was filmed with digital cameras that did not have HD sensors.
- Source • Maybe the source material wasn’t HD. Even if the content was broadcast on an HD channel, the content itself may have been SD and was upconverted (i.e. up-scaled) to HD.
- Digitization • This is where film or video tapes are converted to a digital format. Possibly the source content was not digitized properly from the master film reels. For example, Super 16mm film has the grain resolution to achieve a 1080p analog-to-digital conversion, but a digital conversion service may have used SD (576p PAL or 480p NTSC) conversion on the film stock. • Maybe a copy was digitized and the master (a.k.a. the mezzanine file) wasn’t used at all. Once the film is digitized, software is used to correct color, exposure, and audio/video synchronization. In addition, scratches, dust, and film damage are digitally removed from each frame. For some Hollywood movies this takes months of effort. Some early Blu-ray releases received bad reviews because the digital cleaning process resulted in complete removal of film grain – an aspect of movies that gives that venerable cinema feel. The digitization process has since improved to maintain the visual experience intended by the director.
- Encoding • The source content may not have been encoded properly, resulting in a substandard video transfer. Typically the analog (film) to digital (file) conversion is uncompressed. Each HD frame scanned to a file would typically occupy 6.2 MB. Taking these frames and streaming them uncompressed at 23.976 fps (frames per second, typical for a Hollywood movie) would require 1.2 Gbps (gigabits per second). This is much too large for consumer devices, so the file needs to be compressed down to a reasonable size. That’s where H.264 and the newer H.265 video compression standards come in. These codecs compress an HD movie down to 4–6 Mbps when used in an internet streaming service. That’s a reduction of 200:1!
- Transcoding • Once a video has been encoded, it can be transcoded into other formats or bitrates. If a good quality master wasn’t used in this process, then the result is GIGO (garbage in, garbage out). If the video was transcoded several times before reaching the viewer, then image quality will have degraded at each transcoding step. In the early days of H.264 encoding, engineers were not versed in all the encoding intricacies, which resulted in sub-optimal video output. This same learning curve is beginning again with H.265.
- Supply • There are many intermediaries taking content from one provider and handing it over to another. This applies to both broadcast television as well as internet delivery. Hardware, software, and interconnections anywhere along this ‘supply chain’ can compromise the quality of the signal. Over the internet this involves any combination of routers, switches, firewalls, cabling, streaming or caching servers.
- Delivery • The delivery mechanism – Internet, Over the Air (OTA) broadcast, cable, or satellite – may have degraded the signal due to congestion, latency, dropped packets, interference, or lack of quality of service (QoS) contingencies. Don’t expect the same quality from a streaming service as from Blu-ray. Streaming services use much lower bitrates for HD quality (typically up to 4 Mbps for 720p or 6 Mbps for 1080p), whereas Blu-ray will use well over 16 Mbps.
- Screen • The consumer may not have a TV that is good enough to show the improved resolution of HD or 4K. TVs with less than a 30” diagonal were too small to showcase HD, and cathode ray tubes (CRTs) were a complete waste of time for HD. Likewise, it will be hard to see the advantages of 4K video on screens smaller than 42”, unless the viewer is sitting right in front of the screen. The distance of the viewer from the screen also has an effect on perceived quality. The two-foot computer viewing distance can afford smaller pixels and screens, compared to the ten-foot distance typical of TV viewing. As the viewer sits farther from the screen, pixels begin to blend together, and the advantage of higher pixel densities is lost.
- Viewer • Maybe the audience member doesn’t have 20/20 vision?
- Bias • This is often overlooked when evaluating something new. There is an inherent bias in each viewer when experiencing something that they have never seen before. Did the person hope for the video to be better, before they saw it for the first time? Were they indifferent? Maybe they were part of a grumpy generation that couldn’t care less? Pre-established bias plays a role in how we react to new technology. Understanding these biases in advance helps to filter the opinions of others.
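The arithmetic in the Encoding section is easy to verify. Here is a back-of-the-envelope Python sketch of the figures quoted above – the 6.2 MB uncompressed frame, the 1.2 Gbps uncompressed stream, and the roughly 200:1 compression ratio at typical streaming bitrates:

```python
# Back-of-the-envelope math for uncompressed 1080p video, matching the
# figures quoted in the Encoding section above.

WIDTH, HEIGHT = 1920, 1080      # 1080p resolution
BITS_PER_PIXEL = 24             # 8 bits each for R, G, B
FPS = 23.976                    # typical Hollywood frame rate

# One uncompressed frame, in bytes and megabytes
frame_bytes = WIDTH * HEIGHT * BITS_PER_PIXEL / 8
frame_mb = frame_bytes / 1_000_000              # ≈ 6.2 MB per frame

# Streaming those frames uncompressed, in bits per second
stream_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
stream_gbps = stream_bps / 1_000_000_000        # ≈ 1.2 Gbps

# Compression ratio relative to a typical 6 Mbps streaming bitrate
target_mbps = 6
ratio = stream_bps / (target_mbps * 1_000_000)  # ≈ 200:1

print(f"Frame size: {frame_mb:.1f} MB")
print(f"Uncompressed stream: {stream_gbps:.2f} Gbps")
print(f"Compression at {target_mbps} Mbps: {ratio:.0f}:1")
```

The same arithmetic scales directly to 4K: four times the pixels means four times the uncompressed bitrate, which is why H.265’s improved compression matters so much for UHD delivery.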
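The viewing-distance argument in the Screen section can also be made concrete with a little trigonometry. This is my own illustrative sketch, assuming the commonly cited 20/20 acuity benchmark of about one arcminute per pixel (roughly 60 pixels per degree); the 42-inch screen and ten-foot distance come from the text above:

```python
import math

# How many pixels a screen packs into one degree of visual angle.
# Beyond roughly 60 pixels per degree (the usual 20/20 acuity figure),
# additional pixels blend together and extra resolution is wasted.

def pixels_per_degree(diagonal_in, h_pixels, v_pixels, distance_ft):
    """Horizontal pixels subtended per degree of visual angle."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / h_pixels                  # width of one pixel
    distance_in = distance_ft * 12
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / deg_per_pixel

# A 42" screen viewed from the ten-foot living-room distance:
hd = pixels_per_degree(42, 1920, 1080, 10)
uhd = pixels_per_degree(42, 3840, 2160, 10)
print(f"1080p: {hd:.0f} ppd, 4K: {uhd:.0f} ppd")
```

At ten feet, a 42-inch 1080p screen already exceeds the ~60 ppd acuity threshold, which is why the extra 4K pixels only become visible on larger screens or at shorter viewing distances.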
It’s as though all the planets need to align before we can enjoy 4K video. This is certainly the case with many technological breakthroughs. Equipment, technologies, and processes along the entire supply chain need to be upgraded to ensure an optimal viewing experience.
In the case of 4K, some people will simply not perceive the higher resolution – at least not initially. Even if the technology from source to viewer has the ability to showcase 4K, some won’t immediately see an improvement in quality. When I displayed Blu-ray content for the first time on my monitor, I couldn’t immediately see the benefits. My personal bias was to rant to my friends about how amazing the video was, because I wanted it to be better – but deep inside I was underwhelmed. It took a few weeks of consistently watching HD before I acclimated to the resolution. My brain gradually adjusted to the additional pixels and sharper picture, but it wasn’t until I looked back at standard definition that I realized how my perception had changed.
My 4K video doesn’t actually suck – mainly because I don’t have a 4K TV yet. But I loved what I saw when 4K and UHD were showcased at a number of Media and Entertainment exhibitions these past two years. That may have been my internal bias talking, of course.
Each generation has incrementally higher expectations on new technology. It’s funny to think that maybe my kids will one day say, “Dad, this 4K video sucks. Don’t you have the movie in 8K?”. The grumpy generation would be quick to react, “When I was your age…”. In the meantime, I can’t wait to be a 2160p snob.
Is perception also reality for 4K video? Would you recognize 4K quality when you see it for the first time? What ultimately affects video quality, and how do we perceive these incremental improvements? This article explores the challenges that the industry faces in delivering 4K UHD video to the masses, and the bias that consumers face when a new technology enters the market.
• About Gabriel Dusil
Gabriel Dusil was recently the Chief Marketing & Corporate Strategy Officer at Visual Unity with a mandate to advance the company’s portfolio into next generation solutions and expand the company’s global presence. Before joining Visual Unity, Gabriel was the VP of Sales & Marketing at Cognitive Security, and Director of Alliances at SecureWorks, responsible for partners in Europe, Middle East, and Africa (EMEA). Previously, Gabriel worked at VeriSign & Motorola in a combination of senior marketing & sales roles. Gabriel obtained a degree in Engineering Physics from McMaster University in Canada and has advanced knowledge in Online Video Solutions, Cloud Computing, Security as a Service (SaaS), Identity & Access Management (IAM), and Managed Security Services (MSS).
4K, Broadcast, Connected TV, Digital Video, DRM, Gabriel Dusil, H.264, H.265, HEVC, Internet Video, Linear Broadcast, Linear TV, Multi-screen, Multiscreen, New Media, Online Video, Online Video Platform, OTT, Over the Top Content, OVP, second screen, Smart TV, Social TV, TV Everywhere, Ultra HD, Ultra High Definition
 Mosquito Noise, pcmag.com, http://www.pcmag.com/encyclopedia/term/55914/mosquito-noise
 Ultra HD, Wikipedia, http://en.wikipedia.org/wiki/Ultra-high-definition_television
 Resolution of Super 16mm film, cinematechnic.com, http://www.cinematechnic.com/super_16mm/resolution_of_super_16mm.html
 What is Encoding and Transcoding?, By Jan Ozer, 20 April 2011, streamingmedia.com, http://www.streamingmedia.com/Articles/Editorial/What-Is-…/What-is-Encoding-and-Transcoding-75025.aspx
 (1920×1080 pixels × 24 bits of color per pixel) / 8 bits per byte = 6.2 MB (megabytes)
 Mbps, megabits per second.
 Over the Air, http://en.wikipedia.org/wiki/Over-the-air_programming