Perhaps you may be interested in new or improved video technology (i.e. a better TV). Unless you are among the majority who just go to a store and buy what the salesperson wants to sell you, you will likely have some questions to help you research and understand what is available, and the various trade-offs. But before Q&A time, perhaps indulge in a bit of history and terminology as helpful context.
Background and Terminology
There have been three primary quality designations since the inception of television: standard definition (SD), high definition (HD), and the latest ultra-high definition (UHD). These quality designations reference the static resolution of the displayed images (number of pixels in an image).
An SD world evolved at the start of the TV era. For the past two decades, we have lived in an HD world, bringing along compatibility with SD as well. HD was a large jump in quality from the original SD world. Most folks consider HD the current ‘sweet spot’, because its quality is a good match for our visual perception ability, the video distribution channels largely support HD, and HD is mature technology, benefiting us with trouble-free usage and commodity pricing. UHD is where HD was 12 years ago, and should soon become an updated ‘sweet spot’.
Video signals represent an image by a sequence of pixels. The pixels are arranged in rectangular grids, forming a sequence of discrete frames, each consisting of a set of horizontal lines of pixels. A frame comprises one complete displayed image; thus frame means image. The product of the number of lines in a frame and the pixel length of each line determines the image’s static pixel resolution (image pixel size). Thus 640/480 defines a frame consisting of 480 lines, each line having 640 pixels.
The ratio of number of pixels per horizontal line divided by the number of horizontal lines in a frame determines the image’s form factor:
- 4:3 standard USA TV (640/480, the norm for the last 6 decades)
- 16:9 wide-screen TV (e.g. 1280/720 for typical high definition (HD); UHD uses the same form factor)
- 21:9 cinema ultra-wide screen TV
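Since these figures assume square pixels, a form factor can be recovered from any resolution by reducing the width:height ratio to lowest terms. A minimal Python sketch (the function name is my own, for illustration):

```python
from math import gcd

def form_factor(width: int, height: int) -> str:
    """Reduce a pixel resolution to its aspect-ratio form factor,
    assuming square pixels."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(form_factor(640, 480))    # SD  -> 4:3
print(form_factor(1280, 720))   # HD  -> 16:9
print(form_factor(3840, 2160))  # UHD -> 16:9
```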
Refresh Rate and Scan Type
Each video signal has a refresh rate, the number of frames per second (fps) shown on the display. The refresh rate is appended to the static resolution notation. Thus, SD video at 30 fps is 640/480/30, which is normally abbreviated 480/30 (480 lines at 30 fps).
When all lines in a frame are refreshed at each refresh interval, it is called progressive scan (p). Less capable transmission technology splits a frame into two fields, consisting of odd and even lines, refreshing all the lines in a field at each refresh instant; it is called an interlaced scan (i). For example, 480/60i means 240 odd lines are refreshed, then 240 even lines; its frame refresh rate is half that of progressive scan (60i = 60 fields per second = 30 fps).
One will normally drop the fps designator, understanding that in most cases, interlaced format scans at 30 fps and progressive at 60 fps.
A signal’s temporal resolution, the number of pixels the display presents to the viewer each second, is equal to the static resolution times the refresh rate. Thus, a signal called 480/60i offers the same temporal resolution as 480/30p.
Let’s consider a standard HD video signal with 720 horizontal lines refreshed 60 times a second, designated 720/60p. In a 16:9 HD form factor, each line will have 1280 pixels. The HD temporal resolution is therefore 720 x 1280 x 60, or ~55 Mpixels/sec. By comparison, the temporal resolution of 1080i is ~62 Mpixels/sec (1920 x 1080 x 30), only 12% better than 720p. Just as with digital audio (bits per second), the video pixels per second determine video quality, where higher is better fidelity.
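These pixels-per-second figures are simple arithmetic; a short sketch can reproduce them (assuming, as above, that interlacing halves the effective frame rate):

```python
def temporal_resolution(width: int, height: int, fps: int) -> int:
    """Pixels per second = static resolution x refresh rate."""
    return width * height * fps

# 720/60p: full progressive frames at 60 fps
hd_720p = temporal_resolution(1280, 720, 60)    # 55,296,000 ~ 55 Mpx/s
# 1080/60i: interlacing delivers complete frames at only 30 fps
hd_1080i = temporal_resolution(1920, 1080, 30)  # 62,208,000 ~ 62 Mpx/s

print(f"1080i advantage over 720p: {hd_1080i / hd_720p - 1:.1%}")
```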
I need a new TV. What should I get?
As we approach the year 2020, UHD is becoming a mainstream technology. It is clear that all TV screens will soon be capable of native UHD display. Thus trouble-free usage and commodity pricing will soon benefit us in the UHD world as well. Extreme bargain hunters may still be able to buy an HD TV, and its display quality will satisfy all but the visual connoisseurs among us. But the best advice I could give is to buy a native UHD set and enjoy a bit of video candy.
An increase in video quality in one’s equipment means that at first availability, little content in the higher quality definition will exist. We will watch old-format content ‘upscaled’ to the new quality. Now, HD upscaled to UHD will appear to us only slightly better than the HD original. But that will be nothing short of excellent. So don’t discard those Blu-Ray movies. And one will enjoy the upscaled HD video broadcasts that will be continuing for several years yet, as we wait for UHD broadcast video; converting broadcast TV to UHD is an expensive proposition that also involves development of new broadcast standards.
Currently, there is a small kernel of proper UHD content available. Some streaming services (video over internet), such as Netflix and Amazon Prime, are mastering new shows in UHD. Unfortunately, this content is mainly targeted to titillating young minds, with dramatic substance not yet embraced by UHD, so the mature audience will still remain in a content wasteland for a while.
Newer video game consoles output UHD content. And film technology can easily be remastered in UHD, guaranteeing new UHD releases of some classics. But newer digital movie formats will not be easily remastered to UHD, particularly if they employ digital effects. But no fear, the upscaled HD quality will still be a treat.
Going to a UHD screen should not impact the other components in your media systems, for 4K content will come from multiple HDMI-connected 4K sources, and from the Internet; these pipes already have sufficient capacity to handle the extra bits of resolution. All will be well if one has enough HDMI inputs on the UHD smart TV to accommodate all UHD devices in your system. Eventually, we may need a 4K settop box if we want to use Cable TV service to receive over-the-air 4K video when it arrives. How compressed broadcast 4K signals will appear, compared to current HD broadcast, remains to be experienced. A UHD Blu-Ray player will be a further upgrade; they are under $300 with Dolby Vision HDR, and dropping fast.
The main driver for more pixels presumes one is able to see the new pixels, which means getting very close unless the screen is VERY large. In a typical room with typical viewing distances (8′-12′), the eye will not detect the discrete UHD pixels at any screen size less than 100″ diagonal. This would seem to render UHD technology a mere marketing gimmick for 99% of potential users.
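That 100″ figure can be sanity-checked with a back-of-envelope calculation, using the common assumption that a 20/20 eye resolves detail down to about one arcminute. The function below is my own illustrative sketch, not an industry formula:

```python
import math

def max_resolvable_distance_ft(diagonal_in: float, h_pixels: int = 3840) -> float:
    """Farthest viewing distance (feet) at which a 20/20 eye (one
    arcminute of acuity) can still resolve individual pixels on a
    16:9 panel of the given diagonal size."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from diagonal
    pixel_in = width_in / h_pixels                   # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12      # inches -> feet

for size in (55, 65, 77, 100):
    print(f'{size}" 4K panel: pixels resolvable only within ~'
          f'{max_resolvable_distance_ft(size):.1f} ft')
```

Even a 100″ 4K panel puts that limit around 6.5 feet, comfortably inside typical 8′-12′ viewing distances, which is consistent with the claim above.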
But the extra pixels do more than just show up. They provide processing headroom that will shift some artifacts such as aliasing out of our visual range, making for a generally cleaner picture. Another benefit is the accompanying UHD standard. Its Premium sub-grouping specifies High Dynamic Range (HDR) for amazing contrast, 120 Hz pixel refresh rates for improved latency and realistic motion capture, and 10-bit color to support additional gamut and color volume (intensity), providing much truer color than is possible by meeting only the HD standard.
HDR with color enhancement is the most advantageous of the UHD-Premium features, having noticeably more impact on image quality than the extra resolution of 4K. HDR means much improved contrast capability, the ratio of the luminance of the brightest pixel to that of the blackest pixel. In SD times, a ratio of 500:1 was typical of CRT screens. In prior generation HD display technology, a native contrast ratio of 5K:1 was good, and plasma’s 10K:1 was excellent. Now the best technology gives a native contrast ratio of >100K:1. Combined with the color improvements, the latest imaging technology provides highly realistic viewing.
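One way to make these contrast ratios concrete is to express them in photographic stops (doublings of luminance); a quick sketch of the arithmetic, using the figures cited above:

```python
import math

# Native contrast ratios from the text, converted to photographic stops
ratios = {
    "CRT (SD era)":      500,
    "good HD LCD":     5_000,
    "plasma":         10_000,
    "best OLED":     100_000,
}
for name, ratio in ratios.items():
    print(f"{name:>13}: {ratio:>7,}:1  = {math.log2(ratio):4.1f} stops")
```

So the jump from a good HD-era panel to today’s best is roughly four additional stops of usable dynamic range.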
So yes, the UHD-Premium basket of good things is unquestionably a worthwhile enhancement for all consumers. The operative word is realism. It’s now available. All the goodies have been placed in one UHD basket because that is where technology is going; one standard is better than multiple standards; it is more cost effective for manufacturers to design one target range of competitive products, than to approach the market in piecemeal fashion.
OK, I want UHD-Premium technology. What size screen should I get?
Yes, size matters to perceived image quality at distance; bigger gives more viewing distance choices. UHD screen sizes initially are settling on 55″, 65″, and 77″. These sizes all anticipate a large viewing space; smaller screens may materialize if a market for them is found. As a practical matter, size enters the purchase equation in two ways. You aren’t going to build a new room, so your current viewing space is a constraint. Buy the biggest screen that doesn’t degrade the look and ambience of your space. If space is no constraint, buy the largest screen you can justify on a value basis. In my case, that was 65″. The 77″ list price was 4X as much as the 65″ selling price, so the question of whether it would fit never became an issue with me.
Should I get LCD or OLED technology?
There are three current technologies for video display: projection, LCD, and OLED. Projection is an acquired taste beyond the mainstream topics discussed here. Most TVs today have LCD displays. LCD is less capable and cheaper than OLED, but advanced LCD backlight dimming technologies are closing the gap in image quality (and price). Yet it does not seem possible to view the best LCD image alongside the best OLED image and not to prefer the latter. For my taste, OLED is our new superstar, inheriting the crown from the former plasma technology.
OLED stands for Organic Light-Emitting Diode. The organic designation refers to OLED’s hydrocarbon emissive layer, in place of the inorganic semiconductor layer of a conventional LED. OLED displays are a very thin, flexible plastic sheet manufactured with an embedded grid of diode triplets, Red/Green/Blue. When energized, each diode in a triplet produces its own colored light.
There is no backlight required, as there is with LCD technology. With OLED, the pixels themselves are light-emitting. The black color is achieved by turning the pixels off at black grid locations. This produces absolute blackness, rather than the dark gray color of LCD. Hence OLED has stunning contrast compared to LCD, and even >10X better than plasma.
OLED diodes are brighter than LED diodes, due to shining through a thin transmissive plastic rather than through light-absorbing crystalline glass. The thin plastic substrate is much more flexible and lightweight than an LED display, and OLED diodes are much more power-efficient than LED diodes. OLED viewing angles are much wider than LCD, providing up to 170° field of view without image degradation.
OLED does have some problem areas that may affect one’s purchase decision. The screen is VERY thin. Care in handling is important to your investment. The blue pixels were originally weaker and predicted to wear out soonest. This may have been lessened in current products.
If you watch screens with fixed content such as borders and window boxes and other fixed ‘decor’, OLED will not be your best choice. Such usage will cause OLED pixel retention (burn-in), in its worst form, a permanent shadow of such static imagery. Also, color uniformity in solid dark and light gray areas will suffer from some faint banding artifacts. In normal use, this will not be noticeable. Finally, keep water away from your OLED display.
Keep the AV Receiver When Upgrading to Smart TV?
An HD AV Receiver likely will not be able to switch UHD video formats, so an expensive upgrade may be necessary to support external video switching. But most new TVs are smart – they have multiple HDMI inputs and a way to switch among them. So unless you need the multi-channel or zoned audio capabilities, sell the AV receiver, and connect all 4K input devices directly to the TV. On the LG OLED65B7A, there are four HDMI inputs and an optical digital audio 5.1 output to connect to an external home theater speaker/amplifier system, or in our case a Logitech powered 2.1 speaker system that adds some audio punch when needed.
Audio tracks from internet apps on the TV, or from external 4K input sources connected to the TV, will play through the TV’s four internal speakers (a virtual center channel), and will optionally be routed to the external 2.1 system for playing simultaneously through the home theater speakers as well. We find we can enjoy the TV experience without true surround sound, but with a virtual 3.1 system.
Where Can I Find native UHD-Premium Content?
- Blu-Ray Ultra disks (Samsung, Xbox players)
- Netflix, Amazon, Vudu, YouTube (streaming apps via Internet)
- Video Games (Xbox, Playstation PS4 Pro)
- Cable with 4K Settop Box and UHD Premium content (2020?)
- Over-Air Channels via antenna or cable (2022?)
- Satellite direct broadcast
More technical details?
Further familiarity with technology may help one with decisions. The following types of questions may find answers in the nitty-gritty technical discussion that follows.
- What is the future of broadcast video in the era of streaming video growth?
- Is an AVR necessary?
- What to look for in a new AVR?
- Where in the signal path should certain video signal conversions take place?
- What kind of component interconnect cables should be used in the video signal chain?
- How does where one sits and what one watches determine the type of TV to buy?
- How much equipment future-proofing is reasonable/cost effective, considering how broadcast video standards lag far behind digital video display technology?
- How to feed the YouTube machine (what codecs and containers should one use, and what parameters should be specified to provide best quality)?
Not too long ago in North America, there was only one broadcast video signal type, NTSC SD (480i). It was for decades the broadcast TV standard in the US, as well as the signal natively reproduced by CRT TV sets for US use. It is still the signal type natively recorded on DVD-Video.
After the introduction of wide-screen TVs with progressive scan technologies, other non-broadcast signal types were created. The first step beyond broadcast NTSC is 480p, called extended definition (ED) and supported by the earliest flat panel displays.
Then came high definition HD, which is either 720p or 1080i, followed by true or full HD, 1080/30p. No one broadcasts in this format yet, but Blu-Ray disc players and game consoles output a version of this format. At these high rates, compression is required to tame the signal to fit the prescribed broadcast bandwidth, so the extra resolution may be compromised by lossy compression.
A new digital broadcast standard, ATSC, replaced NTSC in the USA in 2009. It rolls up all the prior SD, ED, and HD standards into one broadcast standard. ATSC HD broadcasts are usually either 720p or 1080i. Various broadcasters choose one or the other format to best represent their typical programming content.
ESPN and ABC, who broadcast a lot of fast moving sports content in HD, broadcast in 720p, because de-interlacing 1080i will cause blur and double image artifacts in action frames. Fox also chooses 720p, as does Disney, because they do not broadcast much film-based content. Movies and a lot of television drama are actually shot at 24 fps, the standard for film. Stations providing primarily film-type HD content choose to broadcast at 1080i because they can use the telecine process (see below) for converting film scan rates to video scan format (30 fps). About 75% of broadcasters, including PBS, NBC and CBS, choose 1080i.
ATSC standard includes 1080/24p, 720/24p, and 1080/30p scan formats. So why do broadcasters only use 720p and 1080i? The highest rate in the ATSC standard, 1080/30p, is not processable by the majority of TV sets in the field, so would need to be tamed, and hence compromised, in the reception equipment. Also, bandwidth considerations would again force some type of compression on a true HD signal. Both these situations serve to defeat the benefit of true HD broadcast.
Aside: Europe, which has been slower to adopt standards and hence gets to choose more modern broadcast standards, has adopted the newer MPEG-4 standard instead of ATSC’s original MPEG-2. The improved compression available, combined with the lower PAL scan rate of 50 fps, may make true HD broadcast closer to a reality there. ATSC subsequently extended its encoding standards to include MPEG-4 Part 10 (aka AVC aka H.264), to utilize the best video compression algorithms available. Europe’s equivalent of ATSC is DVB-T, also used in Africa and Australia. China uses a variant, DTMB, and Japan has ISDB-T, which is also becoming the primary standard for Central/South America.
Scaling, De-interlacing, Telecine
HD flat panel TV displays have a fixed internal (native) resolution; our 50″ plasma display has 1366×768 resolution (16:9). Such displays usually accept all broadcast signal types and recorded video formats that are in use when they are first designed. As a consequence, the display itself will convert each type of input signal to its internal progressive format and native resolution. If the input signal is progressive, then a one-step conversion by a scaler is required to resize the signal’s image resolution to the display resolution. If an interlaced signal is input, then a two-step conversion is needed: a de-interlacer followed by a scaler. Although the HDTV must de-interlace 1080i signals for display, a lossy process, this interlaced scan format has much more static resolution than 720p. Also, 720p input has slightly fewer horizontal lines per frame than the typical maximum native vertical resolution of HD displays, so the extra lines must be synthesized. A 1080i de-interlaced input has more information than necessary, so some can be merged to scale down to the internal display frame size.
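The two classic de-interlacing strategies, weave and bob, are easy to show in toy form (frames here are just lists of scan lines; real de-interlacers blend both approaches adaptively per pixel):

```python
def weave(odd_field, even_field):
    """Interleave the two fields back into one full frame. Perfect for
    static content; moving content shows 'combing' artifacts because
    the fields were captured 1/60 s apart."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

def bob(field):
    """Double each field line to fill the frame. No combing on motion,
    at the cost of half the vertical detail."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

odd, even = ["line1", "line3"], ["line2", "line4"]
print(weave(odd, even))  # ['line1', 'line2', 'line3', 'line4']
print(bob(odd))          # ['line1', 'line1', 'line3', 'line3']
```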
ATSC’s /24p scan rates apply only to film-like sources, so are not generally suitable for all broadcast content. Only if the broadcasters could perform hot switching between broadcast scan formats based on content would these film formats be directly broadcast. But any hot switching would cause video glitches that the consumer would find irritating. So broadcasters use a technique called 2:3 pulldown (telecine) to convert /24p signals to 1080i. Receiving components can either simply play the video-converted film material at 1080i, or reverse the pulldown in the broadcast signal, restoring to /24p for a better viewing experience.
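The 2:3 field cadence itself is simple: alternate film frames contribute two then three fields, so four film frames yield ten fields, and 24 frames/sec maps exactly onto 60 fields/sec. A toy sketch:

```python
def pulldown_2_3(film_frames):
    """Map 24 fps film frames onto 60 interlaced fields per second:
    alternate frames are held for 2 then 3 fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
assert len(pulldown_2_3(list(range(24)))) == 60  # one second of film -> 60 fields
```

Reversing this pattern (inverse telecine) is how capable receivers restore the original /24p frames.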
Since video conversion (scaling and de-interlacing) is unavoidable and is never lossless, the video signal path should only convert once, if possible. The HDTV usually must do scaling conversion, because it is the exceptional display whose internal resolution matches exactly one of the standard video input resolutions. So scaling is arguably best left to the TV, and ideally, the source should be set to output its native resolution, avoiding scaling at the source (but see the exception regarding audio bandwidth in the HDMI discussion in a subsequent blog page). In any case, the AVR may be best set to video pass-through, to avoid intermediate conversions.
None of the above discussion can support any more than a ‘hand-waving’ argument that one HD format is superior to another. When the final result is judged on the screen, actual resolution from a 720p or 1080i signal will typically fall between 550-750 perceived lines of vertical resolution, depending on the algorithms used for the scaler and de-interlacer, further influenced by the hardware quality (ability to control digital jitter, etc). Also, the higher resolutions often come at the expense of more potential visible artifacts. Most people think the actual result from a quality HDTV is more than adequate. As always, brilliance on the test bench won’t generate profits if it does not translate to a noticeable difference in perceived quality to the average consumer.
There has been a choice of HD displays, 720p or 1080p. Since there will likely be no 1080p broadcast video in the lifetime of displays offering only HD display, and further, all DVD-Video content and all non-HD broadcast content is scaled SD quality, it becomes easy to save $$ and pick the lower resolution. The only exceptions might be those planning to watch a lot of Blu-Ray video content, or hard-core gamers who sit 5 feet away from 65″ displays. Since our viewing distance is 14′ from our 50″ display, and we play no Blu-Ray content yet, our lesser HD is all we need.
Most all HD displays have been LCD technology. An early and arguably better choice was plasma, but with advancing LCD technology, like dynamic LCD backlighting and micro-dots, and a new OLED technology that subsumes all plasma’s advantages over LCD, plasma technology has been abandoned. There will be flame wars about which display tech is best, but just two things are certain. It’s always a purely personal preference decision, and tomorrow’s answer will be different.
Consumer Video Production (aka YouTube)
The user herself can be a video producer, whipping out her iPhone to get a video of Spot pulling clothes off the clothesline. This is a brave new world because now, it is not the equipment manufacturers who are making decisions for you, but it is you, Betty Jane, who must decide how to make your video look its best when it plays at a YouTube theater near you.
It’s good news though. YouTube always re-encodes its uploads and ensures a good result from a wide variety of commercial video formats. Their advice is to create your video with the maximum quality settings available on your camera, 1080p if available. (They are trying to pry us from our accustomed 720p world that has been as good as needed for over a decade.) Further, one uploads at the same frame rate as the video source, i.e. 24, 30, or 60 fps.
YouTube further classifies uploads as either ‘standard’ or ‘high quality’. Assuming we are all now high in quality, YouTube would like 1080p video bit rates of 50 Mbps, accompanied by 384 kbps stereo audio with 48 kHz sampling. Since my video devices output H.264 encoded video within MOV containers, I upload the same, although YouTube seems to favor basic MP4 containers. They also recommend de-interlacing any interlaced clips in the upload.
There are detailed recommendations from YouTube experts regarding H.264 encoding parameters, involving key frame frequency, use of B-frames, etc. These encoding details are beyond this discussion; such esoterica are the domain of the professionals.
The world is slowly moving on, to mainstream (affordable) higher resolution and higher contrast technology. The future standard resolution will be 4K or Ultra HD (UHD), 3840×2160 pixels with the same 16:9 form factor as the previous generation HD resolutions. 4K screens of 55-85″ are becoming the affordable standard for TVs, and display panels in general.
While one will find little justification for the added resolution by itself, the other improvements will surely justify replacing old HD technologies with newer, more vibrant and accurate color displays. In this replacement cycle, the bump to 4K resolution is ubiquitous and hence comes, relatively speaking, for free (except for a huge bump in bandwidth requirement, to accommodate the raw 4K UHD temporal resolution of ~498 Mpixels/sec, nine times the ~55 Mpixels/sec of the original HD specification).
New for 2017, TV broadcasters are beginning to test ATSC 3.0 for UHD broadcasting. Maturing OLED display technology and High Dynamic Range (HDR) implementations will become generally affordable, enabled by HDMI 2.0a as the standard AV hardware interconnect. Several advantages are inherent in OLED over LCD:
- very high refresh rate
- wide color gamut
- high contrast, due to the blackest blacks
- wide viewing angle
- light weight
- environmentally safe
- low power usage
- more durable
- simpler, in the long run, potentially less costly manufacture
HDR technology is applicable to both LCD and OLED technologies, but will likely always produce superior results with OLED screens. HDR provides both higher contrast, and greater color depth. It requires participation by content providers and video transmission bandwidth providers to achieve greatest benefit.
HDR will initially come in two flavors, proprietary Dolby Vision, and the open standard HDR10 base layer. It will also come with two spec levels, one for LCD technology with its higher luminance potential, another for OLED technology with its lower black level potential. Neither technology can compete on the other’s high ground, thus the double standard. Of critical viewers who can tell the difference, those whose taste tends to brighty-bright blingy-bling will certainly prefer LCD HDR; others preferring the greatest accuracy and detail, and the moods invoked by the deepest, velvety levels of black, will surely opt for OLED. Or not. It’s subjective preference, so difficult to capture in the abstract.
When done right, as in the LG 2016 flagship models, OLED screen technology already provides the finest display experience ever. Yet only a couple of vendors still chase this technology, because early manufacturing yields were poor on the larger panels, pushing early costs very high, and because early-on, there was lower life expectancy of the blue-producing emissive material, raising questions of overall technology viability. The Panasonic/Sony OLED consortium died in 2013 and Samsung bailed on OLED in 2015.
Only LG continues to bet the farm on big OLED, so it will be the producer of all large OLED displays in the near future. Other vendors would have to have very understanding investors to jump into such a scenario, with only one viable supplier who is its own biggest customer. Sony and Panasonic again are offering big screen OLED, which they source from LG. Small OLED screens for desktop monitors and mobile devices will likely be the more viable market for other dabblers in OLED technology, until manufacturing and technology breakthroughs arrive for large panel production.
In the near term, OLED is facing some competition as the new Quantum Dot (nano-crystal) technology binds vendors to LCD technology for a while longer. Quantum dots are implemented as a drop-in substitute panel within the standard LCD display manufacturing process, so there is no expense for a new production line.
When illuminated by the blue LED backlight, the nano-crystals emit colored light, the color dependent on the crystal size. The advantages of quantum dot technology are greater brightness, better color saturation and accuracy, and less energy usage. The downside is color bleeding, and there is a scramble to fix this problem.
OLED will still be king, because of its incredible thinness (that helps make the large screen disappear into the decor), native contrast, color punch, and off-axis viewing quality. While absolute brightness, together with early motion blur, color accuracy, and longevity issues, have been weak spots, OLED, when implemented well, largely overcomes these difficulties, offering the best picture quality available. Nano-crystals will not likely close the gap, and LCD technology has its own weaknesses, particularly in its typical ‘zoned’ HDR implementations and narrower field of view. Curved panels, a common choice by 2016, have since disappeared, as have 3D screens.
Those of us hanging onto our Panasonic plasma screens now have somewhere to turn if those plasma devices ever decide to stop working. OLED is noticeably superior to plasma in image quality, even at 1080p HD without HDR. And if 4K OLED and HDR get perfected one more notch, together with a price decline of ~50%, 2017 could be LG’s great year, finding many of us relegating our still-working plasma screens to secondary viewing areas, to make room for the new King.
2018 Update: The King now has a home in our viewing room, a 2017 LG OLED65B7A bought at 25% Memorial Day discount over current sale prices (only $300 more than the 50″ Panny cost 10 years ago). Wow. Having recently and by choice down-scaled our living standards (as many seniors become comfortable doing), our latest abode has no room for a secondary viewing area with a 50″ screen. The Panny is in the garage, awaiting a new home.