At least twice over the past decade Apple has been close to announcing its own television. Not the Apple TV set top box but actual big-screen TVs with, well, big screens. But both times I’ve heard about this, Apple backed away at the last minute, and the reason was that even an Apple television would be just another television with an Apple logo. Steve Jobs realized that TVs had become a commodity and there didn’t seem to be an obvious way to make Apple’s television special. I’m not here to say Apple has finally found its TV design path as suggested in Walter Isaacson’s book and will be doing a big-screen TV after all. In fact I’m pretty sure Apple will never sell its own TVs. But I think Cupertino has finally figured out a way to grab an important and profitable part of nearly all TVs, controlling the future of video entertainment in the process.
Cupertino has found success over the years by proposing new technical standards and by taking the lead in product categories that were already considered mature. FireWire (IEEE 1394) started as an Apple initiative, and Apple’s AirPort helped push 802.11 WiFi into the mainstream. Apple’s iPod was not the first MP3 player, yet it redefined the category much as the iPhone redefined smartphones and the iPad redefined tablet computers. If Apple is to redefine television in a similar manner then Cupertino will have to do more than slap a more user-friendly interface on your next TV. They’ll have to combine that interface with some really compelling new technology that isn’t available anywhere else.
The technical area I think Apple wants to dominate is High Dynamic Range (HDR) video — technology for making displays of all types brighter and capable of showing more colors. Right now there are two major video advances entering the market — 4K and HDR — and tests show HDR is a much more obvious improvement to viewers than 4K.
Apple has long taken pride in their screens, whether they are Retina displays on phones, tablets or notebooks or 5K iMac displays. As the dominant hardware vendor for video editors Apple has to stay on the leading edge of display technology. And what is a TV but a bigger display?
When color TV was developed in the USA in the 1950s there were two competing technical standards, one from RCA and another from CBS. CBS’s color was better, but RCA’s had the advantage that it could also support legacy monochrome content (the CBS scan rate was reduced to 20 frames per second, with each frame having 343, rather than 525, lines). Ultimately, as these things go, both standards were merged into the NTSC color televisions we first saw in the 1960s. But “color” in those early TVs was a subjective thing because there were very real limitations to the color phosphors used in 1960s Cathode Ray Tube (CRT) displays. The so-called color model used in those TVs wasn’t defined by what people could see but by what those early phosphors were capable of showing. You can’t even buy a CRT television today, yet that original color model remains in effect. Or it did until HDR.
There was no black in that original color TV video model, for example. Think about it: a CRT that defined color as something produced by electrons exciting a phosphor layer was entirely additive, but adding colors in a red-green-blue model adds up to white, not black. So black was the absence of electrons, something CRTs did in a half-assed way at best. Even LCD displays are dependent on a backlight for their brightness, so they don’t do such a good job with black, either. What’s needed to show a true black is a screen technology in which the pixels, not the backlight behind them, produce their own light. Organic Light Emitting Diode (OLED) technology does that.
So the key to HDR is an OLED display. And OLED displays can not only show images that are both brighter and dimmer than other display types, they can show colors that are beyond both the traditional TV color space and even the feature film color space that was defined by the colors capable of being chemically rendered by KODAK and Fuji film.
To drive that need — to get us to see how obviously we need an OLED TV, for example — there has to be content that takes obvious advantage of the new standard. That’s why RCA subsidized the production of Bonanza: to get people to buy color TVs. People won’t be buying OLED TVs in large numbers until HDR is broadly adopted in TV production. But once it is adopted, we’ll all need new TVs that the TV salesmen will gleefully sell to us — 4K HDR OLED TVs.
Apple could be preparing its own 4K HDR OLED TV for sale but I don’t think so. A better move for Apple would be to take command of the whole HDR video wave by acquiring a dominant position in HDR intellectual property. That way every new TV from any manufacturer would have to license IP from Apple in order to be attractive to buyers.
This would give Apple both a revenue stream from HDR royalties and a weapon (the possible denial of HDR licenses) to compel certain behaviors from other companies.
The IP licensing business has extremely high profit margins. So even though Apple would be deriving revenue from the lower-margin TV hardware sector, for Apple the margins would be a software-like 80-90 percent. That’s a much better way for Apple to be in the TV hardware business than actually selling TVs.
By controlling HDR Apple could come to control the entire video ecosystem. And the best way for Apple to control HDR would be by acquiring Dolby Labs.
Here’s the current state of HDR in the video production market. FilmLight, Adobe and DaVinci have software with the HDR color space built in. Other vendors have plug-ins that advertise HDR, but it’s not actually true SMPTE 2084 (the PQ color space, or true HDR). You cannot get a true HDR color grade without a true HDR reference monitor, and those displays won’t work with the HDR-ish plug-ins.
PQ (Perceptual Quantizer) space is a perceptual space based on the human visual system. It is a logarithmic-like curve that replaces gamma in image encoding. The human eye has always been capable of visual fidelity far beyond our current video and image standards. Dolby has been trying to figure out how to deliver this for seven years.
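For readers who want to see how PQ differs from gamma, the ST 2084 transfer functions are published and small enough to sketch in a few lines of Python (an illustrative sketch, not production color-management code):

```python
# Sketch of the SMPTE ST 2084 (PQ) transfer functions that replace
# gamma for HDR. Constants are taken from the published standard.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ tops out at 10,000 nits

def pq_encode(nits):
    """Absolute luminance (nits) -> PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / PEAK
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

def pq_decode(signal):
    """PQ signal value in [0, 1] -> absolute luminance in nits."""
    vp = signal ** (1 / m2)
    return PEAK * (max(vp - c1, 0.0) / (c2 - c3 * vp)) ** (1 / m1)
```

Try `pq_encode(100)` and you see why the curve is called perceptual: today’s 100-nit SDR reference level already lands at roughly half the signal range, leaving the other half to carry everything from 100 to 10,000 nits.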
Dolby Vision (Dolby’s HDR product) isn’t a video grading system but an end-to-end video distribution solution. Dolby’s challenge has been to come up with a true HDR master and deliver it as HDR while at the same time delivering a base layer that would play on a standard TV as well. This is precisely analogous to RCA’s ability in the 1950s to preserve B&W signal transmission as part of its color TV solution. Dolby Vision supports both HDR and SDR (standard dynamic range).
Our current standard assumes TVs with 100 nits of brightness. HDR can theoretically go up to 10,000 nits. Right now Dolby has reference broadcast monitors at 2,000 and 4,000 nits. Consumer HDR televisions will likely range from 700 to 1,500 nits. This is where it gets complicated, since a 700-nit display, a 4,000-nit display, and everything in between are all considered HDR.
Dolby Vision combines metadata and multiple layers of information to deliver high-dynamic-range images that work across all the TVs and systems out there. It uses very sophisticated predictive technology to scale image fidelity on the fly. A Dolby Vision-licensed TV set is identified, and the Dolby Vision video stream uses metadata created at the source and embedded in the program to deliver and scale the appropriate brightness to each set. To a standard TV set, Dolby Vision simply delivers the SDR base layer.
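Dolby’s actual mapping algorithms are proprietary, but the basic idea (use knowledge of the source grade to compress highlights so a bright master fits a dimmer set) can be sketched with a toy curve. Everything below is illustrative; it is not Dolby Vision:

```python
def tone_map(nits, scene_peak, display_peak):
    """Toy display mapping: compress a scene graded up to scene_peak
    nits so its highlights fit a display that tops out at display_peak.
    A simple Reinhard-style curve, normalized so that scene_peak lands
    exactly on display_peak. Not Dolby's real (proprietary) algorithm."""
    if scene_peak <= display_peak:
        return min(nits, display_peak)  # content already fits the set
    x = nits / scene_peak
    k = scene_peak / display_peak
    # x / (1 + (k - 1) * x) maps 0 -> 0 and 1 -> 1/k,
    # so scene_peak maps to exactly display_peak.
    return scene_peak * x / (1 + (k - 1) * x)

# A 4,000-nit graded master shown on a 700-nit consumer set:
highlight = tone_map(4000, scene_peak=4000, display_peak=700)  # ~700 nits
midtone = tone_map(100, scene_peak=4000, display_peak=700)     # gently compressed
```

The point of the metadata in the real system is exactly this: the source tells the set how the content was graded, so the set’s mapping can preserve artistic intent rather than guess.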
Dolby isn’t the only player in this field. SONY is working on a similar system, but SONY is too big and troubled for Apple to buy.
Steve Jobs liked to bring technologies to market in ways that enhanced the user experience without being intrusive. As such, Apple’s strength is also its dilemma, since Steve’s vision ultimately results in the technology becoming invisible.
Dolby is already invisible. To most consumers Dolby is just a logo. A purchase like Dolby would cement Apple’s presence both in consumer electronics and the professional space all in one move. They could own a piece of every theater both at home and at the cinema, worldwide.
Apple would be a de facto standard in both audio and video on devices and in content creation and distribution, even for performance venues. What added value would having Dolby bring to Beats, for example? Plenty.
At around $6 billion Dolby would be cheap for Apple to buy. The quality of Dolby engineering is such that Apple would be thrilled to have the people, too, not just their technology. And as a part of Apple the resources Dolby engineers could command would grow exponentially.
The entertainment complex is already so heavily invested in Dolby standards that owning them would give Apple added muscle in contract negotiations with video distributors. Apple could use the denial of an HDR license to compel certain behaviors in friends and enemies. Apple, you see, is one of many companies trying to move cable television onto the Internet. But like the other companies trying to build such Internet virtual cable systems, Apple has had a hard time acquiring content rights. Major broadcast networks don’t want Apple to undermine their over-the-air or over-the-cable revenue streams. Cable networks and movie studios are wary, too. And so Apple struggles. Intel tried to do the same thing and gave up entirely, selling its project to Verizon.
But if Apple can find a way to become the HDR gatekeeper, they can effectively deny big broadcast networks access to HDR. They could deny access to cable systems, cable channels, movie studios — anyone with content to license.
Dolby could be for Apple the one ring to rule them all.
Very interesting. I wonder if Apple can use its supply-chain might to help LG, currently the only manufacturer of large OLED panels.
You think this is interesting? Cringely comes up with a totally fictional scenario, then says apple will do this to take over a market.
–
Tell me — what has apple done since Steve passed away? Nothing, that’s what. They’ve made slight tweaks to iphone and done shabby marketing events, which in the last year have fallen flat. And now iphone sales are falling off.
—
Bye bye, apple. In 10 years you will be another IBM or Microsoft, just trying to suck the final juices out of your end of life products. Steve was the only reason they didn’t flame out 20 years ago. Without him, innovation is dead.
And how about that. You used the same tactic to “prove” Apple will be gone in 10 years.
O, the crushing weight of Irony
“And people won’t be buying OLED TVs in large numbers until HDR is broadly adopted in TV production, too.”
Maybe not in large numbers, but my wife got a 43-inch 4K TV last week, and she loves it. It’s in use as her main computer monitor, and has not yet seen (and probably won’t ever see), a TV signal. I may get one for my Linux box. They’re cheaper than regular 30-inch 4K monitors. Go figure.
The issue isn’t so much 4K as HDR, which includes 4K, along with the wider dynamic range, and color gamut.
“selling it’s project to Verizon” ->
“selling its project to Verizon”
Fixed, thanks.
Um, I believe it was CBS (rather than Zenith) whose color wheel technology competed with RCA. (https://en.wikipedia.org/wiki/Field-sequential_color_system)
Right, and the CBS color system was a mechanical Rube Goldberg machine, harking back to the days of pre-electronic television, relying upon a spinning disc of colored gels that had a strong tendency to disintegrate from centrifugal force and a negligible tendency to stay in sync with the frames.
The first version of the RCA color system (using the first-generation color CRT, the 15GP22) had theoretically ideal color, but used a red phosphor that was too expensive. RCA degraded the system by moving the red phosphor toward the orange, thereby greatly shrinking the triangle of colors that could be rendered. This was done in order to reduce the price point of the second (21AXP22) and subsequent generations of color CRTs. But the NTSC “standard” still specified a peak wavelength for the red phosphor that only the 15GP22 ever implemented.
Thanks for pointing this out. I’ve corrected the reference and added a few more words explaining the differences between the two systems.
I don’t buy it.
.
Television manufacturers will find a cheaper solution to expand dynamic range which won’t include Dolby or Apple. This situation is very analogous to cameras taking still photos and saving in RAW instead of JPEG. Photographers take their RAW photos into Adobe Lightroom and choose the dynamic range they want (crush the blacks/shadows, saturate the colors, etc.).
.
Right now on video cameras there are also Log profiles that essentially do the same thing, recording the video in an overexposed way so that in post-production it can be tweaked for the dynamic range the engineer wants to bring out, lighter shadows, brighter colors, fleshier skintones.
.
The key to expanding dynamic range is actually RECORDING the content in this overexposed manner, not in the post-production adjustment which can be done frankly by rather dumb algorithms without a human involved (of course the best result would involve a human).
.
So a television receiving something recorded with something like an overexposed Log profile could just adjust everything down in brightness and the playback would look fine, but not wonderful. That’s essentially free to a manufacturer. Some fiddling in software could also make things look a little better too. I just don’t think you could get tons of content with metadata and sell the algorithm to decode it somewhat better than what it would decode with a simple algorithm ignoring most of the metadata.
So you are calling Ray Dolby a fool for spending hundreds of millions of dollars on Dolby Vision? This time I don’t buy it. Yes, a highly skilled professional can do lots of things to make video look better. But it’s a huge leap from that to creating systems that just plain work on a billion TVs. The devil is in the details.
John,
I don’t know if you have seen this work in person, but it’s nothing like you describe in just adjusting current camera log files on a TV set. Log files, if shot to protect highlights and recorded in no less than 10 bits (12 bits preferred), can be graded in the HDR color space to achieve this, but make no mistake about it: HDR is a whole different dynamic range and color gamut. Some cameras have been shooting HDR-capable video for some time (none in 8 bit), and if the RAW video files were preserved, that same footage could be re-graded to meet the standard, but only using HDR tools to do it. You have to see it.
A TV now cannot be tweaked to look like it has technology inside that it does not have.
Fudging around with brightness and contrast settings doesn’t come close. Apple monitors were not even able to display 10 bits of color until the 5K iMac was released.
You can only push (REC 709) color space so far before it simply falls apart as you already know.
I don’t doubt other players will try to skirt the Dolby technology license to manage the HDR workflow, but they will still have to conform to the HDR color space to look HDR.
As far as the role of metadata in content creation and transmission goes, that’s a whole other workshop.
If you saw what I’ve seen you would go — “Oh, I get it now.” 🙂
As usual you have terrific ideas, really great writing, and access to good inside info. But you need an editor or a proofreader. It seems that every second article you’ve written since you left that trade rag 20 years ago (yes, I’ve been reading you that long) has typos, grammar mistakes, and simple errors of fact that undermine what you write. If you don’t want to hire a professional editor, why don’t you at least pre-submit your articles to a few loyal readers (I’d volunteer) for immediate proofreading and fact-check before you publish?
I do my best, but you are right that there can be typos and occasionally even errors of fact when I rely on my fading memory. Still, I’ve been where you suggest I go and I won’t be going there again. Copy editors are great with grammar, but then you have to fight with them when they’ve cut out all the jokes. Copy editors also don’t know anything about these subjects — that takes subject matter experts. So you’d have me write a column, send it to a copy editor I’d pay to remove my jokes, and then to a subject matter expert I’d again pay (or you’d do it for free). Where does all this paying come from, whose pocket? And how long do you and I wait for these experts to do their work? And when they’ve finished, what if I don’t like what they have done? At pbs.org I had editors (lawyers, too) and columns took days to finally appear on the site. In the case of this particular 1741-word post, I started at noon and finished at three. It’s worth what you paid for it.
Ignore the pedants and keep up the great work. I’ll take thoughtful over grammatically perfect every time. As for minor factual errors—correct them when you must but, in general, they don’t diminish the theme.
Jeff Berg – +1, need a like button here.
Agreed completely. Keep up the good work, Bob.
Super Like this comment (pedant)! Great article Robert!
Brilliant!
I think Bob is right about the importance of HDR technology and Apple’s involvement.
But I also think the possible policies will be more like “stick and apple” (!) than plain gatekeeper.
Strict gatekeeping can harm the market and force the spawning of new technologies and/or technical workarounds.
And don’t forget about Google whose road maps will (and do) unavoidably cross Apple’s.
Several issues here.
OLED is not the key to HDR. While OLED can theoretically produce infinite black (the absence of any light), it cannot currently get as bright as an LED-backlit LCD display. The UHD Alliance certification program for awarding the “Premium UHD” logo, indicating the display is HDR capable, had to carve out an exemption for OLED, which only has to hit 540 nits instead of the 1,000 or better for LCD.
And don’t confuse High Dynamic Range (HDR) with Wide Color Gamut (WCG). While HDR sets will have an increased WCG it doesn’t have to be created by OLED. LED/LCD sets using Quantum Dot (QD) technology currently have a better WCG than does OLED.
Display aside, Apple’s acquisition of Dolby would not give them an IP weapon. While Dolby Vision is a very good HDR system, it is not the only game in town. The must-have standard is HDR-10, which all sets will support. Much like the new Ultra HD Blu-ray players, all will have the HDR-10 system; they may or may not support Dolby Vision – it’s optional.
In several years the ATSC 3.0 standard for broadcast TV will be in effect and 4K UHD and HDR will be supported but broadcasters want nothing to do with Dolby Vision or any system that uses metadata.
Buying Dolby might be a good move for Apple, since the vast majority of Dolby’s earnings come from royalties and Dolby Vision is indeed a very good HDR system. But it won’t give Apple a stranglehold on the 4K/UHD/HDR market.
Bravo, Johnny Blue! Now let’s find another reader to fill in the rest of the blanks. I still think Dolby’s a buy, but you may well be correct about the other stuff (I’m only, after all, a clown). So what’s the scenario that DOES work for Apple? And who else might be closing in on it? Thanks!
I think the word you’re looking for is catalyst. Instigator might work too.. but that could be mis-interpreted. ; )
HDR does include WCG, however; it’s a component of it. As far as the display needed, I believe the Dolby reference models are LED. The projectors used in Dolby theaters use laser technology.
Try rebooting your OLED phone in a pitch-black room. You already know what to expect, but the actual experience of seeing the logo just hanging in mid-air is quite cool… you can’t even see your fingers holding the screen up, much less an outline of the screen. Contrast no current LCD TV can match. I’m not sure how much I’d appreciate HDR. I’ll have to wait and see. What about Apple jumping into VR.. late?
“What about Apple jumping into VR.. late?”
My bet is that they’ll wait and see if anyone ever actually buys a VR device before they decide on that. In other words, it’s doubtful.
I’m certain Apple has been exploring VR and AR for years. The “jump” is merely an escalation.
I believe that the NTSC color system was approved by the FCC in late 1953 and that RCA (NBC) began colour broadcasts in 1954. In Fort Worth, Texas one of the local stations, WBAP (now KXAS), began local colour in 1954 as well. I personally have seen RCA camera serial number 007 at that station. Amon Carter, the owner of the station, and David Sarnoff, chairman of the board of RCA, were friends.
I could never understand why people say TVs are commodities and Apple doesn’t want to enter the market.
Aren’t smartphones, those $149 Android phones from China, a commodity?
Isn’t the PC also a commodity?
Apple has no problem in those two segments.
People will point to the upgrade cycle, saying TVs are slow to be replaced.
Isn’t that the same as the AirPort, Mac Pro, or Mac Mini? All see few updates and long selling cycles.
And as someone has pointed out, HDR had to lower its nits rating to let some OLEDs in, so who said HDR is an OLED thing?
There are plenty of alternatives out there, so if Apple decides to use this IP as a weapon, I am pretty sure Hollywood and the TV makers will get another standard up and abandon everything that has a Dolby logo on it.
Buy .. hands, it’s not that tough to catch a falling knife … looky @ 30 to jump in! I-95 it to the boat show to check out the whales’ new toys. And to see what’s spilling out of their mouths!!!!
Stay thirsty my friends
802.11 came out of AT&T/Lucent, not Apple. The AirPort was just the first (relatively) cheap access point.
I know the Dolby of today is not the same as the Dolby of yesteryear but every time I hear that name it dredges up horrible memories of their awful cassette noise reduction system. Hiss was better than the muffled mess that it always seemed to produce. Sorry, I’ve kept that in for forty years.
“Brother, I set her down on the other side of the river, why are you still carrying her?”
https://www.kindspring.org/story/view.php?sid=63753
Dolby equalized for tape hiss; before that, the RIAA curve equalized for phonograph needles scraping the groove (the last good thing done by the RIAA). In principle, pre-emphasizing highs and then de-emphasizing them during playback should have no effect on the desired content. Perhaps you just enjoy hearing the highs at a louder level than intended, or other parts of your signal chain were reducing the highs.
Maybe he activated Dolby B on cassettes that weren’t recorded for Dolby B. That would result in a dull sound.
I never understood the appeal of Dolby NR (this is at the consumer level only, by the way – I had no experience with implementing it in a professional environment at the time), as all it seemed to do on the machines I had was lop the treble off at the knees. Then, when I finally bought my first decent stereo system in the ’90s and got a three-head Dolby B/C/S cassette deck, it all made sense. With good-quality metal tapes and the good deck, the noise reduction system finally had a chance to show me that it really did work.
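The pre-emphasis/de-emphasis principle under discussion can be sketched with a toy first-order filter. This is a simplification: real Dolby NR was a level-dependent compander, not a fixed curve like RIAA, but the round-trip idea is the same — boost the highs before the noisy medium, cut them back on playback:

```python
def pre_emphasize(samples, a=0.9):
    """Boost highs before the noisy medium: y[n] = x[n] - a * x[n-1]."""
    out, prev = [], 0.0
    for x in samples:
        out.append(x - a * prev)
        prev = x
    return out

def de_emphasize(samples, a=0.9):
    """Exact inverse on playback: x[n] = y[n] + a * x[n-1]."""
    out, prev = [], 0.0
    for y in samples:
        prev = y + a * prev
        out.append(prev)
    return out

signal = [0.1, 0.5, -0.3, 0.8, 0.0]
restored = de_emphasize(pre_emphasize(signal))
# restored matches signal to floating-point precision; any hiss added
# between the two stages would have its highs cut by de_emphasize.
```

A properly matched pair is transparent to the wanted content, which is why a dull result usually points to a mismatch somewhere in the chain, as the comments above suggest.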
Bob
Keep up the stimulation!
That’s why I am here.
Everyone I know who has seen HDR technology, me included, says it’s the most significant advance in image quality since HD. In terms of visual impact, 4K pales in comparison.
I’m not that eager to have a very bright tv screen in my room. I already need to tone down the backlight to not hurt my eyes.
I prefer OLED screens, since the better contrast is always nice.
HDR, as all TVs have ever been, is no brighter than you want it. The higher NIT rating indicates a greater potential brightness for tiny areas of the picture where brightness is needed for realism, like the sparkles you might see in real life in a white mass of snow.
Re: ” That way every new TV from any manufacturer would have to license IP from Apple in order to be attractive to buyers.” So it’s official…we’re calling Apple a patent troll.
Well, we desperately want patent trolls to place their boots on our necks and twist our arms while they kindly shove whatever they control down our throats. The bigger the trolls, the better. Open standards, a level playing field, fair competition and fair prices are so boring! We want to be abused; we want to spend our hard-earned cash on ‘cool’ products sold with obscene profit margins. Please, we need our Stockholm syndrome!
I think the future of HD display technology may be in tiny laser projector modules instead of large LED displays. These projectors can display a large, high-quality image on any surface and are small enough, with low enough power consumption, to be embedded in hand-held devices. Sony and Sharp build and sell PicoP projector modules, developed by a company called Microvision, and have included them in products that are available now or will be soon. Foxconn, an Apple manufacturer, has made an aggressive bid to acquire Sharp. Qualper, a Chinese company, has already built a cell phone that looks a lot like an iPhone and has an embedded projector. Here are some links:
https://www.sony.net/SonyInfo/News/Press/201402/14-024E/index.html
https://www.microvision.com/microvision-at-ces-2016/
I think the industry is going to have some trouble selling another form of TV technology.
In recent years there have been two versions of HD, plasma, LCD, LED, OLED, Smart TV, 3D, TV apps…. Rightly or wrongly, most consumers had trouble seeing much difference, or didn’t want the technology. I’d guess that attempting to sell yet another is going to get yawns, unless the improvement, as seen in the living room, is very dramatic.
This is very dramatic; if you saw it, you would agree. Cost aside (and cost always comes down), it will be a visual change that is very easy for consumers to see and so different from what we see now that it won’t be a hard choice.
Steve Jobs was Apple. The current company with that name is coasting. Look at Apple’s stock, it is half of what it was last year, even with all the buybacks. Apple computers and laptops are nice looking and their OS is much better than the tediously convoluted Windows 10, but they are ridiculously overpriced and not user upgradable machines. Apple’s gadgets, their phones and tablets are overpriced and their customers can run Facebook, Twitter, and Instagram apps on lower priced competitors’ devices. Apple’s watch is overpriced and useless. Apple wasted money on Beats.
It looks like Apple might go into the entertainment business, making TV shows like Netflix does. Maybe someday Apple will be reduced to an entertainment production company, iTunes, and an online money-transaction company. That giant round building Apple is constructing, which looks a lot like GCHQ’s “The Doughnut,” seemed to be designed for a company that would continue to develop computers and gadgets. Tim Cook does not seem to be an innovator nor much of a pitchman. Jobs paid himself $1 per year; Tim Cook pays himself millions each year. He’s coasting.
Let’s face it, we’re all coasting, compared to the WWII generation.
If Apple buys Dolby, would the rest of the industry rally to support Sony’s HDR implementation thus launching a format war like VHS/Betamax? Would Amazon, Microsoft, Sony, Samsung, Google or Netflix want to team up and keep Apple from dominating the industry? Would Apple benefit from starting a format war? Sidestepping a war of this nature would be prudent.
I don’t think it was my imagination: there was an Apple TV advertisement during the mid-season premiere of The Walking Dead. I can’t recall seeing one ever before. Something must be happening…
Notice the first two sentences of the article “At least twice over the past decade Apple has been close to announcing its own television. Not the Apple TV set top box but actual big screen TVs with, well, big screens.”
Lots of references to “NIT”: it’s a measure taken over from cinema theatre projection to TVs to describe how bright the screen is. LCD manufacturers market this using the term “contrast ratio.” Essentially the same question: how bright is your display? In cinemas the goal is to create an image bright enough for the audience, which gets more difficult as the screen size grows. In LCD TVs it’s a marketing tool.
.
Here’s the problem with TVs. High NIT or contrast ratio produces a picture which is far far too bright. You could get a sun tan from the output levels some screens can go to! It will never be used. The problem with LCDs, of course, is that they’re not starting from zero as you can never fully switch off each pixel. Rather each pixel is a coloured shutter which leaks some light.
.
OLED may not have the same NIT level, but it’s starting from zero. There is a concern about non-linear response from zero to low levels, but it never has to get as bright as LCD screens do to show a good colour range.
.
Like plasma screen technology, OLED is a light-generating technology rather than light-shuttering. It works better, partly because it feels more like the CRT images you were brought up on. You don’t have to be in the sweet spot for a good picture either; you can watch the screen off-angle and still get an excellent image.
.
Last I heard, HDR had been ratified as a standard, so new screens will come out with this soon (get ready to junk your current 4K TV for the next 4K TV!). But the ranges of OLED and LCD aren’t the same. LCD screens (and remember that “LED TVs” were LCDs with an LED backlight, not individual elements – cheat or what!) by their very nature have to work at a higher brightness level than OLED. So how did they get round this with the HDR standard when the two will by their very nature be so different? Simple: two standards! Yes, one standard for OLED and another for LCD screens.
.
And to further bash LCD screens, if I haven’t enough already: when you do put one in your home theatre/cinema, you’ll be turning the brightness down as much as possible anyway, because they’re just far too bright, so you’ll probably lose out on the HDR. If you could still get plasma screens, that would be the choice, but OLED seems like the best alternative. I believe the new Panasonic has overcome many failings of the LG screen (Home Cinema Choice review).
.
What’s in it for Apple? Could they buy Dolby? Only worthwhile if they then incorporate Apple branding, so that when you view HDR Amazon feeds they’re forced to show the Apple logo. Well, if Amazon won’t sell the Apple TV product, it’s a good up-yours to Amazon!
As I understand it, the high NIT level of HDR never makes the overall picture too bright. What it adds is the ability to reproduce the picture more accurately. For example, without HDR, a picture of snow would look like a uniform white mass. With HDR and the same overall brightness, we would see the white mass with numerous sparkles, as it would appear in real life; just the sparkles would be at the high NIT level.
NIT is a measure of brightness; HDR describes the range of brightness and colours. You can have HDR at different levels of NIT, and that’s exactly why the standard has had to be split as it has.
.
If your screen starts at a light level of 0, as OLED does, you can add the full HDR range on top and, say, get to a maximum light level of 10 (arbitrary figures for this example), 10 being ideal for your viewing. Now start with an LCD or LED-backlit screen, and you’re starting at a light level of 3, because you can never fully shutter the light from an LCD panel. Add the HDR range and you get 13, which is unlucky for you as it’s now too bright for your room. So you reduce the contrast so that its maximum light output is 10. What happens to your HDR then? It’s a very good question.
.
You have to set up your TV using a test card (TestCard ProHD on the iPhone is ideal for this) for the ambient conditions you view in. So, as per the above example, that higher level might be too bright. I still think HDR on a Panasonic OLED flat screen (not curved) will be amazing. I just look forward to them doing one!
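The arbitrary 0-to-10 figures above can be restated in real units. What the eye perceives as dynamic range is the ratio of peak to black level, which is why an OLED capped at 540 nits can still out-range a brighter LCD whose backlight leaks (illustrative numbers only, not measurements of any particular set):

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Contrast expressed in photographic stops (doublings of light)."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures, not measurements of any specific set:
lcd = dynamic_range_stops(peak_nits=1000, black_nits=0.05)    # backlight leaks
oled = dynamic_range_stops(peak_nits=540, black_nits=0.0005)  # near-true black
# The dimmer OLED spans several more stops than the brighter LCD.
```

This is why the black floor matters so much more than the peak: dividing by a black level twenty times closer to zero buys far more range than doubling the peak would.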
Maybe Apple is waiting to buy Dolby AFTER their proposed format becomes ratified and is in use.