2025 Was the Beginning of the End of the TV Brightness War (theverge.com)
- Reference: 0180424007
- News link: https://entertainment.slashdot.org/story/25/12/19/1734246/2025-was-the-beginning-of-the-end-of-the-tv-brightness-war
- Source link: https://www.theverge.com/tech/841054/tv-brightness-hdr-2025
RGB mini-LED also emerged as a new category. The technology uses individual small red, green, and blue LED backlights instead of white or blue LEDs paired with quantum dots. Hisense demonstrated it at CES 2025, TCL announced its Q10M for China, and Samsung unveiled its own version, called micro-RGB. These sets range from $12,000 to $30,000. Sony has confirmed it will debut RGB TV technology in spring 2026. HDR content is currently mastered at a maximum of 4,000 nits. The Verge points out that the situation echoes the audio industry's loudness war, which peaked with Metallica's heavily compressed Death Magnetic in 2008.
I believe it (Score:2)
I bought a new TV this year, just a cheap direct-lit LCD model, and the backlight is so eye-searingly bright that I ended up turning it down to 30% and then turning off HDR as well. I don't need the headache.
Besides, I'm not convinced HDR is anything more than a software gimmick unless you have an OLED display.
Re: I believe it (Score:2)
Whether HDR is real or not depends on both brightness and color depth. Many TVs will take 10-bit input but have an 8-bit panel, so they simply aren't capable of displaying subtle gradations. They do fake HDR instead. This can still be an improvement, since they can do better dithering (my LG43UT8000 actually does get noticeably better in its so-called HDR+ mode, which is really fake HDR), but they still don't have more dynamic range.
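To make the banding-versus-dithering point concrete, here is a minimal numpy sketch (an illustration of the idea, not anything a TV actually runs) of a 10-bit gradient being squeezed onto an 8-bit panel, first by plain truncation and then with a small random dither:

```python
import numpy as np

def truncate_to_8bit(frame10: np.ndarray) -> np.ndarray:
    """Naive 10-bit -> 8-bit: drop the two low bits (produces flat bands)."""
    return (frame10 >> 2).astype(np.uint8)

def dither_to_8bit(frame10: np.ndarray) -> np.ndarray:
    """10-bit -> 8-bit with random dither: trades banding for fine noise."""
    noise = np.random.uniform(-2.0, 2.0, size=frame10.shape)  # +/- half an 8-bit step
    dithered = np.clip(frame10 + noise, 0, 1023)
    return np.clip(np.round(dithered / 4.0), 0, 255).astype(np.uint8)

# A subtle 10-bit shadow gradient across one 1920-pixel scanline.
ramp = np.linspace(64, 80, 1920).round().astype(np.uint16)
print(np.unique(truncate_to_8bit(ramp)).size)  # 5 distinct levels -> visible bands
print(np.unique(dither_to_8bit(ramp)).size)    # same range, steps hidden as noise
```

The dithered version gains no dynamic range at all; it just hides the quantization steps as noise, which is exactly the parent's point about fake HDR.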
Re: (Score:2)
There is no "fake HDR".
HDR is about the transfer function.
The standard SDR gamma curve has limited dynamic range.
Your TV will support HLG, Dolby Vision, HDR10, or HDR10+, which all have high-dynamic-range transfer functions.
Gradations are a separate problem.
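For reference, HDR10 and HDR10+ both use the SMPTE ST 2084 "PQ" transfer function, whose constants are published in the spec. A short sketch of the PQ EOTF (nonlinear signal in, nits out), just to show how different it is from a gamma curve:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants, as published in the spec.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Map a nonlinear PQ signal value in [0, 1] to absolute luminance in nits."""
    e = np.power(np.clip(signal, 0.0, 1.0), 1.0 / M2)
    y = np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)
    return 10000.0 * y  # PQ is defined on an absolute 0-10,000 nit scale

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {code:.2f} -> {pq_eotf(code):.1f} nits")
```

Half signal comes out around 92 nits while full signal maps to 10,000 nits; that extreme top-end allocation is what makes the encoding "high dynamic range" regardless of how bright any given panel actually is.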
Re: (Score:2)
I've already discussed this with you as much as I'm going to, since you are confused about every point. But I will address this:
> Your TV will support HLG, Dolby Vision, HDR10, or HDR10+, which all have high-dynamic-range transfer functions.
That's the opposite of what you said last time, so please do fuck off forever.
Re: (Score:2)
> I've already discussed this with you as much as I'm going to, since you are confused about every point. But I will address this:
I'm not remotely confused- you are just wrong, and you are very defensive over being wrong.
> That's the opposite of what you said last time, so please do fuck off forever.
Incorrect. Though through the lens of your illiteracy, I wouldn't be surprised if you thought that.
Re: (Score:2)
That's interesting; I didn't know about the distinction between a 10-bit panel and an 8-bit panel. This explains why I'm seeing annoying artifacts with HDR enabled, like posterization in really bright scenes. Thanks.
8k resolution (Score:2)
8K resolution can enable in-home immersive, IMAX-like experiences. Therefore we need to get to 8K resolution and also eliminate the visible space between pixels (with a diffuser sheet?). After that, work on cost reduction and manufacturing.
Re:8k resolution (Score:4, Insightful)
> eliminate the visible space between pixels
I think you might be sitting too close to your TV.
Apparently we need it (Score:1)
Half the shows I've tried to watch on Apple TV are so dark that I have to follow parts of the story through dialog alone.
Amen! (Score:2)
I've always hated this. TV shows and games that are FAR too dark to see anything at all, even in a dimly lit room.
Poor lighting and the incomprehensible audio of today's shows make closed captions absolutely mandatory. I was starting to think it was just me getting old. But when Apple TV added automatic captions that appear whenever you skip back in a show and last for 30 seconds or so, I realized that this is a widespread problem with the shows, not with my sight or hearing.
Re: (Score:2)
Now you've reminded me of Game of Thrones' "The Long Night" episode. Aaagh!
Metallica's Death Magnetic (Score:2)
Ah, I remember this one. I also remember that the Guitar Hero tracks didn't have the insane loudness and compression that the official album had, so the community made a number of "remasters" that sounded miles better than the album. The Moderus III remaster was the most circulated, I believe.
I guess if you play retro games (Score:1)
This might be cool, because CRTs were much, much brighter than LCDs, and if you try to do scanline effects on an LCD they end up looking pretty terrible: they make the screen look dark and washed out because the LCD just isn't bright enough.
On the other hand, a lot of old pixel art was specifically designed for CRTs and scanlines, and every filter I've found tries to fix that by adding little black lines or dots throughout the image. Even the fancy pixel shader stuff doesn't quite pull it off.
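The brightness loss is easy to quantify. Here's a rough numpy sketch (a back-of-the-envelope illustration with made-up mask values, not any real filter's numbers) of why a simple scanline overlay dims the picture:

```python
import numpy as np

# A mid-gray test frame; values are linear light on a 0.0-1.0 scale.
frame = np.full((1080, 1920), 0.5)

# Crude scanline emulation: darken every other row, mimicking a CRT's raster gaps.
scanlined = frame.copy()
scanlined[1::2, :] *= 0.2  # keep only 20% of the light on the "gap" rows

loss = 1.0 - scanlined.mean() / frame.mean()
print(f"Average light lost to the scanline mask: {loss:.0%}")  # 40% dimmer overall
```

A CRT's lit lines were intense enough that the raster gaps didn't read as darkness; an LCD already running near its backlight ceiling has no headroom to compensate, hence the dim, washed-out look. Much brighter panels would give that headroom back.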
Re: (Score:1)
I so miss my ViewSonic G90.
Re: (Score:2)
From what I have read, seen, and remember, CRTs are not brighter than any recent LCD display. Sure, they were brighter than the early LCDs, but not anything recent and nice. Not even close.
I have the opposite problem (Score:2)
I watch TV at night and don't need a TV that can be seen under floodlights. I need one that is much more dimmable.
Re: (Score:2)
Exactly! Who watches TV in a brightly lit room? I watch TV in a dark or dimly lit room, 8 feet away from my 10+ year-old 50" 1080p TV, with the brightness turned down to 50%. Even on my old, inexpensive LED-backlit LCD TV, setting the brightness to 100% is way too bright in a dark room, and I doubt very much that it's anywhere near 2,000 nits. Probably more like 200.
Re: (Score:2)
> Exactly! Who watches TV in a brightly lit room? I watch TV in a dark or dimly lit room, 8 feet away from my 10+ year-old 50" 1080p TV, with the brightness turned down to 50%. Even on my old, inexpensive LED-backlit LCD TV, setting the brightness to 100% is way too bright in a dark room, and I doubt very much that it's anywhere near 2,000 nits. Probably more like 200.
If you are talking SDR content, 200 nits is about right. A peak brightness in the thousands is only relevant for HDR content, where that kind of brightness appears only briefly and in a very small portion of the screen. The overall brightness can still be quite low even with those peaks, and HDR content is typically expected to be viewed in a dark room.
Re: (Score:2)
> I watch TV at night and don't need a TV that can be seen under floodlights. I need one that is much more dimmable.
Viewing in a dark room is typically how HDR content is supposed to be viewed; that's why viewing it in a lit room makes it appear so dim. The peak brightness can be very high, but it occurs only in a very small portion of the screen and for a short time.
SDR content is not going to reach anywhere near peak brightness, so all the thousands of nits being advertised are irrelevant in that context.
Generally, don't need (Score:2)
But 5000 nits is nice on the deck of a yacht.
What does brightness matter? (Score:2)
What I find distracting is the banding in dark areas, because digital compression does not preserve a smooth dark gradient.
I know I have better sight in the dark than most (talking about actual real life), but I can't believe people don't notice that the shadows in a dark scene consist of only three gray values.
In short, screw TV colour encoding that cuts bits from the range; use the full 0-255 of every byte.
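For anyone wondering what "use all of 0-255" refers to: broadcast-style 8-bit video uses limited-range luma, codes 16-235, which the TV or player expands back to full range. A rough numpy sketch of that expansion and why it can't recover the missing levels (an illustration, assuming the standard 16-235 level mapping):

```python
import numpy as np

def limited_to_full(luma: np.ndarray) -> np.ndarray:
    """Expand limited-range 8-bit luma (codes 16-235) to full range (0-255)."""
    scaled = (luma.astype(np.float64) - 16.0) * 255.0 / 219.0
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

limited = np.arange(16, 236, dtype=np.uint8)   # all 220 legal limited-range codes
expanded = limited_to_full(limited)
print(np.unique(expanded).size)                # still only 220 distinct output values
```

220 input codes can never become 256 distinct outputs, so dark gradients end up with fewer gray steps than a true full-range signal would have, which shows up as the banding described above.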
Not Loudness War Redux. (Score:3)
The Loudness Wars were about compressing dynamic range: reduction.
Increasing display brightness is the literal exact opposite: it allows for a larger dynamic range.
Where the Loudness Wars sought to increase the loudness of a medium with a fixed dynamic range, TVs are increasing the dynamic range so that a TV-equivalent Loudness War doesn't need to happen.
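To put rough numbers on that, display dynamic range can be expressed in photographic stops, log2 of peak luminance over black level. The figures below are illustrative, not measurements of any particular set:

```python
import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Display dynamic range in photographic stops: each stop doubles luminance."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures, not measurements of any particular set.
print(f"SDR LCD, 300 nits peak / 0.3 black:        {stops(300, 0.3):.1f} stops")
print(f"HDR mini-LED, 2000 nits peak / 0.02 black: {stops(2000, 0.02):.1f} stops")
```

Raising the peak (and lowering the black floor) widens the range; the Loudness Wars did the reverse, squeezing everything toward the ceiling.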
Article was written by an idiot.
Re: (Score:2)
> The Loudness Wars were about compressing dynamic range: reduction. Increasing display brightness is the literal exact opposite: it allows for a larger dynamic range. Where the Loudness Wars sought to increase the loudness of a medium with a fixed dynamic range, TVs are increasing the dynamic range so that a TV-equivalent Loudness War doesn't need to happen. Article was written by an idiot.
I don't disagree, at least based on the summary. How is "there's new technology this year and next that is brighter than ever before" somehow either an end to escalating brightness or an inflection point? That's not what either of those means.
Re: (Score:2)
The TV equivalent of the Loudness Wars in the realm of brightness would be a new TV with a peak brightness of 10,000 nits but a minimum brightness of 8,000 nits.
That would indeed be horrible. But that's not what's happening. What's happening is that TVs are getting better at reproducing studio-mastered content.
If they really want to win the consumer over... (Score:2)
...Ditch the damned AI completely and just give us the picture without messing with it!
Diminishing returns (Score:2)
The problem TV manufacturers are up against is that it's hard to make a convincing reason to get a new one anymore.
I purchased my first "flat panel" in 2010. It was a 42-inch 1080p unit, and it was a big deal at the time that it had an LED backlight. It would be laughable today, but it was amazing next to the 32-inch CRT it replaced. I upgraded in 2019 to a 65-inch 4K unit with local dimming and HDR; that was a huge upgrade. When my daughter threw an object and broke the screen last month, I was forced to buy a new TV.
I love the brightness of the Hisense U8 line. (Score:2)
In my view it's a better value than the Sonys that cost twice as much.