That's a pretty goofy statement. HDR is at its best right now with content made for TVs, like movies and television shows. Stuff like YouTube HDR will always depend on the person creating the content, because it isn't a curated platform.
The problem is that you need a screen that actually supports HDR properly, so 800 nits is probably the minimum peak brightness you can get away with before HDR stops looking correct. Otherwise things look overexposed, or like vivid mode, because it's fake HDR being displayed on something like a crappy 400-nit LCD. I don't know much about what phones consider HDR these days.
Honestly, if the monitor is from more than 3 years ago it's probably really bad. HDR 400 isn't even really HDR, and it didn't look right on IPS monitors without local dimming, which isn't great.
HDR is a ratio though. OLED TVs don't need to get as bright as LCD TVs. Though OLED TVs even from a few years ago go up to around 1000 nits, I think.
OLEDs have an effectively infinite contrast ratio, because their black level is zero, which throws a wrench into the usual conversation about it. HDR is actually more about contrast than brightness. The higher the peak brightness the better, but it all depends on what the content is graded for, and there are diminishing returns after a certain point. Yes, the higher contrast is there, but can you even see it? I find more value in better contrast in dark areas than in bright ones, so OLED is superior.
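To put rough numbers on the ratio point, here's a quick Python sketch. The nit values are made-up assumptions for illustration, not measurements of any real panel:

```python
# Static contrast ratio = peak luminance / black level, so a dimmer panel with
# near-zero blacks can still out-contrast a much brighter one.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Returns infinity for a true-zero black level (OLED)."""
    if black_nits == 0:
        return float("inf")
    return peak_nits / black_nits

lcd_hdr = contrast_ratio(peak_nits=1000, black_nits=0.05)  # local-dimming LCD guess: 20,000:1
oled = contrast_ratio(peak_nits=800, black_nits=0.0)       # OLED: effectively infinite

print(f"LCD:  {lcd_hdr:,.0f}:1")
print(f"OLED: {oled}:1")
```

Which is why peak brightness alone doesn't settle the OLED vs. LCD argument.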
I think 1000 nits is a realistic expectation going forward for consumer HDR. Maybe not on LCDs, but I don't think I'll ever buy an LCD panel again if I can help it. It still looks better than SDR and can be had at reasonable prices.
I'd assume that as long as the engine and assets can take advantage of the larger color space, the output should be scalable to your screen's brightness, unlike films, which are graded for a fixed mastering display.
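Very roughly, and purely as an illustration (the curve and the numbers are my own assumptions, not how any particular engine actually does it), the idea is that the game tone-maps scene luminance down to whatever peak the attached display reports, instead of baking in a fixed grade:

```python
def tonemap_to_display(scene_nits: float, display_peak_nits: float,
                       scene_max_nits: float = 10000.0) -> float:
    """Extended Reinhard curve: keeps midtones roughly linear and rolls off
    highlights so anything up to scene_max_nits fits under the display's peak."""
    l = scene_nits / display_peak_nits
    l_white = scene_max_nits / display_peak_nits
    mapped = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
    return mapped * display_peak_nits

# Same 2,000-nit in-game highlight, scaled to hypothetical displays:
for peak in (400, 800, 1000):
    out = tonemap_to_display(scene_nits=2000, display_peak_nits=peak)
    print(f"{peak:>4}-nit display: 2000-nit highlight -> {out:.0f} nits")
```

A graded film has already had that compression baked in for a specific mastering display, which is why it can't adapt the same way.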
Ironically, on my OLED monitor I run HDR even for SDR content, because the colors in the SDR modes look way too saturated to me, whereas the HDR presets look more accurate.