So I recently got a 4K projector for my home theater. It's an Epson Cinema 4010 and, on paper, it's a 4K HDR projector. I connected it to my Apple TV 4K, and that's when I started to realize that HDR is still not fully ready for prime time for most people.
Here's some background:
Color TV is in color...but barely. We are used to it, but it's actually pretty crappy. Luckily, we don't realize how crappy it is until we see something like HDR, which stands for High Dynamic Range and shows a much wider range of brightness and color. But HDR is also just the start. There's also Dolby Vision, which is like HDR but better.
However, HDR requires more bandwidth than SDR (Standard Dynamic Range). And not only are most HDMI cables just v1.4 (most people don't even know there's a version number on HDMI), but most televisions and controllers are v1.4 as well. You actually need v2.0a (obviously) to get the full benefit, and needless to say, my new Epson projector doesn't have that.
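To put some rough numbers on it: SDR is typically 8 bits per color channel while HDR10 is 10 bits, and an HDMI 1.4 link carries about 10.2 Gbps on the wire (roughly 8.16 Gbps of actual video data after encoding overhead) versus about 18 Gbps (roughly 14.4 Gbps usable) for HDMI 2.0/2.0a. Here's a back-of-the-envelope sketch in Python; it ignores blanking intervals and the chroma-subsampling tricks real links use, so treat the numbers as ballpark:

```python
# Back-of-the-envelope bit rates for uncompressed 4K/60 video.
# Ignores blanking intervals and chroma subsampling, so real signals differ a bit.

WIDTH, HEIGHT, FPS = 3840, 2160, 60

def raw_gbps(bits_per_channel):
    # 3 color channels per pixel; result in gigabits per second
    return WIDTH * HEIGHT * FPS * bits_per_channel * 3 / 1e9

HDMI_1_4_USABLE = 8.16   # approx. Gbps of video payload (10.2 Gbps on the wire)
HDMI_2_0_USABLE = 14.4   # approx. Gbps of video payload (18 Gbps on the wire)

print(f"4K/60 SDR (8-bit):  {raw_gbps(8):.1f} Gbps")   # ~11.9 Gbps: too much for HDMI 1.4
print(f"4K/60 HDR (10-bit): {raw_gbps(10):.1f} Gbps")  # ~14.9 Gbps: needs HDMI 2.0a (and even then, subsampling)
```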
So what does that mean? How can it do HDR while still using the older (but standard) HDMI setup? The answer is frame rate. I can watch HDR movies at 24 frames per second (which is like a movie theater, I suppose) and the difference is pretty noticeable. But this took some trial and error to understand, because the 4K projector (which, it turns out, is not really 4K but rather 2x 1080p), like most audio/video equipment, does its best to bury these stats so that only the truly entrenched can discover them.
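And the frame rate is exactly where the math works out. Running the same rough calculation at different frame rates (again, ballpark figures that ignore blanking and chroma subsampling) shows why 24 frames per second squeezes 4K HDR through an HDMI 1.4-class link while 60 doesn't:

```python
# Rough data rates for 10-bit 4K HDR at different frame rates,
# again ignoring blanking and chroma subsampling (ballpark only).

WIDTH, HEIGHT, BITS_PER_CHANNEL = 3840, 2160, 10
HDMI_1_4_USABLE = 8.16  # approx. Gbps of video payload on an HDMI 1.4-class link

for fps in (24, 30, 60):
    gbps = WIDTH * HEIGHT * fps * BITS_PER_CHANNEL * 3 / 1e9
    verdict = "fits" if gbps <= HDMI_1_4_USABLE else "does not fit"
    print(f"4K/{fps} 10-bit HDR: {gbps:4.1f} Gbps -> {verdict}")
```

Roughly 6 Gbps at 24 frames per second versus roughly 15 Gbps at 60, and only the first squeezes under what that older link can actually carry.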
Now, don't get me wrong, I love this new projector. And I'd argue HDR is a bigger deal than 4K. But it's pretty surprising that we keep seeing these amazing new standards come along (like Dolby Vision, which my projector doesn't support at all, and HDMI 2.0a, which isn't even that new) while the hardware makers don't actually make use of them.
In short, we're still probably a couple years away from HDR and its cousins being mainstream.