AMD Continues Working Toward HDR Display Support For The Linux Desktop
Originally posted by atmartens: Wouldn't proper HDR support be useful for content creation (more than just Blender)? Isn't that an important market for AMD et al.?
Last edited by brucethemoose; 06 October 2022, 03:21 PM.
I read "The sad, misleading, and embarrassing state of HDR in PC gaming" just this week, which explains that it's a mess even on Windows for a variety of reasons. I went looking for that article because I was surprised to find that nearly every monitor review I read said the HDR support was terrible. I went for the ViewSonic XG270QC in the end, which is apparently better than most. I look forward to receiving it (in 2 weeks!), even though I won't be able to enjoy the HDR for a little while yet.
Originally posted by Chugworth: I can't really know for sure if I'm viewing HDR content unless its menu specifically says that it is.

Originally posted by atmartens: Wouldn't proper HDR support be useful for content creation (more than just Blender)? Isn't that an important market for AMD et al.?

Originally posted by atomsymbol: The base problem is that GPUs are unlike CPUs. The behavior of a CPU is well documented, it has virtual memory, and execution happens in small steps (instructions), so creating a CPU emulator is straightforward. On the other hand, creating a GPU emulator is nearly impossible. VGA pass-through is just a way of saying "Look, it is impossible to emulate a GPU, so the only choice people currently have is to completely give up on the idea of GPU emulation". If GPUs were as well-defined and as easily virtualizable as CPUs, the idea of VGA pass-through wouldn't even occur to people.
Originally posted by Anux: GPU emulation might even work better if we look at stuff like DXVK, and I think there is a project for OpenGL emulation in qemu (anyone remember the name?).
Originally posted by Anux: Huh? HDR is obvious to see, no need for a menu to tell you. It sounds more like your monitor doesn't have real HDR capabilities.
Sure there is a difference, but unless you have two identical displays sitting side-by-side with one playing in HDR and one not, it's not so apparent just how much you're missing or gaining. I have watched movies in HDR, then later watched the same movies without HDR and really didn't feel like I was missing much. Perhaps it's just that much of the current content that calls itself HDR content isn't really taking full advantage of it.
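There is a technical side to why the difference can feel subtle: HDR10 doesn't simply raise brightness, it encodes video with the SMPTE ST 2084 "PQ" transfer function, which maps code values to absolute luminance up to 10,000 cd/m², instead of SDR's relative gamma. A minimal sketch of the PQ curves using the published constants (illustrative only, not tied to any poster's setup):

```python
# SMPTE ST 2084 (PQ) constants, as published in the spec HDR10 builds on.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ is defined up to 10,000 cd/m^2

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance (cd/m^2)."""
    e = code ** (1 / M2)
    return PEAK * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Map absolute luminance (cd/m^2) back to a normalized PQ code value."""
    y = (nits / PEAK) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# SDR reference white (~100 nits) sits roughly halfway up the PQ code range;
# everything above it is reserved for highlights most content rarely uses,
# which is one reason side-by-side comparison is needed to see the gain.
print(round(pq_inverse_eotf(100.0), 3))
print(pq_eotf(1.0))   # full code value decodes to the 10,000 cd/m^2 peak
```

If a title grades most of its image at SDR-like levels, the extra headroom goes unused, which matches the impression that some "HDR" content adds little.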
Originally posted by Anux: No, KVM is CPU passthrough; if you compare it with qemu's CPU emulation, there are worlds between them. Sadly, consumer GPUs don't support virtualization, so we have to fully pass through the GPU and need a second one for the host. GPU emulation might even work better if we look at stuff like DXVK, and I think there is a project for OpenGL emulation in qemu (anyone remember the name?).
The complexity of it on the technical side is nothing compared to quantum mechanics.
Last edited by SvenK; 06 October 2022, 06:38 PM.
Originally posted by Chugworth: The problem with HDR is that it's easy to be fooled. On my television, I can't really know for sure if I'm viewing HDR content unless its menu specifically says that it is. I imagine on a laptop with no display indicator for HDR, it would be quite easy to think you're running with HDR when you're really not. That would be one thing to look out for when HDR arrives in Linux. Hopefully there will be some log entries or something that shows HDR is actually active on the display.
However, part of the problem is just really bad HDR monitors. The good ones are still out of reach for the average person. I have a $300 HDR10 4K monitor from LG with contrast so bad I can barely use it even in SDR without cringing. It'll end up being sold, or used for something where I can marginally tolerate its deficiencies. It's also the only HDR monitor I own, and I probably won't buy another any time soon. I see no reason to do so with media companies and creators faking HDR content to begin with.
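On the wish for a verifiable signal: one thing that can already be checked on Linux is whether the display even advertises HDR, since HDR-capable monitors declare an HDR Static Metadata data block (CTA-861 extended tag 0x06) in the CTA extension of their EDID. Below is a minimal, hedged sketch of such a check; the synthetic EDID at the bottom is fabricated for illustration, and the sysfs path in the final comment is an example connector name that will differ per machine. This only proves capability, not that the compositor is actually driving the display in HDR.

```python
def cta_data_blocks(ext: bytes):
    """Yield (tag, payload) for each data block in a 128-byte CTA-861 extension."""
    dtd_start = ext[2]                       # where detailed timing descriptors begin
    i, end = 4, (dtd_start if dtd_start >= 4 else len(ext))
    while i < end:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        yield tag, ext[i + 1 : i + 1 + length]
        i += 1 + length

def supports_hdr(edid: bytes) -> bool:
    """True if any CTA extension carries an HDR Static Metadata data block
    (extended tag 0x06), which displays use to advertise PQ/HLG support."""
    # Extension blocks are 128-byte chunks after the 128-byte base EDID.
    for off in range(128, len(edid), 128):
        ext = edid[off : off + 128]
        if len(ext) == 128 and ext[0] == 0x02:          # CTA-861 extension tag
            for tag, payload in cta_data_blocks(ext):
                if tag == 0x07 and payload and payload[0] == 0x06:
                    return True
    return False

# Synthetic EDID for illustration: an empty base block plus one CTA extension
# holding a single HDR Static Metadata block flagging ST 2084 (PQ) support.
hdr_block = bytes([0xE2, 0x06, 0x04])   # header (tag 7, len 2), ext tag 0x06, EOTF bits
cta = bytes([0x02, 0x03, 0x07, 0x00]) + hdr_block
cta += bytes(128 - len(cta))
fake_edid = bytes(128) + cta
print(supports_hdr(fake_edid))          # True

# On a real system the bytes would come from the connector's sysfs node, e.g.
# supports_hdr(open("/sys/class/drm/card0-HDMI-A-1/edid", "rb").read())
```

Tools like edid-decode perform the same parsing far more thoroughly; the sketch just shows that the capability bit is machine-readable rather than something you have to trust a TV menu about.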
Originally posted by stormcrow: Granted, there's a lot of SDR content masquerading as HDR (it's encapsulated as HDR but isn't true HDR). Disney has been criticized for doing it with The Mandalorian, and they're not the only ones by far. However, part of the problem is just really bad HDR monitors. The good ones are still out of reach for the average person. I have a $300 HDR10 4K monitor from LG with contrast so bad I can barely use it even in SDR without cringing. It'll end up being sold, or used for something where I can marginally tolerate its deficiencies. It's also the only HDR monitor I own, and I probably won't buy another any time soon. I see no reason to do so with media companies and creators faking HDR content to begin with.