-
Michael, can you tell us whether the Asus motherboard's super-I/O chip (sensors for fans, temperatures, etc.) is fully recognized by lm_sensors?
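(For anyone who would rather check their own board than wait for an answer: a quick, rough way to see what lm_sensors picks up, assuming the lm-sensors package is installed, is to run the usual detection pass and then dump the readings. The chip and driver names are board-specific; many recent Asus boards use a Nuvoton super-I/O handled by the nct6775 driver, but that is an assumption, not something confirmed for this board.)

sudo sensors-detect    # probes I2C/SMBus and super-I/O chips; answer the prompts
sudo modprobe nct6775  # only if sensors-detect suggested it; driver name varies by board
sensors                # prints the fan speeds, temperatures, and voltages the loaded drivers expose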
-
Any subjective differences on noise?
edit: FWIW, I have the prior generation of the NH-U14S in a Torrent Compact for a 2950X, and it’s...
-
I believe the NH-U14S TR5-SP6 will exhaust upwards (in most desktop cases) with both of those motherboards. Its airflow is perpendicular to the long...
-
The new TR boards so far all rotate the socket 90 degrees from what my 2950X has. The NH-U14S TR5-SP6 will exhaust upwards in a tower case, not towards...
-
You might find that good enough, but I did not. It was lag city for me at 4K.
I tried the straightforward graphics methods mentioned in the...
-
The previous post seems to have answered that exact question.
I must admit I'm an end user, not an expert in advanced virtualization....
-
Correct in my case. I just want to run a Windows VM with snappy desktop (not games) graphics performance on QEMU+KVM without undue hassle or limitations....
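(For reference, this is roughly what the stock, no-passthrough route looks like: a plain QXL + SPICE desktop VM. A minimal sketch only, not a recommendation; win11.qcow2, the CPU/memory sizes, and the port are placeholders, the virtio disk assumes the virtio-win drivers are installed in the guest, and on QEMU older than 6.0 the SPICE option is just disable-ticketing. This is also the kind of setup the earlier comment found laggy at 4K.)

qemu-system-x86_64 -enable-kvm -cpu host -smp 8 -m 16G \
  -drive file=win11.qcow2,if=virtio \
  -vga qxl \
  -spice port=5900,disable-ticketing=on \
  -device virtio-serial \
  -chardev spicevmc,id=vdagent,name=vdagent \
  -device virtserialport,chardev=vdagent,name=com.redhat.spice.0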
-
I don't think that's the case. My reading suggests that only Radeon Vxxx (sold only to hyperscalers) and maybe Instinct cards support it. I would be...
-
I have a NH-U14S TR5-SP6 awaiting the rest of the components for my TR 7000 build. The prior-generation NH-U14S TR4-SP3 in my current workstation cools...
-
It's worth noting these also have ECC VRAM, which the consumer cards don't, AFAIK. I'd buy one of these if it had fractional virtualization support in...
-
Nice! Last time I checked, GPU virtio for Windows was just someone's seemingly-abandoned proof-of-concept code that wasn't ready for real use. I've...
-
FWIW, somehow WSL2 (WSLg, rather) is fractionalizing the GPU. I've never taken the time to figure out how they're doing it. Having read this thread, I...
-
Because I was not aware of that effort, having given up on fractional GPU virtualization. I will take a look at that. If it looks viable, I will indeed...