Intel's Arctic Sound M is being announced today as the Intel Data Center GPU Flex Series.
Intel Announces Data Center GPU Flex Series
I've been streaming games from my Windows desktop to a Linux laptop for about a week now with Moonlight and Steam, and it's been pretty decent. So far it seems more viable for me than dual-booting, or than having to choose between using Windows as my primary OS and Linux with reduced gaming functionality.
Prior to that, most of my GPU encoding and streaming has been with Oculus VR; Quest with PCVR operates entirely off an encode/decode process. In my case, if I had a dedicated encoder card, and assuming everything were programmed to just work with it, I'd point Oculus, Moonlight, and Plex at it. Flat and VR games would have full access to the GPU without having to worry about NVENC getting hindered by high GPU load. A Quest 2 with Oculus encodes close to 4K by default, and an RTX 3060 has no problem with that, even up to 120 FPS while gaming. I don't plan on streaming multiple 4K streams, so Arctic Sound M would be beyond overkill for me as a single user, but I can see how it would be useful for powering multiple sessions like that. I avoided Plex all this time largely because it's inconsistent about direct playback of media to certain devices (I can't predict whether three devices will suddenly need transcoding for a given file), but I wouldn't care when or what it transcodes with an encode card that can handle 30+ 1080p streams.
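The "30+ 1080p streams" figure can be sanity-checked with a quick back-of-envelope calculation. A minimal sketch, where the aggregate encoder throughput is a made-up illustrative number, not a published spec for any card:

```python
# Back-of-envelope estimate of how many simultaneous full-rate transcode
# sessions a dedicated encode card could sustain. The aggregate throughput
# figure below is an illustrative assumption, not a vendor specification.

def max_sessions(aggregate_encode_fps: float, per_stream_fps: float) -> int:
    """Full-rate sessions = floor(total encoder fps / per-stream frame rate)."""
    return int(aggregate_encode_fps // per_stream_fps)

# Assumption: the card encodes roughly 2000 frames/sec of 1080p in aggregate.
print(max_sessions(2000, 60))   # -> 33, roughly the "30+ 1080p streams" ballpark
```

The same arithmetic shows why a single user barely dents such a card: one 4K120 VR session plus a couple of 1080p transcodes leaves most of the encoder idle.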
Streaming PCVR over the internet is viable, and with inflation, GPUs still being expensive, and the continued growth in popularity of encode/decode VR headsets (Quest 2, Cambria?, Vive Focus, Varjo Aero), it seems cloud streaming providers have a good opportunity and might benefit from dedicated encode cards.
I don't know what NVIDIA offers in this regard, and AMD's GPU encoding is laughable in any regard. I have a 6600 XT and I'd be interested in a dedicated encode card to pair with it, as the 6600 XT by itself is notably worse at encoding than the RTX 3060 I have to use now.

Last edited by Guest; 25 August 2022, 08:31 AM.
Originally posted by Paradigm Shifter: "I'm more and more convinced that this is going to be Larrabee/Xeon Phi all over again."
BTW, Larrabee/Xeon Phi failed because Intel tried to make an x86 CPU compete with GPUs. It was doomed from the start.
Originally posted by Paradigm Shifter: "The consumer version will release, get panned for being rubbish,"
Originally posted by Paradigm Shifter: "Maybe a cut down version will make it into the iGPUs of the 15th or 16th gen chips in the future."
Originally posted by coder: "It's too early to tell. BTW, Larrabee/Xeon Phi failed because Intel tried to make an x86 CPU compete with GPUs. It was doomed from the start."
But what news did trickle out over the last few months has not been inspiring.
Originally posted by coder: "The early DX12 numbers seemed reasonable, from what I was able to glean. I think the biggest issue Intel is likely to have is really just the state of the GPU market and launching against a newer generation of competing models than they'd planned."
Originally posted by coder: "Sooner than that. They're calling it a tGPU (i.e. Tile GPU) and it will appear in the Meteor Lake (i.e. Gen 14) CPUs due out next year."