Intel Announces Data Center GPU Flex Series


  • #11
    Intel's Arctic Sound M is being announced today as the Intel Data Center GPU Flex Series.
    Weird flex, but ok ;-)


    • #12
      I've been streaming games from my Windows desktop to a Linux laptop for about a week now with Moonlight and Steam, and it's been pretty decent. So far it's seemingly more viable for me than dual-booting, or having to choose between using Windows primarily or Linux with less gaming functionality.

      Prior to that, most of my GPU encoding and streaming has been with Oculus VR; Quest with PCVR operates entirely off an encode/decode process. In my case, if I had a dedicated encoder card, and assuming everything were programmed to just work with it, I'd move Oculus, Moonlight, and Plex over to it. Flat and VR games would then have full access to the GPU without having to worry about NVENC getting hindered by high GPU load. With a Quest 2, Oculus encodes at close to 4K by default, and an RTX 3060 has no problem with that even up to 120 FPS while gaming. I don't plan on streaming multiple 4K streams, so Arctic Sound M would be beyond overkill for me as a single user, but I can see how it would be useful for powering multiple sessions like that.

      I avoided Plex all this time largely because I dislike that it's inconsistent about direct playback of media to certain devices (I can't predict when three devices will all of a sudden need transcoding for a certain file), but I wouldn't care when or what it transcodes with an encode card that can handle 30+ 1080p streams.

      Streaming PCVR over the internet is viable, and with inflation, GPUs still being expensive, and the continued growth in popularity of encode/decode VR headsets (Quest 2, Cambria?, Vive Focus, Varjo Aero), it seems cloud streaming providers have a good opportunity and might benefit from dedicated encode cards.

      I don't know what NVIDIA offers in this regard, and AMD is laughable at GPU encoding in any regard. I have a 6600 XT and I'd be interested in a dedicated encode card to pair with it, as the 6600 XT by itself is notably worse at encoding than the 3060 I have to use now.
      Last edited by Guest; 25 August 2022, 08:31 AM.


      • #13
        Originally posted by Paradigm Shifter View Post
        I'm more and more convinced that this is going to be Larrabee/Xeon Phi all over again.
        It's too early to tell.

        BTW, Larrabee/Xeon Phi failed because Intel tried to make an x86 CPU compete with GPUs. It was doomed from the start.

        Originally posted by Paradigm Shifter View Post
        The consumer version will release, get panned for being rubbish,
        The early DX12 numbers seemed reasonable, from what I was able to glean. I think the biggest issue Intel is likely to have is really just the state of the GPU market and launching against a newer generation of competing models than they'd planned.

        Originally posted by Paradigm Shifter View Post
        Maybe a cut down version will make it into the iGPUs of the 15th or 16th gen chips in the future.
        Sooner than that. They're calling it a tGPU (i.e. Tile GPU) and it will appear in the Meteor Lake (i.e. Gen 14) CPUs due out next year.


        • #14
          Originally posted by coder View Post
          It's too early to tell.

          BTW, Larrabee/Xeon Phi failed because Intel tried to make an x86 CPU compete with GPUs. It was doomed from the start.
          Pretty much. Still, I can't shake that feeling.

          But what news did trickle out over the last few months has not been inspiring.

          Originally posted by coder View Post
          The early DX12 numbers seemed reasonable, from what I was able to glean. I think the biggest issue Intel is likely to have is really just the state of the GPU market and launching against a newer generation of competing models than they'd planned.
          Seems really game-specific. But yes, Intel delayed for whatever reason (silicon problems they hoped to fix in drivers is the one I hear most often, but whether that's just because one person said it and everyone else is mindlessly repeating it... I dunno), so competing against new architectures from both AMD and nVidia is going to make them look even worse.

          Originally posted by coder View Post
          Sooner than that. They're calling it a tGPU (i.e. Tile GPU) and it will appear in the Meteor Lake (i.e. Gen 14) CPUs due out next year.
          Did not know that. Cheers!
