Stadia Is Google's Cloud Gaming Service Using Linux, Vulkan & A Custom AMD GPU


  • #71
    Originally posted by ms178 View Post
    HPC-APU (or server APU) was intentional; there was a rumor a couple of years ago which mentioned exactly such a high-performance APU, with a high-core-count CPU part and a mid-to-high-end GPU part plus HBM on a single package. I always thought about it as a monolithic chip, but maybe they will even try for a chiplet approach to save on costs? I've just never seen HBM integrated into one of their chiplet designs yet.
    AMD has yet to actually release a single chiplet part, so it's a bit early to talk about those. Returning to the question of server APUs, they really don't make that much sense, as you're generally going to be running your compute loads on either a GPU or a CPU. When you're doing CPU jobs the GPUs are superfluous, and when you're running GPU jobs the CPU may not be superfluous, but its role is reduced to the point where a single CPU can easily serve more than half a dozen GPUs. Thus an HPC APU is either going to have its GPU be a total waste or have the vast majority of its CPU power be completely superfluous, depending on the job that's being run.

    As for why Google would use them here: they are running games, which, unlike HPC tasks, will tax both at the same time. For this a server APU does make a lot of sense, and if there has been some genuine talk about server APUs, I wouldn't say it would be too "out there" if it really was AMD going around with a preliminary design, asking service companies and hardware vendors whether they'd be interested. After all, the first service like this to become available to the public (OnLive) did so almost a decade ago, and a number of companies have tried to do the same thing even after it went defunct, including Sony (who bought out its patents and Gaikai, one of its competitors).



    • #72
      You wanna know why it definitely has to be an Intel CPU?

      - Intel still leads in the single-thread performance department! (Still the most important factor for games!)
      - Intel has proper support for AVX2 with 256-bit, unlike AMD, which cheats out with 2x 128-bit!
      - Therefore, game developers targeting Stadia can compile an AVX2-optimized build, whereas with standard/generic x86_64 they are bound by SSE2!
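
      For what it's worth, the difference is easy to see with a toy loop. This is only a sketch with stock GCC flags, nothing Stadia-specific; the point is just that the generic x86_64 baseline guarantees no more than SSE2, so 256-bit code has to be opted into:

      Code:
      /* sum.c -- compare codegen under different SIMD baselines.
       * Generic x86_64 baseline (SSE2 only):  gcc -O3 -S sum.c
       * 256-bit AVX2 enabled:                 gcc -O3 -mavx2 -S sum.c
       * The AVX2 build vectorizes the loop with ymm (256-bit)
       * registers; the baseline build is limited to xmm (128-bit). */
      #include <stdio.h>

      #define N 1024
      float a[N], b[N], c[N];

      int main(void)
      {
          for (int i = 0; i < N; i++) {   /* fill the inputs */
              a[i] = (float)i;
              b[i] = (float)(N - i);
          }
          for (int i = 0; i < N; i++)     /* auto-vectorizable: 4 floats per  */
              c[i] += a[i] * b[i];        /* iteration with SSE2, 8 with AVX2 */
          printf("%f\n", c[N / 2]);
          return 0;
      }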



      • #73
        Originally posted by Linuxxx View Post
        You wanna know why it definitely has to be an Intel CPU?

        - Intel still leads in the single-thread performance department! (Still the most important factor for games!)
        - Intel has proper support for AVX2 with 256-bit, unlike AMD, which cheats out with 2x 128-bit!
        - Therefore, game developers targeting Stadia can compile an AVX2-optimized build, whereas with standard/generic x86_64 they are bound by SSE2!
        I doubt it. They could have announced Intel as their supplier right away, so why the secrecy? I guess it has more to do with Zen 2 specifics which have not been announced yet. Also, AMD's EPYC 2 should fix their AVX2 and single-thread performance disadvantages very soon, and Google should have advance access to and knowledge of these.

        I guess it boils down to the fact that Google wanted to introduce Stadia now and AMD did not want to disclose too many details on their custom CPU part at this time.



        • #74
          Originally posted by Linuxxx View Post
          You wanna know why it definitely has to be an Intel CPU?

          - Intel still leads in the single-thread performance department! (Still the most important factor for games!)
          - Intel has proper support for AVX2 with 256-bit, unlike AMD, which cheats out with 2x 128-bit!
          - Therefore, game developers targeting Stadia can compile an AVX2-optimized build, whereas with standard/generic x86_64 they are bound by SSE2!
          They only need to hit 60 fps, and any old CPU is plenty for that. For the most part it's the GPU that matters. Also, Zen 2 should be out in 3 months, which likely changes things.



          • #75
            Originally posted by ms178 View Post

            I doubt it. They could have announced Intel as their supplier right away, so why the secrecy? I guess it has more to do with Zen 2 specifics which have not been announced yet. Also, AMD's EPYC 2 should fix their AVX2 and single-thread performance disadvantages very soon, and Google should have advance access to and knowledge of these.

            I guess it boils down to the fact that Google wanted to introduce Stadia now and AMD did not want to disclose too many details on their custom CPU part at this time.
            According to HardOCP, Google is indeed not currently using AMD CPUs, and they got that confirmation from AMD.



            • #76
              Stadia is, in my opinion, a game-changer and it's going to be big. Shockingly big, as in the standard everyone's using; perhaps not readers of this particular website, but the vast majority. I absolutely see this replacing both the Xbox and PlayStation. Just buy a "smart" TV, enter your login information for games beyond those that are free-to-play or free-to-try, and you're set. In the middle of a game and mom wants to watch her soap opera? Click pause, go to the smaller TV or phone or notebook in your room and keep on playing. The advantages over both console and PC gaming are there.

              There's this movie called "Ready Player One" about a VR world with tens of thousands of players. I totally see Stadia offering some kind of game like that, and everyone's going to be playing it.

              The only downside of Stadia is the input lag; there's no getting around it. But few people will care, because it won't be that high.

              Originally posted by Sanjuro View Post
              I'm never gonna use it because google and cloud gaming. However, the fact that it's going to use Linux and Vulkan is great news.
              If a game developer wants to be part of it they have to port their game to Linux/Vulkan? Am I understanding it correctly?
              We can hope some good will come of this, but if Android's anything to go by, it's not that likely. Sure, game developers will have to develop for Linux/Vulkan to be on Stadia, and that will make it easier to release some games on regular GNU/Linux. But will they? Stadia will shift games from being sold outright to a model where in-game items and advertising are the money-makers. I suspect that a lot of games will be Stadia-only, with no stand-alone version. A lot of the special Stadia features wouldn't work without Google's network anyway.

              The good news is that Unity and Unreal Engine are apparently already running on it, so any game using those could in theory be ported to GNU/Linux with ease. Of course, games using the Stadia SDK won't have stand-alone versions.

              Originally posted by Xaero_Vincent View Post
              So I'm a bit confused. This is going to be a platform to stream Linux games from the cloud? Or is Google using Debian as a base for a hypervisor and GPU passthrough to power Windows 10 guests in the cloud that gamers access?
              Everything is done server-side. You send input from your gamepad/keyboard/mouse and get a video feed on your screen.

              Originally posted by theriddick View Post
              Going to need a lot of local servers wherever they plop this thing down; just having a central hub in a capital probably won't cut it for most places.
              Though I do get 18ms ping to Sydney/Melbourne from north of Adelaide here in South Australia, the NBN is full of copper and network congestion issues....
              Not a problem. Google's got datacenters all over and the same infrastructure is being leveraged.

              Originally posted by Weasel View Post
              "Professional gamers" with insane input lag? You can't be serious.
              I am absolutely sure that plenty of "professional gamers" (= YouTube/Twitch celeb gamers) will praise and shill this like crazy. They will talk it up like it's the best thing since the wheel. Don't forget that there will be a YouTube-linked "click to play" button on gaming videos/content on YouTube, and all over the web for that matter, and there will be a financial incentive involved (think AdSense/advertising revenue sharing).



              • #77
                Originally posted by theriddick View Post
                Going to need a lot of local servers wherever they plop this thing down; just having a central hub in a capital probably won't cut it for most places.
                Though I do get 18ms ping to Sydney/Melbourne from north of Adelaide here in South Australia, the NBN is full of copper and network congestion issues....
                Apparently Google is guaranteeing a server will always be within 64 km, meaning the speed-of-light delay alone would only be about 0.3 ms. Of course, routing and various hardware will never approach that, but it does seem like distance won't necessarily be a direct problem.
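
                As a back-of-the-envelope check, taking the 64 km figure at face value and assuming signals cross fiber at roughly two-thirds the speed of light:

                Code:
                /* latency.c -- propagation delay over 64 km of fiber.
                 * Assumes ~2/3 c signal speed in glass; the 64 km distance
                 * is the claim quoted above, not an official number. */
                #include <stdio.h>

                int main(void)
                {
                    const double c    = 299792458.0;    /* speed of light, m/s    */
                    const double v    = c * 2.0 / 3.0;  /* ~signal speed in fiber */
                    const double dist = 64000.0;        /* 64 km in metres        */

                    double one_way_ms = dist / v * 1000.0;
                    printf("one-way: %.2f ms, round trip: %.2f ms\n",
                           one_way_ms, 2.0 * one_way_ms);  /* ~0.32 / ~0.64 ms */
                    return 0;
                }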

                Copper probably won't cut it though. I think they said it uses 20 Mbps right now for 1080p, and any kind of stall probably isn't going to work out very well. My guess is it will be limited to areas with pretty solid connections.
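
                The 20 Mbps figure also adds up quickly against monthly data caps; a trivial sanity check using the number quoted above:

                Code:
                /* stream_data.c -- data used by a constant 20 Mbit/s stream. */
                #include <stdio.h>

                int main(void)
                {
                    const double mbps = 20.0;  /* the quoted 1080p bitrate */
                    double bytes_per_hour = mbps * 1e6 / 8.0 * 3600.0;

                    printf("%.1f GB per hour\n", bytes_per_hour / 1e9);  /* 9.0 GB */
                    return 0;
                }

                At roughly 9 GB per hour of play, data caps may matter as much as raw line speed.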



                • #78
                  Yeah, I'm on 85 Mbps atm, but there is local congestion, as everyone in this town apparently connects to the same overloaded hub.



                  • #79
                    Originally posted by lectrode View Post
                    Since procedurally generated worlds would have to sync more data, I even tested a Minecraft server (Java), since that has to send all the world data (primarily the thousands of blocks, in addition to players, items, and mobs) on top of syncing character information. Even with all the information it had to sync for the generated world, it stayed below 200 kbps, and dropped to ~70 kbps after the bulk of the blocks were synced. On another server where all the blocks were static but thousands of players (over 7200) were online, the bandwidth used was ~10 kbps upload and ~70 kbps download.

                    Please point out a single game that transmits more than 2.5mb per second for character/world syncing purposes (game installers/updates do not count).
                    Minecraft uses blocks/voxels for good reason. There are playable prototypes of fully destructible worlds at http://cubeengine.com/. You absolutely don't attempt the original Cube game with destructible maps over the internet, because it can fill a 1G LAN with map-alteration data; for such a low-graphics game, needing 2.5G+ networking to play is insane. There are games out there, admittedly not popular ones, that are LAN-locked because there is not enough network bandwidth across the internet for the level of map destruction/modification they allow. The internet games that include some form of map destruction use blocking and other methods to reduce the volume of data.

                    Even Unreal Engine has destructibles, and the more complex you make them, the more network bandwidth you chew up, until your game no longer works even over a 20 Mbps connection. Heck, if you go nuts with map work, including destructibles, you can make an Unreal-based game eat up a 10G connection in map-update data.

                    So this is not about a single game at all. There are quite a few games where a map designed for LAN play, with destructible features used in volume, produces too much map-update data to be playable across the internet without massively stalling, yet plays fine on a LAN. Of course, maps designed for the internet are not built this way. Yes, LAN-only games have gone out of popularity.
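
                    To make the blocking argument concrete, here is a minimal sketch of why voxel-style updates stay cheap while free-form destruction does not. The message layouts are my own illustration, not any real engine's wire format:

                    Code:
                    /* Hypothetical wire formats for map-destruction updates.
                     * Purely illustrative -- not Minecraft's or Unreal's protocol. */
                    #include <stdint.h>
                    #include <stdio.h>

                    /* Block world: one changed voxel is a tiny fixed-size message. */
                    struct block_update {
                        int32_t x, y, z;       /* voxel coordinates     */
                        uint8_t new_block_id;  /* what the block became */
                    };

                    int main(void)
                    {
                        /* 1000 voxel edits vs one modest 10,000-vertex debris mesh
                         * (12 bytes per vertex for raw x/y/z floats, before indices
                         * and normals are even counted): */
                        size_t voxel_bytes = 1000 * sizeof(struct block_update);
                        size_t mesh_bytes  = 10000 * 12;

                        printf("1000 block edits:       %zu bytes\n", voxel_bytes); /* ~16 KB  */
                        printf("one 10k-vertex mesh: >= %zu bytes\n", mesh_bytes);  /* 120 KB+ */
                        return 0;
                    }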

