Two Weeks Until Likely Updates To OpenGL, Vulkan


  • #31
    Originally posted by eydee View Post

    One of them is only called a GPU to not hurt the feelings of said vendor though. You could wire it up in a multi-GPU system, but it would be like pouring a glass of water into the ocean, to make it bigger. One of them is always a low-power/low-performance GPU, that's why the whole system has been invented.
    FWIW there are _plenty_ of notebooks that ship something like a Skylake GT2 (HD 520) plus a GM108 (under whatever marketing name of the day). At least if you've got dual-channel memory for the IGP, it isn't really all that much slower than the dedicated GPU, if at all (and don't even get me started on the AMD APU + ultra-low-end AMD GPU combos, where the APU would blow away the dedicated GPU if given the GPU's additional power budget, but that's a whole other topic). So being able to use both could potentially increase performance quite a bit. But I don't really see this happening: splitting the render tasks isn't going to be easy (the one use I've seen cited was running post-process effects on the IGP, which is a start), not least because you'd have to decide the split at runtime depending on the performance of the actual GPU combination.
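
    For what it's worth, Vulkan does at least expose both GPUs to a single application; it just shares nothing between them automatically. A minimal sketch of what that looks like, assuming a working Vulkan loader and with error handling trimmed:

    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        /* Create a bare instance; no extensions are needed just to enumerate. */
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .apiVersion = VK_API_VERSION_1_0,
        };
        VkInstanceCreateInfo info = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance inst;
        if (vkCreateInstance(&info, NULL, &inst) != VK_SUCCESS)
            return 1;

        /* Both the IGP and the discrete card show up here, side by side. */
        uint32_t n = 0;
        vkEnumeratePhysicalDevices(inst, &n, NULL);
        VkPhysicalDevice gpus[8];
        if (n > 8) n = 8;
        vkEnumeratePhysicalDevices(inst, &n, gpus);

        for (uint32_t i = 0; i < n; i++) {
            VkPhysicalDeviceProperties p;
            vkGetPhysicalDeviceProperties(gpus[i], &p);
            /* Each device needs its own VkDevice, queues, memory, and
             * pipelines; moving work between them is entirely on you. */
            printf("GPU %u: %s\n", i, p.deviceName);
        }
        vkDestroyInstance(inst, NULL);
        return 0;
    }

    Everything past enumeration (splitting the frame, copying results back) is manual, which is exactly the runtime decision problem above.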



    • #32
      Originally posted by cj.wijtmans View Post

      How is utilizing multiple cards regardless of their vendor a "gimmick"?
      From what I understand, I seriously doubt that in most cases a multi-vendor multi-GPU setup would be better than the faster of the two GPUs operating on its own. At the very least, I doubt it would be worth the engineering effort. I'd much rather they focus on making it good for at least one GPU for now. The last NVIDIA driver update broke the shader compiler for an existing application.



      • #33
        Originally posted by microcode View Post

        From what I understand, I seriously doubt that in most cases a multi-vendor multi-GPU setup would be better than the faster of the two GPUs operating on its own. At the very least, I doubt it would be worth the engineering effort. I'd much rather they focus on making it good for at least one GPU for now. The last NVIDIA driver update broke the shader compiler for an existing application.
        NV has the best drivers; it's surely fixed already.
        A Vulkan game using both my integrated Intel graphics and my GTX 770 simultaneously in my desktop PC would without question be faster than my 770 alone.



        • #34
          Originally posted by duby229 View Post
          EDIT: People buy consoles because they work without screwing around. PCs could get the same status if Intel would just make their minimum standard "fast enough".
          Umm, no. A PC is more complex; it can do way too many things to be that simple to use, especially on Windows 10, which likes to break things more often than its predecessors did.

          How about letting casual gamers stay on consoles instead of turning low-end PCs into consoles for some minority that isn't smart enough to buy either a console or a half-decent gaming PC?



          • #35
            Originally posted by duby229 View Post
            Well, let me just say that Moore's Law is such a ridiculous over-simplification that it isn't funny at all. Intel knows full well that volume will cover the costs of production.
            Raising prices at a moment when PC sales are plummeting and they're even firing 10k people... I wonder why they don't do it now.

            EDIT: I never did understand why the F___ Intel fuses off capability in its dies to segment its product lines. If it was there to fuse off, the production costs were already F___ing paid for.
            If you leave all the nice features in the cheaper chips too, then no one will buy the high-end ones.
            The high-end ones, the ones that cost a lot to make.

            There is a bit more to factor in than the "cost of etching a circuit in the silicon".



            • #36
              Originally posted by starshipeleven View Post
              Raising prices at a moment when PC sales are plummeting and they're even firing 10k people... I wonder why they don't do it now.

              If you leave all the nice features in the cheaper chips too, then no one will buy the high-end ones.
              The high-end ones, the ones that cost a lot to make.

              There is a bit more to factor in than the "cost of etching a circuit in the silicon".
              If a capability could be fused off, then it was already paid for. Intel already knows full well that it's the volume of production that covers production costs.

              EDIT: There is a very good reason why Intel doesn't fabricate different dies for each product line, and it's that very same reason why it's so stupid to fuse off capability that was already paid for.
              Last edited by duby229; 14 July 2016, 11:31 AM.



              • #37
                Originally posted by duby229 View Post
                If a capability could be fused off, then it was already paid for.
                Nope. As I said, the bigger costs aren't in etching the circuit but in design, in the fab itself (tens of billions to build), and so on.

                So yeah, the feature isn't paid off in the slightest.

                It's like this for most mass-produced goods: you end up destroying, shelving, or crippling many of them to tier the product line and keep prices high enough to make a profit.
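
                The arithmetic is easy to sketch. All numbers below are made-up round figures for illustration, not Intel's actual costs:

                #include <stdio.h>

                int main(void)
                {
                    /* Illustrative assumptions only, not Intel's real figures. */
                    double fixed   = 12e9;   /* fab + design, paid up front     */
                    double per_die = 50.0;   /* marginal cost to etch/test/pack */
                    double volume  = 100e6;  /* dies sold                       */

                    /* Each die must recover a share of the fixed costs on top
                     * of the (comparatively tiny) cost of etching it. */
                    printf("break-even per die: $%.2f\n",
                           per_die + fixed / volume);   /* $170 here */

                    /* Tiering the same die (fused-off vs. full-featured) lifts
                     * the average selling price above break-even without a
                     * second design or fab; selling everything at the low-tier
                     * $120 would not even cover the $170 break-even. */
                    printf("avg price, tiered:  $%.2f\n",
                           0.7 * 120.0 + 0.3 * 350.0);  /* $189 here */
                    return 0;
                }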



                • #38
                  Originally posted by starshipeleven View Post
                  Nope. As I said, the bigger costs aren't in etching the circuit but in design, in the fab itself (tens of billions to build), and so on.

                  So yeah, the feature isn't paid off in the slightest.

                  It's like this for most mass-produced goods: you end up destroying, shelving, or crippling many of them to tier the product line and keep prices high enough to make a profit.
                  Dumbass, that's -exactly- the very reason why -volume- pays for production. Idiot.



                  • #39
                    Originally posted by duby229 View Post
                    Dumbass, that's -exactly- the very reason why -volume- pays for production. Idiot.
                    The volume must also be sold to pay for production.

                    If you flood the market, prices fall, and you end up selling them for less than what you paid, not paying back a fucking thing.

                    Are you seriously this daft?



                    • #40
                      Originally posted by mike4 View Post

                      NV has the best drivers; it's surely fixed already.
                      A Vulkan game using both my integrated Intel graphics and my GTX 770 simultaneously in my desktop PC would without question be faster than my 770 alone.
                      How exactly would it be "without question faster"? You'd have to ferry framebuffers, vertex buffers, and uniform buffers between the two GPUs over PCIe, and you'd have to compile and link all the shaders twice and find some way to swizzle the data between stages so that it's compatible. Those two kinds of overhead would easily ruin any benefit you'd hoped for, especially when combining with an integrated GPU. Even with the vast improvements Intel has made over the last few years, their most powerful integrated GPUs are nearly an order of magnitude slower than high-end discrete cards.
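
                      A rough back-of-envelope on just the framebuffer traffic makes the point (the bandwidth and FLOPS figures below are assumed round numbers, not benchmarks):

                      #include <stdio.h>

                      int main(void)
                      {
                          /* One 1080p RGBA8 framebuffer, ~8.3 MB. */
                          double frame  = 1920.0 * 1080.0 * 4.0;
                          double pcie   = 15.75e9;        /* PCIe 3.0 x16, ~15.75 GB/s */
                          double copy   = frame / pcie * 1e3;
                          double budget = 1000.0 / 60.0;  /* 60 fps frame budget (ms)  */

                          /* Optimistic ceiling on what the IGP can add, by raw FLOPS:
                           * ~0.4 TFLOPS (Intel GT2 class) vs ~3.2 TFLOPS (GTX 770 class). */
                          printf("copy one way: %.2f ms (%.1f%% of %.2f ms budget)\n",
                                 copy, 100.0 * copy / budget, budget);
                          printf("round trip:   %.2f ms\n", 2.0 * copy);
                          printf("IGP ceiling:  +%.0f%% throughput\n", 100.0 * 0.4 / 3.2);
                          return 0;
                      }

                      So the best case is roughly +13% throughput, while the copies alone eat several percent of the frame budget before any synchronization stalls, double shader compiles, or vertex/uniform traffic; it isn't hard to come out behind.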

