
Intel DG1 Graphics Card Nears Working State On Linux


  • #21
    Originally posted by zamadatix View Post

    You're discovering the semiconductor shortage and blaming it on disinterest in selling mid range GPUs instead of inability to make them. The 1650 and the 5500/5300 class GPUs will have equivalents in price point and power in the 3000 and 6000 generation once they have the ability to supply enough GPUs. Anything lower than those is in fact part of the iGPU class now.
    You are discovering the lies the cartel fabricates in order to justify the milking of its customers. They can make lower class gpus RIGHT NOW. In fact, they could easily fix the gpu crisis by doing so, because most people don't need high end gpus and never did. They are only buying them now (at overinflated prices when they can find them) because there is nothing else to buy.

    Transistors are not that complicated. The more of them, the more die area; the more die area, the fewer chips made from a wafer. If you put in half of them, you get 2 times the number of chips. If you put in 1/4 of them, you get 4 times the chips. Not rocket science. So if AMD decided to make a 6400 and a 6500, which could be ideal 1080p solutions for the mainstream market, they could make them RIGHT NOW on the same wafers they are using RIGHT NOW and produce a gazillion more of them. MSRPs would be sensible and street prices would be sensible as well, because the supply could satisfy the demand.
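    The chips-per-wafer arithmetic above can be sketched with the commonly used dies-per-wafer estimate (a rough model with illustrative numbers; it ignores defect yield and scribe lines, so treat it as a back-of-the-envelope check, not foundry math):

    ```python
    import math

    def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
        """Rough dies-per-wafer estimate: usable wafer area divided by die area,
        minus a correction for partial dies lost along the wafer's round edge."""
        r = wafer_diameter_mm / 2
        gross = math.pi * r * r / die_area_mm2                         # pure area ratio
        edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
        return math.floor(gross - edge_loss)

    # Halving the die area slightly MORE than doubles the dies per wafer,
    # because smaller dies also waste less area at the rounded edge.
    print(dies_per_wafer(100))  # 640 candidate dies from a 300 mm wafer
    print(dies_per_wafer(50))   # 1319 candidate dies
    ```

    Real output also depends on defect density, and smaller dies yield better there too, so "half the die area, double the chips" is, if anything, a conservative estimate of the gain.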

    But AMD (and Nvidia, of course) knows that a new generation of games is here, PC games are unoptimized anyway, and people need to buy a GPU. They know they can still sell overpriced GPUs to impatient gamers, and they know that they can also sell to miners. So why not produce larger dies? It makes them more profit by fleecing more money for the same die area, and keeping demand unsatisfied ensures that the clients will keep coming. After all, someone who lacks a GPU for mechanical reasons, or simply because of obsolescence, will have to buy at some point. Why satisfy him with a 6400/6500 card, when you can sell the rich kid a 6900 and then have him frustrated and forced to pay through the nose in the future anyway?

    The cartel knows that if they supply mainstream cards, which they can easily do if they simply cut down the designs, since most people game at 1080p anyway, the market will be saturated and satisfied and prices will go down, and people will re-pay them 3-4 years later when they upgrade. By refusing to service the mainstream market, they keep getting the same revenue per transistor but also keep the demand/market unsatisfied for future sales. Capiche?

    Comment


    • #22
      I think even a cheap card with HDMI 2.1 and working VP9 would be a good idea as upgrade for multimedia pcs. Fast CPUs can decode it as well but VP9 4k60 is not that easy to playback with 0 dropped frames.

      Comment


      • #23
        The more entrants in a market, the less cartel-like behavior is possible. Intel entering the fray does help things, especially if they can make their own. However, I thought that the anti-competitive behavior was coming from the fabricators themselves? Perhaps I was misinformed.

        Comment


        • #24
          Originally posted by TemplarGR View Post

          You are discovering the lies the cartel fabricates in order to justify the milking of its customers...
          For as much thought as you put into spinning that story, you'd think you would have realized the GeForce 16 series is still actively made at TSMC on a 12 nm process; its shortage has nothing to do with the shortage of the Samsung 8 nm process used in the 3000 series. Nor does the volume of 3080/3090 being manufactured × 4 equal enough low-to-mid range GPUs - and even if it did, I'd just be talking to a different moron going on about the GPU cartel forcing you to buy enterprise-class GPUs for a performance upgrade instead.

          Anyway, I just wanted to point out that beyond the iGPU-replaced low end, the mid and high segments still have plenty of models targeting them from all vendors, even if the ones from this generation aren't out yet. I'm not really interested in why you're upset about being unable to afford one today because you need it instantly, or which global shadow cabal is to blame.
          Last edited by zamadatix; 12 April 2021, 08:13 PM.

          Comment


          • #25
            Originally posted by tildearrow View Post

            Sadly, it is not "the same shit".

            Here is proof:
            - Intel does not have ray-tracing (yet)
            - AMD does since RDNA 2
            - NVIDIA started it all

            - Intel has 4:4:4 video encoding since Ice Lake
            - NVIDIA has 4:4:4 video encoding since Maxwell
            - AMD doesn't even care

            - Intel has the best quality encoders
            - NVIDIA is in the middle
            - AMD has the worst quality encoders

            - Intel selectively cripples/removes very old/seldom used features from its encoder (but does not lower speed)
            - NVIDIA does not remove features from its encoder
            - AMD removes encoder features and cripples the encoders too early (H.264 in particular, which still remains widespread, but they have been lowering the encoding speed since VCE 3.0)
            Your list seems oddly focused.

            You forgot to include things like:
            - General rasterization performance
            - Price
            - Actually being able to buy a card
            - Wayland support
            - Usable control panels
            - CUDA support
            - and probably a dozen other things that the vast majority of users would find far more important than 4:4:4 video encoding.

            I'm very happy to see increased competition for GPUs going forward. I suspect Intel's first drivers are going to be very rough, so I'm a bit skeptical about jumping on their bandwagon right away, but if they spend the money to stay in the market and keep competing I think their 2nd generation products could be very interesting.

            Comment


            • #26
              Originally posted by smitty3268 View Post

              Your list seems oddly focused.

              You forgot to include things like:
              - General rasterization performance
              - Intel seems to lag at this.
              - AMD is fine at this, but has improved considerably on Navi.
              - NVIDIA has always been good at this.

              Originally posted by smitty3268 View Post
              - Price
              - Intel: Card still not released (the only ones are extremely specific, so probably very high)
              - AMD: Low (mid for workstation cards)
              - NVIDIA: Mid-high (very high for workstation)

              Originally posted by smitty3268 View Post
              - Actually being able to buy a card
              - Intel: No
              - AMD: Barely
              - NVIDIA: Barely

              Originally posted by smitty3268 View Post
              - Wayland support
              - Intel: Yes, near perfect
              - AMD: Yes
              - NVIDIA: Depends, usually not

              Originally posted by smitty3268 View Post
              - Usable control panels
              On Linux?
              - Intel: No
              - AMD: No
              - NVIDIA: Kind of

              Originally posted by smitty3268 View Post
              - CUDA support
              - Intel: Partially, unofficial
              - AMD: No (translation tools are available, but barely work and require code modification)
              - NVIDIA: Yes, native

              Originally posted by smitty3268 View Post
              - and probably a dozen other things that the vast majority of users would find far more important than 4:4:4 video encoding.
              See? I am a niche user, yes, but I cannot afford a Threadripper.

              Comment


              • #27
                Originally posted by Prescience500 View Post
                The more entrants in a market, the less cartel-like behavior is possible. Intel entering the fray does help things, especially if they can make their own. However, I thought that the anti-competitive behavior was coming from the fabricators themselves? Perhaps I was misinformed.
                The fabricators are of course part of the problem, but the fabricators only produce the wafers at a certain price point; they don't decide the designs or the number of chips per wafer.

                Comment


                • #28
                  Originally posted by zamadatix View Post

                  For as much thought as you put into spinning that story, you'd think you would have realized the GeForce 16 series is still actively made at TSMC on a 12 nm process; its shortage has nothing to do with the shortage of the Samsung 8 nm process used in the 3000 series. Nor does the volume of 3080/3090 being manufactured × 4 equal enough low-to-mid range GPUs - and even if it did, I'd just be talking to a different moron going on about the GPU cartel forcing you to buy enterprise-class GPUs for a performance upgrade instead.

                  Anyway, I just wanted to point out that beyond the iGPU-replaced low end, the mid and high segments still have plenty of models targeting them from all vendors, even if the ones from this generation aren't out yet. I'm not really interested in why you're upset about being unable to afford one today because you need it instantly, or which global shadow cabal is to blame.
                  Even if true, they are not making GeForce 16 in any real capacity, because they are still very expensive. They are overpriced because the demand is higher than the supply. And supply can't satisfy demand because in the past there were many designs from both manufacturers covering the mainstream market, while for years now there have been only the 1050/1650 and RX 560/570.

                  PS: You deserve a ban for calling me a moron indirectly. I am tired of people insulting others here. You know nothing of how the industry works. You are just spinning GPU-manufacturer propaganda. If market laws were still enforced, all 3 major PC companies would have been destroyed in the courts by now. Yet they make tons of money by fleecing their customers, and those immense profits can buy them enough "journalists" and trolls to spread lies to justify their scams.

                  Comment


                  • #29
                    Originally posted by skeevy420 View Post
                    I know a lot of people have been riding the Hate On Intel bandwagon lately, and with 14nm+43 with umpteen exploits I get it, but I'm genuinely excited about a 3rd competitor in the graphics card market. Having two blob-free GPU options to choose from will be pretty nice. What I worry about is this screwing AMD over even more than they already are. I look at all the Intel+Nvidia systems and AMD+Nvidia systems out there and I can't help but worry that they'll be replacing Nvidia GPUs with Intel.

                    I also think this'll force AMD to step up their Linux driver game even more. They won't be the only open player in town so something like Ray Tracing on Linux still not being available 4-5 months after launch just won't be acceptable for much longer. One group of people will go "Look at Intel. Their new card is released, it supports all the features we expect this generation, and the drivers were ready three months before launch" and other people won't be able to hide behind the ole Nvidia blob and shim shield when the AMD driver crowd be coming round the mountain when they come.
                    It's been like that for decades; AMD hasn't given a crap about proper Linux support. Competition isn't enough, they'll still focus more on the Windows gaming market (where their drivers are still sucky and buggy).

                    AMD's software division is just shit; the quality is bad and it shows.

                    Comment


                    • #30
                      Originally posted by M@GOid View Post

                      Regarding iGPUs, you either have very high expectations for a GPU ("I play current AAA games exclusively"), or haven't used one in a good while. The current offerings can do very well in gaming if you are playing up to e-sports types of games.

                      Sure, you can't play CP2077 very well on one, but gaming is much more than AAA.
                      I bought a laptop with an Intel 1165G7 CPU on the premise that the Xe GPU is good enough for basic gaming. I previously had a laptop with a 1050 Ti. My gaming needs aren't extreme - I want to play Rocket League. The 1050 Ti played it just fine at 60 FPS on High settings in 1080p. The Xe can't manage 30 FPS at 1080p on Performance; I had to drop down to 720p for that. That's just too ugly and low-res to be playable.

                      A 1050 is a midrange GPU, but it's far more powerful than an integrated GPU, even the supposedly great Xe. Now I'm probably looking for an eGPU enclosure and a low end GPU to put in it, but even a 1050 costs 185 pounds!

                      Comment
