Intel Arc Graphics A380: Compelling For Open-Source Enthusiasts & Developers At ~$139


  • #71
    Originally posted by atomsymbol
    The availability of GPUs in Europe might deteriorate further over the next few years because inflation in the EU is expected to be approximately 10% per year - while inflation in other parts of the world (such as: China) is going to be approximately 3% per year, which means that PPP (Purchasing power parity) of EU in the next few years is going to see a temporary decrease.
    Not so dramatic if you factor in that, in the past, Europe always had the highest purchasing power, even higher than the USA...

    Also, what do you think inflation does to the sales of GPU cards? It only means the price goes up...

    But it makes no sense to say there are no sales in Europe just because there is 10% inflation... makes no sense to me.
    Phantom circuit Sequence Reducer Dyslexia



    • #72
      Originally posted by numacross View Post

      It is not required, but it has been tested that not having it has a significant performance impact between 10 and 20%. The reduction of average FPS doesn't tell the whole story because it also results in horrendous stuttering
      I can confirm similar results from my own brief Google search on Resizable BAR optimization; however, most sources report 1-2% improvements, while only some report 10-20% improvements along with anomalies. I'm guessing those seeing the 10-20% gains are the ones on the most recently manufactured hardware.

      Really good stuff here! At first I thought my hopes were dashed, thinking I wouldn't be able to afford an entirely new (>10th-generation) CPU platform just to use an Intel Arc GPU!
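      In case it helps anyone checking their own box: a rough sketch of mine (not from the article) that reports whether Resizable BAR is exposed on the GPU by grepping `lspci -vv`. It usually needs root to show the capability block, and the exact lspci wording may vary by version.

```python
# Rough sketch: report whether Resizable BAR is exposed on installed GPUs
# by parsing `lspci -vv` output. Run as root so the capability list is visible.
import re
import subprocess

def rebar_status():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    # lspci separates devices with blank lines; GPUs show up as VGA/3D controllers.
    for block in out.split("\n\n"):
        if "VGA compatible controller" not in block and "3D controller" not in block:
            continue
        name = block.splitlines()[0]
        size = re.search(r"BAR 0: current size: ([^,\n]+)", block)
        if "Physical Resizable BAR" in block and size:
            print(f"{name}\n  Resizable BAR exposed, BAR 0 current size: {size.group(1)}")
        else:
            print(f"{name}\n  Resizable BAR not reported (missing, disabled, or run as root)")

if __name__ == "__main__":
    rebar_status()
```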



      • #73
        Well, I wasn't expecting it to be quite that bad.

        Hardly compelling. More like, "limping over the finish line so late the next marathon is about to start".

        Time will tell whether this poor showing continues long term. Intel has the knowledge and the capital to stick it out (AMD rode out Bulldozer and a GPU mis-step, although they got hammered as a result), so something good could come of it, provided the bean counters don't pull the plug.

        An appalling showing from Intel, especially considering all the noise Raja Koduri et al were making about it. He has had a hand in some truly great GPUs, but recently...

        Would like to see a custom version without the 8-pin, since it's supposed to be a 75W part. Would also like to see one with a better cooler - that one looks like they tried to find a way to use all their old Celeron stock coolers with a too-large shroud jammed on top. Single slot, for preference... or passive... but I expect passive would require it to lose so much performance that the integrated graphics would look good in comparison.

        I need to go look up performance numbers from a 5700G, 5900HX, 6900-whatever-it-is and Alder Lake Xe iGPU...



        • #74
          Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
          You aren't going to constantly piss away many millions of dollars for development teams to fix AAA day 1 bugs or eek out extra performance when the damn thing is going to run at 4fps.
          You don't run AAA titles on iGPUs except at minimum settings and like 720p resolution. And with those settings, some of them are quite playable.

          Big companies like money and alienating a large portion of your audience isn't good business strategy.



          • #75
            Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
            I'm saying professional focused workstation cards didn't give a fuck about HDMI
            Depends on which kind of professionals. If we're talking video production, then I'm sure plenty do. But, more people are using 4k TVs as monitors than you might expect.

            And then there's the matter of projectors. The ones at my office have HDMI, but not DisplayPort. Probably because they took the cheap route and used home theater projectors, but it still stands as a real-world example.

            Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
            at the time so it's a weird thing to list as a negative for a half decade old card.
            That card isn't being discussed in the context of a "recommended buy" for people at the time, but rather in contrast to this one.



            • #76
              Originally posted by birdie View Post
              This card raytracing performance is laughable and it's just there for show
              I think it's there for API-compatibility and very limited-use scenarios.

              Originally posted by birdie View Post
              and tensor-like blocks barely take any space as they are very simple.
              They're simple so they can be compute-dense. That doesn't mean they're small.



              • #77
                Originally posted by ms178 View Post
                If you want prove for that point, AMD could have been more aggressive with pricing within the RDNA1 cycle and could have made it the spiritual successor to the RX580, as it was cheap to produce and we haven't had the transport and wafer costs explosions back then, but they were simply not doing it.
                The phenomenon was hard to predict and nobody knew how long it would last. Given that, it's hard to fault AMD for not porting their RX 580 to 7 nm (which I guess is what you're saying?).

                I think RDNA was more efficient per transistor anyhow, in which case something like that would've been a waste. The whole point of RDNA was to improve efficiency over GCN.

                And they even did try porting it to 12 nm, in case you don't remember the RX 590. That burned significantly more power and didn't really perform much better than the RX 580. Such an experience probably wouldn't have given AMD confidence that it'd be worth doing, yet again.

                Originally posted by ms178 View Post
                That's a bold claim, here is why: While TSMC fab capacity might be a limiting factor, Intel could have gotten either more wafers or a better price per wafer than the others due to their sheer size and long history with TSMC,
                LOL. You obviously don't know what you're talking about. First, Intel has no real history with TSMC. Certainly not at the scale of AMD or Nvidia. And because they compete with TSMC's other customers and have their own fabs, TSMC isn't going to do Intel any favors - especially anything that might disadvantage customers with more future business potential for TSMC.

                Second, TSMC isn't looking to do anyone any favors. Apple gets special treatment because they pay top $ and probably even take on more risk exposure than other customers. More importantly, mobile SoCs tend to be on the smaller side, which means higher yields on a newer node. That's why you tend to see phone SoCs being the early adopters of new nodes. GPUs are the opposite - comparatively large and therefore more vulnerable to defects.

                Third, I think you're missing the key point: the size of the pie was fixed. Intel bidding up prices to compete with the other two wouldn't really have helped the situation and would've had other effects like making Ryzen 3000 and 5000 even more expensive.

                Originally posted by ms178 View Post
                having more GPU options and overall volume in the market would have put some price pressure on AMD and Nvidia by the laws of the market.
                "the laws of the market." LOL!

                It depends on where the bottleneck is. In a situation like the one we had, where fabs were the bottleneck and production capacity was inelastic, no, it wouldn't have made much difference.

                Originally posted by ms178 View Post
                You might remember that we haven't gotten the low end cards for a very long time, driving the demand to the upper tier cards with it.
                If Intel had launched Arc at the end of 2021, they could have had double-digit market share by now.
                There were still previous-generation GPUs being produced and the A380 can't even compete with them.

                Originally posted by ms178 View Post
                They put some pressure on some suppliers to expand their capacity, e.g. substrate and other IC suppliers.
                There was already more than enough pressure. You can't just flip a switch and scale up those businesses.

                Originally posted by ms178 View Post
                While AMD has limited leverage, Intel for sure does have much more.
                No, and I'll elaborate a little more on why. Intel is investing in new fabs and doubling-down on its foundry business, which is a direct competitor to TSMC. Intel would rather fab all their chips internally, as that's how they achieve the best margins. They're just in a situation where they only really have one competitive node and not enough of it, which brings them to TSMC. But, there's no long-term relationship, here. Intel will go back to using its own fabs as soon as it can, and everyone knows it.

                AMD, on the other hand, needs TSMC, Samsung, or might even use IFS, if Intel ever spins it off into a truly independent entity. More importantly, AMD is a partner who brings more to the table. AMD has a vested interest in working with TSMC to develop and refine chip fabrication and packaging technologies, whereas any assistance Intel gives TSMC, in these departments, is 100% helping competitors (both because TSMC is a competitor and because many of its customers are competitors). So, AMD brings a true partnership, whereas Intel is in it just to satisfy short-term needs.
                Last edited by coder; 28 August 2022, 10:28 PM.



                • #78
                  Originally posted by ms178 View Post
                  ...would have put some price pressure on AMD and Nvidia by the laws of the market.
                  Law of the market:

                  “Everything is worth what its purchaser will pay for it.” Publilius Syrus, first century B.C.



                  • #79
                    Originally posted by birdie View Post

                    There's zero demand for a HW AV1 encoder now. None.
                    Absolutely right, 640k should be enough for everyone.
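                    For anyone who does want to exercise it, here's a rough sketch (assuming an FFmpeg build new enough to include the av1_vaapi encoder plus a recent Intel media driver; the render node and file names below are placeholders):

```python
# Hedged sketch: push a clip through the Arc's AV1 hardware encoder via
# FFmpeg's VA-API path (needs FFmpeg >= 5.1 built with av1_vaapi and a recent
# Intel media driver). The device path and file names are placeholders.
import subprocess

def encode_av1_vaapi(src="input.mp4", dst="output.mkv",
                     device="/dev/dri/renderD128"):
    subprocess.run([
        "ffmpeg",
        "-vaapi_device", device,         # the Arc's render node
        "-i", src,
        "-vf", "format=nv12,hwupload",   # convert and upload frames to the GPU
        "-c:v", "av1_vaapi",             # hardware AV1 encoder
        dst,
    ], check=True)

if __name__ == "__main__":
    encode_av1_vaapi()
```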
                    Last edited by BingoNightly; 28 August 2022, 10:42 PM.
                    Don't expect much and seldom disappointed.



                    • #80
                      Originally posted by atomsymbol
                      The price of GPUs in the European Union - which in most cases are both designed and manufactured outside of EU - varies depending on (1) currency exchange rates (such as: USD to EUR) and (2) on the local tax rates in the various EU countries. In case a EU citizen buys a GPU from AliExpress, the final price is influenced by the import tax rate.
                      10% inflation in the EU means that EU citizens will in total spend less on GPU purchases because they will be forced to spend 10% more on other products and commodities (electricity, heating, gas, food, solar panels, etc).
                      A decrease of PPP in EU is unrelated to the nominal prices of GPUs in the EU.
                      A decrease of PPP in EU is related to (1) how many units of GPUs will be sold in years 2022-2023 in the EU and (2) what GPU models will be sold in years 2022-2023 in the EU. A decrease in PPP might result in EU citizens buying lower-performance GPU models compared to what they were buying in, for example, year 2019 (which does not mean that a GPU bought by an EU citizen in 2023 will be slower in terms of FP32 FLOPS than a GPU bought by an EU citizen in 2019).
                      A decrease of PPP in EU might increase availability of lower-end GPUs in EU and decrease availability of top-end GPUs in EU - but this depends on other factors such as the supply chain's readiness/willingness to deliver those lower-end GPUs in higher quantities (and thus not to deliver those lower-end GPUs to other markets (e.g: to China)).
                      Everything you said is correct and fine...

                      But tell me why I can buy a 3000€ Nvidia 3090 Ti or a 1200€ AMD 6950 XT in Europe, yet I cannot buy a 130€ card?

                      According to your theory, people are suffering under 10% inflation, but they have the money to buy a 3000€ 3090 Ti or a 1200€ 6950 XT - and of course no money for a 130€ card...

                      Man, something in your theory really does not fit.

                      But I have an explanation for you... I am on welfare (449€), and this 10% inflation hits me hard.

                      My brother earns 30,000€ per month. He is surprised that his weekly groceries from the supermarket cost 400€ instead of 200€... but he doesn't fucking care and still buys the most expensive organic meat and the best food he can find.

                      By this simple example of me and my brother: the people who buy a 1200€ or a 3000€ card don't suffer from 10% inflation, because they earn 30,000€ per month, but people on 449€ welfare are hit very hard by it.
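                      To put rough numbers on it (a toy calculation using only the made-up figures above, nothing official):

```python
# Toy calculation with the illustrative numbers from the post above
# (449 EUR/month welfare vs. 30,000 EUR/month income) - not real statistics.
incomes = {"welfare": 449, "high earner": 30_000}                 # EUR per month
cards = {"A380-class": 130, "6950 XT": 1_200, "3090 Ti": 3_000}   # EUR per card

for who, income in incomes.items():
    for card, price in cards.items():
        share = price / income
        print(f"{who:>11}: {card:>10} = {share:7.1%} of one month's income")
```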

                      So maybe that's why there is no 130€ card to buy - maybe the market for it just isn't there at all right now.

                      Phantom circuit Sequence Reducer Dyslexia

