Intel Arc Graphics A380: Compelling For Open-Source Enthusiasts & Developers At ~$139


  • #61
    Originally posted by davidbepo View Post
    Hi Michael, first of all thanks for testing, but I have one question: do you have the Phoronix Test Suite command to compare against this? I have a feeling this isn't much (if at all) faster than TGL in its current form and wanted to do some comparisons on my laptop, but I didn't see the command/test name anywhere.
    I need to make it public from private mode, but I'm on an airplane and it's not letting me connect to the VPN at the moment; I should be able to get it set once I'm off.
    Michael Larabel
    https://www.michaellarabel.com/

    Comment


    • #62
      Originally posted by tildearrow View Post
      A Windows vs. Linux benchmark is the next thing I want to see.
      I have a feeling Windows is currently faster...
      I think it will definitely be faster in DX12 and maybe a bit faster in Vulkan as well, because last I heard those are currently the only two well-optimized drivers on Windows, while the others are not. But since ANV is in relatively better shape than Iris, as you saw in the benchmarks, I think there is a chance DX9-11 games (via DXVK) will be faster on Linux (unless Intel quickly catches up with older-API optimizations on Windows).

      Comment


      • #63
        Originally posted by coder View Post
        What's happened with PC GPU prices over the last two years isn't primarily the duopoly of producers. It's crypto, pandemic, & deep learning fueling demand, up against a historical crunch in silicon production capacity. Rampant scalping only further contributed to scarcity and astronomical pricing.
        Sure, there were many more reasons for the price levels we were seeing. But the GPU duopoly certainly contributed to it. If you want proof of that point: AMD could have been more aggressive with pricing during the RDNA1 cycle and could have made it the spiritual successor to the RX 580, as it was cheap to produce and the transport and wafer cost explosions hadn't happened yet, but they simply didn't do it.

        Originally posted by coder View Post
        Intel being in the game would've made virtually no difference, because they'd all be sourcing from the same fabs.
        That's a bold claim, and here is why: while TSMC fab capacity might be a limiting factor, Intel could have gotten either more wafers or a better price per wafer than the others due to their sheer size and long history with TSMC; that is also why they got the same preferential treatment as Apple. Intel would have had to play the value card anyway to get any traction in the dGPU market, sacrificing margins to buy both mind share and market share. And having more GPU options and more overall volume in the market would have put price pressure on AMD and Nvidia by the laws of the market. You might remember that we didn't get low-end cards for a very long time, which drove demand to the upper-tier cards. If Intel had launched Arc at the end of 2021, they could have had double-digit market share by now.

        Originally posted by coder View Post
        They have limited leverage over their suppliers and prices have gone up throughout the economy.
        They put pressure on some suppliers to expand their capacity, e.g. substrate and other IC suppliers. While AMD has limited leverage, Intel certainly has much more.

        Originally posted by coder View Post
        That's why I think the "rumors" of cancellation are nothing more than a bunch of bored gamers echoing each other's scornful sentiments that their unrealistic expectations weren't met.
        Better watch this: https://www.youtube.com/watch?v=WXSmgdl3E2o

        and this one on that matter: https://www.youtube.com/watch?v=DH2s5HeZzs8

        Both YouTubers have their own proven sources at Intel, and if both are talking about the possibility of Arc being cancelled and such a review by the upper management team, then Intel at least considers it an option. There is no point in reading too much into it at this point in time, but I think it is part of their internal mind games to put more pressure on AXG to improve its execution. Both YouTubers also make that clear in their reporting.
        Last edited by ms178; 28 August 2022, 02:41 PM.

        Comment


        • #64
          Originally posted by numacross View Post
          A 75W card having an 8-pin external power connector doesn't make the TBP sound realistic. Intel's CPU TDP definition (only at base clocks) isn't realistic either.
          TPU tested the A380 and found that it's using 94W at maximum load with 102W spikes:
          [Image: TPU power-consumption chart]
          The desktop power usage is higher than expected for a card this small.

          Edit: Interestingly, the GTX 1060 peaks at 125W while being ~25% faster, so that doesn't look very good for the A380, considering the 1060 is a five-year-old GPU made on a 16nm process.
          If it can't beat a GTX 1060 or RX 480/580 then Intel is wasting everyone's time. You can buy those cards used for less than the A380 costs. On Steam the GTX 1060 is the most popular Nvidia card, while the RX 580/480 are the most popular for AMD. Who's gonna buy this thing?
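As a sanity check on the efficiency argument above, here is a quick perf-per-watt calculation. The wattages are the TPU figures quoted in the post; the ~25% performance delta is the poster's estimate, not a measurement:

```python
# Rough perf-per-watt comparison using the figures quoted above.
# Performance is normalized so that the A380 = 1.0.
a380_perf, a380_watts = 1.00, 94         # TPU max-load measurement
gtx1060_perf, gtx1060_watts = 1.25, 125  # "~25% faster" per the post

a380_ppw = a380_perf / a380_watts
gtx1060_ppw = gtx1060_perf / gtx1060_watts

print(f"A380:     {a380_ppw:.4f} perf/W")
print(f"GTX 1060: {gtx1060_ppw:.4f} perf/W")
```

Interestingly, on these numbers the two land within a few percent of each other on peak-load perf-per-watt (about 0.0106 vs. 0.0100), so the A380's weakness here is less efficiency under load than raw performance and the high desktop draw mentioned in the quote.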

          Comment


          • #65
            This is a fake launch. Here in Germany, and in Europe in general, there are zero Intel Arc A380 cards...

            [Link preview (translated from German): Geizhals price comparison for the ASRock Intel Arc A380 Challenger ITX 6GB OC - A380 CLI 6GO. Connectors: 1x HDMI 2.0b, 3x DisplayPort 2.0. Graphics: Intel Arc A380, 6GB GDDR6, desktop.]


            DG1 was a fake launch, only to supply some developers.
            DG2 is a fake launch, only to supply some benchmark and testing websites like Phoronix.

            If there is not a single card in the entire EU, this is clearly a fake launch.

            You can say this card is as cheap as ~$139 as long as you want; if there is not a single card to buy, that price is fake.

            I think this second-generation Intel GPU is right now only for advanced testers and people like [email protected] who help the development with their testing and checking all kinds of stuff.

            This is not really a card for consumers right now, even if the AV1 part and the cheap price would be nice for people who do not game but want AV1 encode/decode...

            I think Intel as a complete entity is "evil", but this fresh wind of competition against Nvidia and AMD is a good thing.

            AMD could have doomed this Intel Arc A380 if the 6400/6500 had AV1 decode...
            So now you have the situation that the AMD 6400/6500 is much faster, and even faster per dollar, but has no AV1 decode or encode.
            Just imagine this: if AMD makes a 7400/7500 with AV1 decode, most people will no longer have any interest in the Intel GPU.

            So basically everyone who buys this Arc A380 is telling AMD: put AV1 into the 7400/7500 or else we will support Intel...

            So by this logic it is a good thing; right now the next best option is the AMD RX 6600 at 275€ with AV1 decode...

            Phantom circuit Sequence Reducer Dyslexia

            Comment


            • #66
              According to the following Internet article, Resizable BAR is not absolutely required for using Intel Arc GPUs:

              "No, you won’t need a Core 10th Gen CPU or newer for Intel Arc graphics cards; Demystifying Intel’s Arc quick start guide"
              https://www.rockpapershotgun.com/no-...graphics-cards


              Intel Arc A-Series Graphics - Desktop Quick Start Guide
              FAQs

              Q: Why do I need to enable Resizable BAR?
              A: Resizable BAR must be enabled for optimal performance in all applications using Intel® Arc™ A-Series Graphics.


              I agree; if true, that seems misleading.


              If so, I cannot wait until I am able to rid my computer system of the Nvidia proprietary drivers. However, it looks like I'll be waiting until the end of 2022 or the beginning of 2023 to purchase an Intel Arc GPU as a successor to my current Nvidia GTX 670.

              Last edited by rogerx; 28 August 2022, 04:03 PM.

              Comment


              • #67
                Originally posted by rogerx View Post
                According to the following Internet article, Resizable BAR is not absolutely required for using Intel Arc GPUs:

                "No, you won’t need a Core 10th Gen CPU or newer for Intel Arc graphics cards; Demystifying Intel’s Arc quick start guide"
                Intel have confirmed to us that their Arc graphics will work with Intel and AMD CPUs alike, though Resizable BAR support comes strongly recommended.


                If so, I cannot wait until I am able to rid my computer system of the Nvidia proprietary drivers. However, it looks like I'll be waiting until the end of 2022 or the beginning of 2023 to purchase an Intel Arc GPU as a successor to my current Nvidia GTX 670.
                It is not required, but testing has shown that running without it carries a significant performance impact, between 10 and 20%. The reduction in average FPS doesn't even tell the whole story, because it also results in horrendous stuttering.
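For anyone who wants to check on Linux whether a card's BAR has actually been resized, one rough approach (a sketch, not official Intel tooling) is to parse the PCI device's sysfs `resource` file, whose lines are start/end/flags hex triples; the device address below is a placeholder you would look up with `lspci` first:

```python
def bar_sizes(resource_text: str) -> list[int]:
    """Parse sysfs 'resource' file content (lines of 'start end flags'
    hex triples) and return the size in bytes of each populated region."""
    sizes = []
    for line in resource_text.splitlines():
        start, end, _flags = (int(tok, 16) for tok in line.split())
        if end > start:
            sizes.append(end - start + 1)
    return sizes

# On a real system you would read the text from
# /sys/bus/pci/devices/<pci-address>/resource (address via lspci).
# Sample content: one 256 MiB BAR followed by an unpopulated region.
sample = (
    "0x00000000a0000000 0x00000000afffffff 0x0000000000140204\n"
    "0x0000000000000000 0x0000000000000000 0x0000000000000000\n"
)
for size in bar_sizes(sample):
    print(f"{size / 2**20:.0f} MiB")  # prints "256 MiB"
```

A BAR whose size matches the card's full VRAM (e.g. 6 GiB on the A380) suggests Resizable BAR is in effect; a 256 MiB cap suggests it is not.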

                Comment


                • #68
                  Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post

                  I'm not sure why some people seem to be cheering on the demise of Intel dGPUs. The duopoly we have now is not good for consumers. The last few years of mining fueled bullshit prices make Intel even more interesting. They are the only ones with their own fabs, so once they (incredibly slowly) unscrew their < 14nm fiasco, we would eventually end up with a 3rd player who isn't trying to compete with everyone in the world for TSMC capacity. That obviously isn't the case now and Intel needs TSMC just like everyone else, but the path to more GPUs available in the market is there.
                  Of course choice is good. But Intel is a horrible choice. Breaking a duopoly with a terrible company that makes poor products isn't going to accomplish much of anything. Not to mention how many security exploits this will inevitably open up, as Intel seems completely incapable of making secure silicon. It's just a pathetic attempt at making a useless alternative.

                  Comment


                  • #69
                    Such an amazing and comprehensive set of benchmarks.

                    I hope the rumors are true that AMD is keeping AM4 for now as a budget platform.

                    This is the perfect card for pairing with a Ryzen 5500 for budget-minded Ubuntu gaming rigs (or Manjaro if the newbie is a bit braver).

                    Comment


                    • #70
                      Originally posted by birdie View Post

                      This card's raytracing performance is laughable and is just there for show, and tensor-like blocks barely take any space as they are very simple.
                      If that's the case, why did the RTX cards perform so poorly relative to how many extra transistors were added?

                      If you say it's not the tensor cores, then it's definitely some new feature, and I'd bet Arc has it too. Unless you are claiming that the RTX series just randomly happened to suck architecturally and it had no relation to all the new hardware features being added. I'm very doubtful of that.

                      The 2060 had 10.8 billion transistors, versus 7.2 billion for the 1070 Ti at similar performance: 50% more transistors for no performance gain. It makes much more sense to compare those two. If you apply that multiplier to the 1060, you get 6.6 billion, which is around 10% lower than Arc. So still better, but nothing as extreme as your initial post would suggest, considering it's Intel's first gen. And the 2060 didn't have an equivalent video encoder either.
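The transistor arithmetic above can be double-checked in a few lines. The 2060 and 1070 Ti counts are from the post; the GTX 1060's 4.4 billion and the Arc A380's 7.2 billion are assumptions pulled from public spec listings, not from this thread:

```python
# Transistor counts in billions.
rtx2060, gtx1070ti = 10.8, 7.2  # figures from the post
gtx1060, arc_a380 = 4.4, 7.2    # assumed from public spec listings

multiplier = rtx2060 / gtx1070ti     # RTX overhead at similar performance
hypothetical = gtx1060 * multiplier  # a "1060 with RTX-era features"

print(f"RTX transistor overhead: {multiplier - 1:.0%}")
print(f"Hypothetical RTX-1060:   {hypothetical:.1f}B")
print(f"vs. Arc A380:            {1 - hypothetical / arc_a380:.0%} lower")
```

That works out to roughly 8% fewer transistors than the A380's die, consistent with the "around 10% lower" estimate above.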
                      Last edited by smitty3268; 28 August 2022, 05:58 PM.

                      Comment
