Intel Arc A380 Desktop Graphics Launch In China


  • #41
    Originally posted by brucethemoose View Post
    Nvidia GPUs are currently not CXL devices, neither is the Arc. And I run a PRIME setup... its kinda annoying TBH. And good luck finding an app that can do direct transfers between different GPU vendors.
    Videocardz has done it again. Looks like Intel is working on a powerful new Xe-based 7nm GPU codenamed "Ponte Vecchio" -  after an old bridge in Florence, Italy. A quick Google search of the codename also comes up with a Linkedin recruitment ad for an Engineer that will be working on Arctic Sound/Ponte Vecchio-based solutions. […]

    That is something you missed. Intel's SLI equivalent for Xe is implemented on top of CXL in the data centre, so Intel data-centre GPUs like Ponte Vecchio do have full CXL. Arc is the general consumer-targeted line; whether a consumer Arc part ships with CXL, time will tell. With CXL being Intel's SLI equivalent, it's arguably only a matter of time.

    The reality is that we are heading into a change, and at the start of a change the software normally does not exist yet; the hardware has to come first.

    With AMD and Nvidia signed on to CXL, it is more a matter of time until we have AMD and Nvidia CXL cards, at least in the data centre. Once more than one brand supports CXL, it is only a matter of time before software starts exploiting it to transfer data directly GPU to GPU across brands.

    Originally posted by brucethemoose View Post
    IDK much about what users are doing with those FPGAs and ASICs. I've read the claims of bitrate savings over the popular CPU encoders and GPU blocks, but TBH I think a lot of it is snake oil.
    Compare how the three GPU industry titans' HEVC hardware encoders compare to x265 and if it's worth using for encoding your videos!


    There is more to it. FPGA/ASIC encoders normally aim for quality closer to that of CPU encoding, and there can be a lot of quality difference depending on which hardware media encoder you use.

    The reality is that there are times when the Intel media encoder has advantages over the Nvidia one. Once that can be used without heavy CPU overhead, we should expect software supporting it to appear.

    This is more a matter of time, in my opinion. Benchmarking of the different hardware media encoders from Intel, AMD, Nvidia and other vendors shows they are not direct one-to-one replacements for each other. Horrible as it sounds, you might want to encode with all of them and, on the cutting-room floor (in the video editor), keep the sections that came out best.
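
    To make that concrete, here is a minimal sketch of such a per-vendor comparison using ffmpeg's libx265 (CPU reference) plus the hevc_qsv, hevc_nvenc and hevc_amf hardware encoders. Which of these actually works, and what the right quality knobs are, depends on your ffmpeg build and installed hardware, so treat the flags as illustrative rather than tuned settings, and "clip.mkv" as a hypothetical input file:

    #!/usr/bin/env python3
    # Sketch: encode the same clip with the CPU reference (libx265) and the
    # vendor HEVC hardware encoders, so the outputs can be compared by eye
    # or with a quality metric afterwards.
    import subprocess

    SOURCE = "clip.mkv"  # hypothetical input file

    encoders = {
        "cpu_x265":     ["-c:v", "libx265",    "-crf", "22", "-preset", "medium"],
        "intel_qsv":    ["-c:v", "hevc_qsv",   "-global_quality", "22"],
        "nvidia_nvenc": ["-c:v", "hevc_nvenc", "-cq", "22"],
        "amd_amf":      ["-c:v", "hevc_amf",   "-quality", "quality"],
    }

    for name, args in encoders.items():
        cmd = ["ffmpeg", "-y", "-i", SOURCE, *args, "-an", f"out_{name}.mkv"]
        print("running:", " ".join(cmd))
        # A failure here usually just means that encoder isn't available on
        # this machine; keep going and compare whatever did get encoded.
        subprocess.run(cmd, check=False)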



    • #42
      Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
      No thanks. The RX 6400 is an insulting piece of crap laptop GPU repackaged for desktop duty because AMD could get away with it during the low inventory mining frenzy. I'm not buying a neutered X4 lane 64 bit memory interface GPU with two display outputs and gimped decode along with nonexistent encode capabilities.
      The memory interface is 64-bit because it has Infinity Cache sitting off to the side.

      Originally posted by pWe00Iri3e7Z9lHOX2Qx View Post
      That half a decade old Quadro P2000 I mentioned is a much better card.
      The P2000 was a neutered GTX 1060 that sold for at least twice the price of that card because it's "a workstation product".

      The RX 6500 XT averaged about 25% faster than the GTX 1060, and the GTX 1060 is at least that much faster than the P2000. So, that should mean the RX 6400 easily beats the P2000.
      The only way you can justify the P2000 is on features, or if you get a really good deal on a lightly-used one.
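
      A quick back-of-the-envelope check of that chain of comparisons, where the two ~25% gaps are the figures quoted above and the RX 6400's share of RX 6500 XT performance is a rough assumption rather than a benchmark result:

      # Chain the quoted relative-performance figures; the 0.80 ratio for the
      # RX 6400 vs the RX 6500 XT is an assumption (fewer shaders, lower
      # clocks), not measured data.
      p2000      = 1.00
      gtx_1060   = p2000 * 1.25        # GTX 1060 at least ~25% faster than the P2000
      rx_6500_xt = gtx_1060 * 1.25     # RX 6500 XT ~25% faster than the GTX 1060
      rx_6400    = rx_6500_xt * 0.80   # assumed ~80% of an RX 6500 XT

      print(f"RX 6400 vs P2000: ~{rx_6400 / p2000:.2f}x")  # ~1.25x even with that haircut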

      I think your main problem is that you're still in denial about inflation.



      • #43
        Originally posted by smitty3268 View Post
        you have to remember that for all those decades of work they've never really had to care about getting good performance in games. Or even, well, working at all in games.
        That's nonsense. Lots of people game on integrated graphics. You have to turn down the quality and resolution, but it's often quite doable.

        And the idea that performance on integrated graphics doesn't matter is equally laughable. Intel and AMD both put tremendous amounts of work into squeezing the best performance from their iGPUs that they can.

        The main reason Intel's iGPUs have lagged behind isn't drivers - it's under-investment in the hardware. Intel toyed around with Iris graphics and eDRAM, but it wasn't until Gen11 that they started to mount a serious effort at overhauling and beefing up the hardware to the point where it could start to compete with AMD. And with the Xe (Gen12) iGPU in Tiger Lake, they finally achieved a clear lead over AMD's Vega 8 (albeit on a process that was still inferior to TSMC N7).



        • #44
          Originally posted by coder View Post
          That's nonsense. Lots of people game on integrated graphics. You have to turn down the quality and resolution, but it's often quite doable.

          And the idea that performance on integrated graphics doesn't matter is equally laughable. Intel and AMD both put tremendous amounts of work into squeezing the best performance from their iGPUs that they can.
          I'm not saying they didn't attempt to optimize as much as possible.

          I'm saying that when the newest AAA game came out and had flickering textures and occasional crashes after an hour of gameplay, they didn't really care, because it only ran at 10 fps anyway and nobody was that bothered. And they weren't spending the next month dynamically rewriting all its shaders to pick up that extra 15% bonus speed like AMD and NVidia drivers do. It's not like they couldn't find anything else to work on improving.

          That adds up over time to a lot of games with potentially buggy behavior and missing extra optimizations the competition has.

          And yes, stuff like the Sims or PUBG are going to work fine because they are lightweight enough that people actually run them on Intel and so they get supported. But not everything is.

          I will say, in the last few years Intel has really stepped up their game on mobile hardware, so this is becoming less true recently. But then you can't really rely on them having 15+ years of experience doing that, as was initially claimed.
          Last edited by smitty3268; 18 June 2022, 07:06 PM.



          • #45
            Originally posted by coder View Post
            BTW, nobody is considering the impact of China's crypto mining ban on pricing. This is a 6 GB card, which means it's a target for miners. That could push up prices for it, outside of China.
            Crypto currency values have collapsed worldwide recently. It is questionable if mining is still profitable.
            Sadly, five minutes with DuckDuckGo did not get me a definitive answer.



            • #46
              Originally posted by Rabiator View Post
              Crypto currency values have collapsed worldwide recently. It is questionable if mining is still profitable.
              Yeah, I was thinking of that. Intel won't have known this prior to launch, but China's ban has been going on for long enough that maybe they figured a 6 GB card would actually make it into gamers' hands there.

              Also, even with some prices way down, if you live in an area with heavily-subsidized electricity (and, in the case of kids, that includes parents paying the bill), then the threshold for profit is surely still low enough. Kids might individually be running only a couple of machines, but the demand could expand the market for hardware enough to keep inventories low and elevate prices.
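
              To illustrate why cheap or parent-paid electricity keeps that profit threshold so low, here is a toy profitability calculation; every number in it is a made-up placeholder, not market data:

              # Toy mining-profitability check with purely illustrative placeholder
              # numbers, not real hash rates, coin prices or power draws.
              daily_revenue_usd = 0.60   # hypothetical earnings per card per day
              card_power_kw     = 0.075  # hypothetical wall draw of a small 6 GB card
              hours_per_day     = 24

              # $/kWh: "free" (someone else pays), subsidized, typical, expensive
              for kwh_price in (0.00, 0.05, 0.15, 0.30):
                  daily_cost = card_power_kw * hours_per_day * kwh_price
                  profit = daily_revenue_usd - daily_cost
                  print(f"electricity at ${kwh_price:.2f}/kWh -> daily profit ${profit:+.2f}")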

              Whatever the case, it sounds like Intel is facing some kind of production-related problems, and what they most fear is a US/worldwide launch plagued by poor hardware availability or by other issues (e.g. driver support for popular games, as was suggested). If their first big launch goes poorly, it could do enough reputational damage to kill their ambitions in the gaming market.
              Last edited by coder; 20 June 2022, 10:54 AM.



              • #47
                Michael, officially the Arc A380 is now available in the US via Newegg at $139:

                Buy ASRock Challenger Arc A380 6GB GDDR6 PCI Express 4.0 ITX Video Card A380 CLI 6G OC with fast shipping and top-rated customer service. Once you know, you Newegg!




                • #48
                  Originally posted by pinguinpc View Post
                  Michael, officially the Arc A380 is now available in the US via Newegg at $139:

                  Buy ASRock Challenger Arc A380 6GB GDDR6 PCI Express 4.0 ITX Video Card A380 CLI 6G OC with fast shipping and top-rated customer service. Once you know, you Newegg!


                  Already have two A380s coming like tomorrow or so.... Will have up Linux info later in the week.
                  Michael Larabel
                  https://www.michaellarabel.com/



                  • #49
                    Originally posted by Michael View Post

                    Already have two A380s coming like tomorrow or so.... Will have up Linux info later in the week.
                    Oh my.........





                    • #50
                      Originally posted by Michael View Post

                      Already have two A380s coming like tomorrow or so.... Will have up Linux info later in the week.
                      Can you test with and without Resizable BAR, to see how it affects Linux?

                      And if you can, also test whether it's possible to run the Arc with only power limit 1 (aka PL1, 65 W), to know whether it could work without the 8-pin connector.
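
                      For the Resizable BAR half of that, here is a minimal sketch of how it could be eyeballed from sysfs. It assumes the Arc shows up as card0 (adjust the path on multi-GPU boxes); which BAR index is the VRAM aperture varies by GPU, so just look at the sizes: a BAR close to the full 6 GiB suggests ReBAR is active, while a 256 MiB aperture suggests it is not.

                      # Print the PCI BAR sizes the kernel exposes for the GPU, to judge
                      # whether Resizable BAR is in effect. The card0 path is an assumption;
                      # the Arc may be card1 etc. on systems with more than one GPU.
                      from pathlib import Path

                      resource = Path("/sys/class/drm/card0/device/resource")

                      # First six lines of the "resource" file are the BARs: start, end, flags.
                      for i, line in enumerate(resource.read_text().splitlines()[:6]):
                          start, end, flags = (int(x, 16) for x in line.split())
                          if start == 0 and end == 0:
                              continue  # unused BAR slot
                          size_mib = (end - start + 1) / (1024 ** 2)
                          print(f"BAR {i}: {size_mib:8.0f} MiB  flags=0x{flags:x}")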

