
Linux 6.2 Will No Longer Treat Intel Arc Graphics As Experimental


  • #21
    For those who might own one: Is the AV1 encode really that good? I'm waiting for an affordable card with AV1 hardware encoding support to replace my RX 580.



    • #22
      Originally posted by furtadopires View Post
      For those who might own one: Is the AV1 encode really that good? I'm waiting for an affordable card with AV1 hardware encoding support to replace my RX 580.
      EposVox on YouTube is one of the best streamer-related channels in the world. He seems to think so, and I believe most of his testing was on an A380. It's the sole reason I bought an A750 (mine was delivered Monday last week).



      • #23
        Originally posted by rogerx View Post
        Should also note that the Intel Arc graphics cards require Wayland. I'm having the best experience using the Wayland/Sway desktop.

        X/Xorg with the i915 modesetting driver only gives me displays going into power-saving mode after starting X/Xorg.

        Most seem to be having good results with Wayland/Sway.
        I had the same experience when comparing i3 and Sway: Xorg was too buggy, and with GNOME on Xorg I was sometimes getting a black screen.



        • #24
          Originally posted by qarium View Post
          For me it looks like AMD is still better for Linux support than Intel. AMD is also better than Nvidia from my point of view.
          Just no. AMD puts more public effort into Linux/open source (and a lot of the amdgpu kernel stuff works brilliantly), but they achieve less than what Intel provides; Intel has more features working. Nvidia is off the radar, simply because they don't supply any open-source software stack, period.



          • #25
            Originally posted by coder View Post
            As for the rest, we know they've had issues with dGPU memory management.
            Is that the only reason the DG2 software stack wasn't ready on launch day? Man, I'd quit my current job to develop better software for Intel if they wanted it.



            • #26
              Originally posted by furtadopires View Post
              For those who might own one: Is the AV1 encode really that good? I'm waiting for an affordable card with AV1 hardware encoding support to replace my RX 580.
              I finished my testing some weeks ago; the current DG2 multimedia IP block/ASIC really is good quality, yes, talking about VMAF and visual comparison.

              I use ICQ 15/18 for 4K AV1 and 4K HEVC; CQP mode is just unreal in terms of bitrate per second.

              Funny thing: their ASIC and software have changed over the years. I always use the same source material for my ASIC tests, and even Intel QSV output changed between the iGPU and the dGPU:
              Code:
              Length Name
              ------ ----
              2319667940 docu.2018.1.ffmpeg.hevc_qsv_skl_icq_22.aac_nv12.mkv
              1973101041 docu.2018.1.ffmpeg.hevc_qsv_dg2_icq_22.aac_nv12.mkv
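
              For anyone wondering what such encodes look like on the command line, here's a minimal sketch using FFmpeg's QSV encoders; the file names and quality values are illustrative placeholders, not my exact commands (with QSV, ICQ mode is picked by giving -global_quality without a target bitrate):
              Code:
              # sketch, assuming an FFmpeg build with QSV/oneVPL support and an Arc card
              ffmpeg -hwaccel qsv -i input.mkv -c:v av1_qsv  -global_quality 18 -c:a copy out_av1.mkv
              ffmpeg -hwaccel qsv -i input.mkv -c:v hevc_qsv -global_quality 18 -c:a copy out_hevc.mkv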
              Last edited by photom; 21 November 2022, 06:30 PM.



              • #27
                Originally posted by photom View Post
                Is that the only reason the DG2 software stack wasn't ready on launch day?
                Enough hyperbole. I can't give you a straight answer, because I have no inside knowledge.

                It sounds to me like you're calling Intel lazy or incompetent, but I believe they're neither of those things. So, I'm trying to look for other factors that explain the current situation, and they're not hard to find. I'm sure that doesn't help you with your problems, but again I point to the monumental feature set they're taking on. They don't just have to get everything working; it also has to be competitively fast and stable. That's more work than you think it is.

                Originally posted by photom View Post
                DG2 Multimedia IP block/ASIC
                You say "ASIC", but there's probably quite a lot more firmware running in there than you'd expect.
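
                As a rough illustration (not verified on a DG2 card, and the blob names vary by kernel and linux-firmware version), you can get a hint of how much firmware i915 pulls in for these GPUs:
                Code:
                # list DG2-related firmware blobs and see what the driver reports at boot
                ls /lib/firmware/i915/ | grep -i dg2
                dmesg | grep -iE 'i915.*(guc|huc|dmc|firmware)'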
                Last edited by coder; 21 November 2022, 07:07 PM.



                • #28
                  Originally posted by coder View Post
                  Enough hyperbole. I can't give you a straight answer, because I have no inside knowledge.
                  It sounds to me like you're calling Intel lazy or incompetent, but I believe they're neither of those things.
                  True, enough hyperbole. We're just waiting for things to work some months/years after we bought our hardware, like we're used to with AMD stuff. OK, hyperbole again, I'll just shut up now. I only recently switched my shirt from team AMD to Intel, so I'm fairly new in that department.

                  Originally posted by coder View Post
                  You say "ASIC", but there's probably quite a lot more firmware running in there than you'd expect.
                  I would love to learn more about the IP blocks Intel/AMD/Nvidia buy from other vendors, and the ASICs they integrate into their GPU tech. It would be really nice to understand more of what is in there. I've already learned a lot from reading source code and sometimes reading between the lines.



                  • #29
                    Originally posted by photom View Post
                    I would love to learn more about the IP blocks Intel/AMD/Nvidia buy from other vendors, and the ASICs they integrate into their GPU tech. It would be really nice to understand more of what is in there. I've already learned a lot from reading source code and sometimes reading between the lines.
                    AMD has the best docs openly available, but they only seem to cover the shader ISA, from what I've seen.

                    I think there was an IP dump of Nvidia Ampere docs & code? I don't advocate or participate in trafficking stolen IP, but it should mean there's information out there about some of their GPU internals.



                    • #30
                      Originally posted by furtadopires View Post
                      For those who might own one: Is the AV1 encode really that good? I'm waiting for an affordable card with AV1 hardware encoding support to replace my RX 580.
                      It's pretty mediocre; it's still outdone by CPU encoding, but that will always be a given. However, assuming hardware encoding is a necessity, AV1 encode is leaps and bounds ahead of the HEVC and especially the AVC of previous-gen cards. So, as always, it depends on the use case: want to encode your videos folder? Maybe don't use hardware encoding in the first place. But if you need to record while playing a game that needs its fair share of CPU, then it's a really good value add.
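
                      As a rough sketch of that trade-off (these are real FFmpeg encoders, but the file names and quality values are made-up placeholders):
                      Code:
                      # archival-quality software encode: slow and CPU-heavy, best quality per bit
                      ffmpeg -i input.mkv -c:v libsvtav1 -crf 30 -preset 6 -c:a copy archive_av1.mkv
                      # hardware AV1 encode on Arc via QSV: near-zero CPU cost, good for live capture
                      ffmpeg -hwaccel qsv -i input.mkv -c:v av1_qsv -global_quality 28 -c:a copy capture_av1.mkv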

