Intel Arc Graphics vs. AMD Radeon vs. NVIDIA GeForce For 1080p Linux Graphics In Late 2023


  • #51
    Originally posted by Panix View Post

    What 'mistake' are you referring to? Are you saying my presumptions are inaccurate or incorrect? I think the only AMD GPU series I'm interested in is the 7900 series. I looked at Blender and DaVinci Resolve benchmarks, and performance looks pretty good IF, for Blender, HIP-RT ever works. It's experimental on Windows, and who knows how long that status will last.

    As for basic settings: what if I want the GPU as quiet as possible? For the 7000 series, the fan-curve and voltage-regulation options are really limited in CoreCtrl, unless this has changed? I think one of the TuxClocker devs replied here - I can't remember what the reply was - but I think that is being worked on? If I had a 3090 (for example), I think most of these options are working atm, right? They might even be working for a 40 series - I am not sure - but my guess is that support is probably closer to being realized than for the AMD RDNA 3 cards? HDMI 2.1: still not working. See the pattern here? Or am I making yet another mistake? /s :-/

    Edit: re: the TuxClocker web page you sourced - AMD and Nvidia options look similar, but power-limit options are lacking on the AMD 7000?

    On AMD you have the "Radeon OverDrive" feature. If it's not enabled, you only see basic features. TuxClocker, the currently maintained tool, has not caught up to the no-longer-maintained WattmanGTK, which was nicer in this respect: WattmanGTK displayed all the settings that were possible at the time and then gave you the mask to apply if you wanted to see or alter them. TuxClocker only displays what the current OverDrive mask allows, and even then not everything.
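As a concrete illustration, the OverDrive gating described above can be inspected from sysfs. A minimal sketch, assuming the AMD GPU is card0 (adjust the index for your system); amdgpu.ppfeaturemask=0xffffffff is the usual boot parameter to unmask OverDrive, at your own risk:

```shell
# Check whether amdgpu's OverDrive interface is exposed.
# "card0" is an assumption - substitute your AMD GPU's card index.
card=/sys/class/drm/card0/device

if [ -r "$card/pp_od_clk_voltage" ]; then
    # OverDrive is enabled: this file lists adjustable clock/voltage ranges.
    cat "$card/pp_od_clk_voltage"
else
    # OverDrive is masked off; it can be unmasked with the kernel boot
    # parameter amdgpu.ppfeaturemask=0xffffffff (at your own risk).
    echo "OverDrive interface not exposed on this system"
fi
```

Tools like TuxClocker ultimately read and write these same sysfs files, which is why what they can show depends on the current mask.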


    The HDMI 2.1 issue is a legal problem. Of course, Nvidia breaks the HDMI non-disclosure agreement with its open-source kernel-mode driver release. My guess is that the HDMI-like problems will go away in time: now that Nvidia has moved over to an open-source kernel module, it will want the same deals as AMD and Intel have.

    DisplayPort is an open specification, whereas HDMI, like it or not, is an NDA-protected closed specification with a legal department that will be a major pain. Yes, some of the problems with open-source drivers are legal ones.

    Yes, DisplayPort-to-HDMI 2.1 adapters have also run into issues, with the HDMI standards board going after their products. So the HDMI board being a legal jackass is not restricted to open-source drivers.



    • #52
      Originally posted by WannaBeOCer View Post
      Graphic designers: Ray tracing is the holy grail of computer graphics.

      Nvidia: Makes ray tracing a reality.

      AMD fanboys: Ray tracing is a gimmick.
      Ray tracing has been used in computer graphics for decades (mostly in movies); it is not new. What you mean is REAL-TIME ray tracing, AKA vidya games using ray tracing during gameplay.

      The thing is, it is indeed a gimmick today, because there is no real ray tracing in video games yet. There are some "effects" using ray-tracing methods, but the scene is rasterized as it always was. The problem I have encountered with games today is that for most if not all of the effects they replaced with ray tracing, the visual improvement is minor relative to the performance hit. There is almost nothing they can do in modern visuals with ray tracing that they couldn't do with more traditional techniques. Yes, some games today look bad if you disable ray tracing, but that is because developers use it as a crutch and do not develop similar looks without it, as they were forced to do before RT in GPUs.

      All in all: too much fuss, too much money, too much energy consumed, for little to no improvement in visuals. So yeah, it is a gimmick.

      NOW, in a decade or two, when it becomes possible for games to be FULLY RAY TRACED, yes, then it will indeed be the holy grail.



      • #53
        Originally posted by Panix View Post
        It seems painfully slow to me - or I wouldn't hesitate to pick one over an Nvidia. I had the black screens - I know about the flickering - and most distros are going with the Wayland display server and practically dumping X11 - plus, I just want to try it. But my gaming is just light - it's not my priority - and I want the GPU mostly for productivity. Even Linux users tell me to go with Nvidia. Why is that?!?
        Every case is different, especially in the Linux world, where every distro is different. I use a rolling distro, so I get regular updates for all the components that may be affected. But the same isn't true for some popular distros out there. Distros like Ubuntu and Mint have quite outdated components, and in that case the NV driver may be better, since you can update the driver without using any PPAs or anything like that. In the case of AMD and Intel, you want an up-to-date kernel and Mesa.
        There are also other things to consider. Some libraries don't support some vendors at all, or the support is quite bad. In some cases NV is the better choice just because CUDA is better supported than the alternatives. For me this isn't an issue, since I don't need it on my PC, but some people do.
        And there is another issue – X11 itself. Many graphics glitches are caused by the way X11 works – e.g. it's hard to eliminate screen-tearing issues in an X session. Wayland fixes a lot of issues, but again – you need an up-to-date distro, and you should avoid NV here (NV support for Wayland is quite bad).
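To see whether a given install is current enough, the kernel and Mesa versions are the two numbers to look at. A minimal sketch (the mesa-utils package name is a Debian/Ubuntu assumption; check your distro's equivalent):

```shell
# Check the driver-stack versions that matter for AMD/Intel GPUs.
uname -r   # kernel version - the amdgpu/i915 kernel drivers ship with it

if command -v glxinfo >/dev/null 2>&1; then
    # Mesa version and the GPU actually being used for OpenGL
    glxinfo -B 2>/dev/null | grep -E "OpenGL (renderer|version)" \
        || echo "glxinfo could not query the driver (no display session?)"
else
    echo "glxinfo not installed; install mesa-utils to check the Mesa version"
fi
```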



        • #54
          Originally posted by TemplarGR View Post

          Ray tracing has been used in computer graphics for decades (mostly in movies); it is not new. What you mean is REAL-TIME ray tracing, AKA vidya games using ray tracing during gameplay.

          The thing is, it is indeed a gimmick today, because there is no real ray tracing in video games yet. There are some "effects" using ray-tracing methods, but the scene is rasterized as it always was. The problem I have encountered with games today is that for most if not all of the effects they replaced with ray tracing, the visual improvement is minor relative to the performance hit. There is almost nothing they can do in modern visuals with ray tracing that they couldn't do with more traditional techniques. Yes, some games today look bad if you disable ray tracing, but that is because developers use it as a crutch and do not develop similar looks without it, as they were forced to do before RT in GPUs.

          All in all: too much fuss, too much money, too much energy consumed, for little to no improvement in visuals. So yeah, it is a gimmick.

          NOW, in a decade or two, when it becomes possible for games to be FULLY RAY TRACED, yes, then it will indeed be the holy grail.
          My mistake, I didn't specify that I was referring to games. I've been following AMD, Intel and Nvidia real-time ray-tracing technologies for over a decade. AMD released an insane cinematic demo when they announced the HD 4800 series, but nothing came of it. Intel demoed Wolfenstein with ray-tracing effects, streamed to a laptop from four compute servers. In the same year, Nvidia used three Fermi cards running OptiX to demo ray-tracing effects added to a Bugatti Veyron, rendered locally.

          Fast forward a decade, and Nvidia actually implemented their vision of having ray-traced effects on top of rasterization, bringing us closer to photorealistic gaming. From what I've seen, game studios have been excited, since it saves them plenty of time. You call it a crutch, but it improves the visuals and frees them to focus on actual gameplay. If you think it's a gimmick, you can turn it off. I play on PC for the better graphics and enjoy seeing the evolution, even if it impacts performance. I've seen this play out before with the introduction of tessellation, with the exact same responses about how it was a "gimmick."



          • #55
            Originally posted by Sevard View Post
            Every case is different, especially in the Linux world, where every distro is different. I use a rolling distro, so I get regular updates for all the components that may be affected. But the same isn't true for some popular distros out there. Distros like Ubuntu and Mint have quite outdated components, and in that case the NV driver may be better, since you can update the driver without using any PPAs or anything like that. In the case of AMD and Intel, you want an up-to-date kernel and Mesa.
            There are also other things to consider. Some libraries don't support some vendors at all, or the support is quite bad. In some cases NV is the better choice just because CUDA is better supported than the alternatives. For me this isn't an issue, since I don't need it on my PC, but some people do.
            And there is another issue – X11 itself. Many graphics glitches are caused by the way X11 works – e.g. it's hard to eliminate screen-tearing issues in an X session. Wayland fixes a lot of issues, but again – you need an up-to-date distro, and you should avoid NV here (NV support for Wayland is quite bad).
            I want to use a rolling distro too, but I'm pretty open/flexible - Ubuntu, Fedora, Tumbleweed, Debian sid, Manjaro, Arch - I don't care which. I mostly have experience and familiarity with Debian and its derivatives, but I'll use whatever works.
            I have used Nvidia cards, so I'm familiar with the black screens and flickering, though not so much with the most recent developments around Wayland - I had the impression that combo was improving. But I'm also open to AMD GPUs if they work well in the software I will use. That said, the basic settings/functions of the card - controlling fans and undervolting - appear to be very slow in development on Linux. Most current-gen cards might not need that as much, but it's nice to have if it works.
            It's all moving towards Wayland - so I acknowledge that - but I'll use whatever works with whatever Nvidia / AMD GPU I have or get.
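For what it's worth, the kernel already exposes basic fan control on amdgpu through the hwmon interface even where GUI tools lag behind. A hedged sketch, assuming the GPU is card0 (the hwmon index varies per machine, and writes require root):

```shell
# Inspect amdgpu fan control via hwmon.
# pwm1_enable semantics on amdgpu: 1 = manual, 2 = automatic.
for hw in /sys/class/drm/card0/device/hwmon/hwmon*; do
    [ -e "$hw/pwm1_enable" ] || continue
    echo "fan control mode: $(cat "$hw/pwm1_enable") (pwm: $(cat "$hw/pwm1"))"
    # To take manual control (as root - watch your temperatures):
    # echo 1   > "$hw/pwm1_enable"
    # echo 128 > "$hw/pwm1"          # duty cycle on a 0-255 scale (~50%)
done
```

Tools like CoreCtrl and TuxClocker build on these same files, so what they can offer tracks what the kernel exposes for a given generation.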



            • #56
              Wow, the Intel Arcs perform really badly here. Is there something fundamentally wrong with this test? From Windows-based benchmarks I would have expected much better results; I thought Intel Arc would perform better on Linux than on Windows.

              I would love to see a real Arc benchmark comparison against Windows, and at 1440p/4K.

              I really considered Arc as an alternative, but judging from these benchmarks, it's not worth it for Linux at all - am I right?



              • #57
                Originally posted by AndiK View Post
                I really considered Arc as an alternative, but judging from these benchmarks, it's not worth it for Linux at all - am I right?
                I'm not sure; there might be another problem, maybe with Intel's hybrid-core scheduling. Nvidia cards should be better too.

                Still, if you want to game, Intel cards would be my last choice.

