2D - RX550 vs GT1030

  • 2D - RX550 vs GT1030

    Which has better 2D X performance: the RX550 or the GT1030? I am open to using the closed NV driver, but if the open Radeon driver provides 2D that's as good as the closed NV driver, then I might be tempted to pay a little more to get an RX550 instead of the slightly cheaper 1030. Nouveau 2D was a bit of a travesty last time I checked.

    Looking to run Photoshop under Wine 3.x, so if one is known to work better with Wine, that would be good to know too.

  • #2
    It has to drive a 4K display nicely.



  • #3
    I am well aware of the advantages of open drivers versus closed drivers, and I know about the Wine AppDB, but which has better 2D performance under X: the 1030 (with the closed driver) or the RX550 (with a closed or open driver)?



  • #4
    The iGPU is probably the answer here; the HD 530 has comparable performance to the 1030, and it's free*.



  • #5
    I'm actually upgrading a friend's i5, which has the HD 3000 iGPU, so it can't output 4K without adding a discrete GPU.

    I've heard of NV users having stability issues when trying to drive a 4K display with only 1GB of VRAM, so it seems best not to buy anything with less than 2GB of VRAM.
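
    Rough back-of-the-envelope numbers, just to put the VRAM worry in perspective (the buffer counts below are my own assumptions, not measurements of any driver or desktop):

      # Hypothetical estimate of 4K framebuffer memory use; the buffer counts
      # are assumptions, not measurements of any particular driver or desktop.
      WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4  # 4K at 32-bit colour

      def mib(nbytes):
          return nbytes / (1024 * 1024)

      single = WIDTH * HEIGHT * BYTES_PER_PIXEL   # one full-screen surface (~31.6 MiB)
      scanout = 3 * single                        # assume triple buffering
      compositor = 10 * single                    # assume ~10 full-screen-sized window pixmaps

      print(f"one 4K buffer    : {mib(single):6.1f} MiB")
      print(f"triple buffering : {mib(scanout):6.1f} MiB")
      print(f"plus compositor  : {mib(scanout + compositor):6.1f} MiB of a 1024 MiB card")

    Even with generous assumptions that only comes to a few hundred MiB, so if 1GB cards really do misbehave at 4K it is presumably about how the driver juggles VRAM under pressure rather than the bare framebuffer size - but 2GB is cheap insurance either way.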

    Are there no 2D benchmarks that Phoronix or anyone else has done that could help someone who doesn't care about licensing or open code decide what to buy here?



  • #6
    Originally posted by debianxfce View Post

    Intel GPUs are so shit that even Intel uses AMD Vega. The RX550 and GT 1030 are two tiers faster than the HD 530, see: http://www.tomshardware.com/reviews/...rchy,4388.html

    All low-end GPUs are shit, but you can get an iGPU for free* instead of paying $80-90 for a 1030. For that money you can easily get a _used_ 950, 750 Ti or 680, and all of them will beat the shit out of the 1030. In terms of performance the 1030 is roughly equal to a 660, which came out as a low-mainstream card in 2012. And you actually pay money for that?

    If all you need is 2D performance, use the iGPU. If you are on an extreme budget, buy _used_. Normally I would recommend saving an extra $50-100 and buying an actually decent card, but those are hard to come by lately. The 1060 3GB is currently the price-to-performance sweet spot; going any lower is IMHO a bad decision if you actually care about performance/$ (the 1050 Ti is slightly less expensive and a significantly slower card).

    Btw, this is PC Gamer's bang-for-the-buck early 2018 edition (AMD results should be taken with a pinch of salt, they are based on MSRP, not street prices). Note that the 1030 doesn't even make the list.

    Last edited by Guest; 24 February 2018, 11:50 AM.



  • #7
    Indeed, don't get a 680 to run Minesweeper and a word processor. But please, you're both comically missing the question. It's about raw 90s-style 2D performance with many pixels, and whether the X driver has overhead that makes that slow-ish or not.

    An iGPU isn't free: between motherboard, CPU and RAM that may easily be about $400.
    If using no compositor or a software compositor (both possible on MATE, Xfce, KDE), maybe the GT1030 behaves the same as a GTX 1080 and the RX550 the same as an RX480.
    So, maybe someone who reads this can give impressions? CPU overhead?

    The used low-end GPU market is not very healthy, we're not in the mid-2000s, and the options are limited. An R7 240 would do, but the driver support is behind (it's a GCN 1.0 GPU).

    To danboid: the Photoshop version may have D3D or OpenGL acceleration? Not for compute, but for zooms, scrolling etc., i.e. dumb pixel pushing. This shouldn't be very demanding on the GPU itself. (Of course it'd be on Wine - it depends on whether the vast improvements in Wine make it seamless.)

    Do you need HiDPI support on the desktop? GNOME 3, Unity and Cinnamon may be demanding. There's MATE 1.20 out though - I don't know how good it is at HiDPI, but that's a "2D" HiDPI desktop!
    In any case, you have a fast quad-core CPU: if there's any CPU overhead from X, the driver or somewhere else in the stack, you'll survive it (as long as the cooling is still working well); there's still a lot of CPU left for real work.
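
    For anyone who wants to generate those impressions themselves rather than wait for a review: a minimal sketch, assuming the standard x11perf utility is installed (the test selection is just my guess at what approximates "dumb pixel pushing" - fills, copies, scrolling, antialiased text):

      #!/usr/bin/env python3
      # Run a few x11perf micro-benchmarks on whatever X driver is active and
      # echo the ops-per-second summary lines. The test names are standard
      # x11perf tests; the selection itself is only an assumption about what
      # matters for desktop-style 2D work.
      import shutil
      import subprocess

      TESTS = ["-rect500", "-copywinwin500", "-scroll500", "-aa10text"]

      def main():
          if shutil.which("x11perf") is None:
              raise SystemExit("x11perf not found - install your distro's X11 utilities package")
          out = subprocess.run(["x11perf"] + TESTS, capture_output=True, text=True, check=True)
          for line in out.stdout.splitlines():
              if "/sec" in line:  # x11perf prints one rate line per repetition
                  print(line.strip())

      if __name__ == "__main__":
          main()

    Run it once on the iGPU and once on whichever card ends up in the box; the absolute numbers matter less than whether one driver falls off a cliff on the copy/scroll tests.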



  • #8
    So nobody answered the man's question...

    So Xorg has an interface called XRender. It is accelerated by a few different APIs depending on the driver and the hardware. Some older cards have actual 2D hardware rendering units in silicon and can be accelerated by XAA or EXA. But newer cards do not have that hardware anymore, so those are accelerated with OpenGL through an API called Glamor.

    Nvidia's proprietary driver, on the other hand, doesn't use any of that infrastructure. I'm sure it must be accelerated in some kind of way, but its performance really is super bad.
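
    If anyone wants to check what their own setup is actually doing, here is a minimal sketch along those lines - the log locations and keywords (glamor, EXA, XAA, AccelMethod) are the usual ones, but treat them as assumptions for your particular distro:

      #!/usr/bin/env python3
      # Grep the Xorg log for hints about which 2D acceleration path the active
      # driver set up. Log paths and keywords are common defaults, not guarantees.
      import re
      from pathlib import Path

      LOG_CANDIDATES = [
          Path("/var/log/Xorg.0.log"),
          Path.home() / ".local/share/xorg/Xorg.0.log",  # rootless X session
      ]
      KEYWORDS = re.compile(r"\bglamor\b|\bEXA\b|\bXAA\b|AccelMethod", re.IGNORECASE)

      def main():
          log = next((p for p in LOG_CANDIDATES if p.is_file()), None)
          if log is None:
              raise SystemExit("No Xorg log found - maybe a Wayland session?")
          print(f"Scanning {log}")
          for line in log.read_text(errors="replace").splitlines():
              if KEYWORDS.search(line):
                  print(line.strip())

      if __name__ == "__main__":
          main()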

    Here is a slightly older benchmark showing performance. I'm not sure if today's numbers would be the same, but I'm certain the trends would be.

    https://www.phoronix.com/scan.php?pa...linux_2d&num=1

    EDIT: Actually, those are all open-source drivers, let me see if I can find numbers for the proprietary drivers...

    https://www.phoronix.com/scan.php?pa...-2d-2015&num=1

    Here we go. You'll have to kinda open both links and flip between them to compare, but you'll notice how much worse the proprietary drivers are compared to the open-source drivers.
    Last edited by duby229; 25 February 2018, 01:58 PM.



  • #9
    Nice, I didn't know those benchmarks existed! (Even on Windows, reviewers hardly care.)

    Well: in the 2015 review the Nvidia driver is doing very well. Where the bars are tiny it's "seconds: less is better".

    Using Wine though, it might be unaccelerated, like a dumb framebuffer.



  • #10
    Originally posted by debianxfce View Post

    That's an old Windforce, mounted on super low-end Gigabyte cards; that thing is almost as bad in terms of cooling as MSI's Armor design.
    And the reason I know it's shit is because I unfortunately got a 1080 with it (the 3-fan version of the Windforce); after a month I replaced it with a zip-tie mod:

    Last edited by Guest; 28 February 2018, 11:03 AM.

