AMD Announces Ryzen 7000 Series "Zen 4" Desktop CPUs - Linux Benchmarks To Come


  • Dukenukemx
    replied
    Originally posted by qarium View Post

    They say the GPU spikes from 200 W to over 600 W, and that this can overwhelm many PSUs...
    That's true, but the joke is that my systems were planned and built for massive cryptocurrency mining with four Vega 64s per system; a spike to 600 W cannot overwhelm my 2000 W PSU, which has 1200 W on the GPU rails.
    I know many people have something like an 800 W PSU, and then the 600 W spike from the GPU trips the overcurrent protection in the PSU...
    People with a 2000 W PSU don't care...
    The PSU brand matters more than the wattage rating, but either way GPUs have gone from suck to blow.
    "you justify the price of something by comparing it to other overpriced things."

    That's plainly not true; only Nvidia actually has overpriced things, and I did not compare it to a 3090. The Nvidia card with similar performance costs over 200€ more.
    You do know AMD and Nvidia have price-fixed in the past? They were taken to court over it. If 2016 GPUs are still the most popular six years later, then someone has overpriced hardware.
    "Graphics cards shouldn't be that much money,"

    Isn't it true that this depends on the performance? You can buy a 300€ graphics card like the AMD 6600 XT...
    Performance always goes up; that's how they entice you to buy new cards. That doesn't mean prices should go up proportionally.
    "This is why the GTX 1060 is still the #1 GPU on Steam."

    To be honest, this only shows how stupid people are; the AMD cards are a much better fit in my view.
    The most popular AMD card on Steam is the RX 580. The second is the RX 570, and the third is Vega 8, which is APU graphics. See a trend here?
    "These prices aren't normal."

    I can't agree with this... in 2017 I bought six Vega 64s, one for 740€ and five for 666€.
    2017 was the crypto boom, so of course prices were high.
    Last edited by Dukenukemx; 08 September 2022, 03:39 AM.



  • Dukenukemx
    replied
    Originally posted by coder View Post
    A few games have long been playable at 4k. I think one of the GTA games could deliver > 60 fps since the GTX 1000 series, if not before.

    However, you're right that 4k remains a challenge, though mainly for those without top-tier cards. And I think it's this combination of running a 4k-resolution monitor with a midrange card where things like DLSS make the most sense. Maybe even a 2.5k monitor with a lower-end card.
    GTA V was released on the Xbox 360 and PS3; it is not a recent game. As for 4k, the main issue is that everyone tests it with anti-aliasing and, much like with DLSS and FSR, forgets why the technology was created in the first place. You don't need to run AA when you're at 4k. AA was created because at 640x480 or 800x600 you could really see the jaggies; at 4k, good luck finding them. Just set the game's AF to 16x and you'll be fine at 4k.



  • coder
    replied
    Originally posted by Dukenukemx View Post
    Secondly, even the most expensive graphics cards like the 3090 Ti struggle to maintain 60 fps with all the graphics features turned on; 4k with everything on is only possible in a fairly old game or one that isn't demanding, like Minecraft. DLSS and FSR were created specifically because games weren't playable with ray tracing turned on. Even at 1080p, you'd still need DLSS or FSR to make some games playable.
    A few games have long been playable at 4k. I think one of the GTA games could deliver > 60 fps since the GTX 1000 series, if not before.

    However, you're right that 4k remains a challenge, though mainly for those without top-tier cards. And I think it's this combination of running a 4k-resolution monitor with a midrange card where things like DLSS make the most sense. Maybe even a 2.5k monitor with a lower-end card.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    Do you not watch Tech Jesus? It doesn't matter what PSU you have if the graphics card can spike to double its power draw. Newer graphics cards will spike and momentarily consume a lot of power, and this can crash a system. Tech Jesus calls out Nvidia cards in particular because they are the worst at this.
    They say the GPU spikes from 200 W to over 600 W, and that this can overwhelm many PSUs...
    That's true, but the joke is that my systems were planned and built for massive cryptocurrency mining with four Vega 64s per system; a spike to 600 W cannot overwhelm my 2000 W PSU, which has 1200 W on the GPU rails.
    I know many people have something like an 800 W PSU, and then the 600 W spike from the GPU trips the overcurrent protection in the PSU...
    People with a 2000 W PSU don't care...
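    A quick back-of-the-envelope check, in Python, of the headroom argument above. The wattages come from the post; treating the spike as a simple additive load (and the ~250 W rest-of-system figure) are assumptions, since real transient behaviour also depends on PSU quality and its overcurrent-protection trip points:

```python
# Rough PSU headroom check for transient GPU power spikes.
# Wattage figures come from the post above; modelling the spike as a
# simple additive load is a simplification.

def survives_spike(psu_watts, other_load_watts, gpu_spike_watts):
    """True if the PSU's rated capacity covers a momentary GPU spike."""
    return other_load_watts + gpu_spike_watts <= psu_watts

# An 800 W PSU with an assumed ~250 W of CPU/board load cannot absorb
# a 600 W GPU spike, but a 2000 W mining PSU shrugs it off:
print(survives_spike(800, 250, 600))    # False
print(survives_spike(2000, 250, 600))   # True
```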

    Originally posted by Dukenukemx View Post
    This is Apple logic, where you justify the price of something by comparing it to other overpriced things. Graphics cards shouldn't be that much money, but dummies are more than willing to pay because it's the fastest. This is why the GTX 1060 is still the #1 GPU on Steam; if you go down that list, the GPUs don't get more expensive but cheaper. Since 2016, when the crypto market went nuts, GPU prices haven't gone down, and when the crypto market crashed in 2018 we quickly went into COVID-19 and prices hit insane levels. These prices aren't normal.

    Again you accuse me of Apple logic and Apple fanboyism and so on and so on, but it still stands: I have never bought any Apple product in my life, and I even openly tell people who have an iPhone that they are stupid as fuck.

    "you justify the price of something by comparing it to other overpriced things."

    That's plainly not true; only Nvidia actually has overpriced things, and I did not compare it to a 3090. The Nvidia card with similar performance costs over 200€ more.

    "Graphics cards shouldn't be that much money,"

    Isn't it true that this depends on the performance? You can buy a 300€ graphics card like the AMD 6600 XT...

    "dummies are more than willing to pay for it, because it's the fastest."

    Dummies already buy the 3090 at an over 200€ higher price for similar performance.

    "This is why the GTX 1060 is still the #1 GPU on Steam."

    To be honest, this only shows how stupid people are; the AMD cards are a much better fit in my view.

    "These prices aren't normal."

    I can't agree with this... in 2017 I bought six Vega 64s, one for 740€ and five for 666€.
    The AMD 6600 XT is 8% faster than the Vega 64, but its price is 399€.

    This means you save 267-341€, you get 8% better performance, and you also get ray-tracing support and AV1 decode...

    If you do not want to save money but instead want more performance, you can get a 6800 for 629€,

    and this 6800 is 32% faster than a Vega 64.
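    The savings arithmetic above can be checked with a short Python sketch (prices and performance ratios as quoted in the post):

```python
# Savings and performance gain moving from a 2017 Vega 64 to the
# RDNA2 cards named above, using the prices quoted in the post.

vega64_paid = [740, 666]                           # € paid per card in 2017
rx6600xt = {"price": 399, "perf_vs_vega64": 1.08}  # 8% faster
rx6800   = {"price": 629, "perf_vs_vega64": 1.32}  # 32% faster

for paid in vega64_paid:
    saving = paid - rx6600xt["price"]
    print(f"vs a {paid}€ Vega 64: save {saving}€ and gain 8% performance")
# -> savings of 341€ and 267€ respectively
```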


    Originally posted by Dukenukemx View Post
    You keep showing your ignorance of the world. Yes, we are the minority, but so are people who buy AMD 6800s and 3080s. Look at this list and tell me how many RTX cards, let alone 3000-series cards, you see on it.
    Mainstream means nobody even knows what DLSS/FSR plus ray tracing is, let alone how to turn it on. Most people wouldn't even know if ray tracing was on or off in a blind test.

    Firstly, TVs aren't good due to latency; you must have horrible input lag, even if you switch the TV to game mode. Secondly, even the most expensive graphics cards like the 3090 Ti struggle to maintain 60 fps with all the graphics features turned on; 4k with everything on is only possible in a fairly old game or one that isn't demanding, like Minecraft. DLSS and FSR were created specifically because games weren't playable with ray tracing turned on. Even at 1080p, you'd still need DLSS or FSR to make some games playable.
    I did multiple tests with Cyberpunk 2077, with all features on and set to max; using FSR gives me the best visual result with playable FPS on my Vega 64.

    "Most people wouldn't even know if ray tracing was on or off in a blind test."

    I can't say anything about this from my own tests, because my Vega 64 does not support ray tracing.

    "You keep showing your ignorance of the world. Yes, we are the minority, but so are people who buy AMD 6800s and 3080s."

    Well, what is my ignorance? I have a very good education about the technology in this matter.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    The problem with DLSS and FSR is that they don't have consistent results.
    That was the case when FSR 1.0 was released and compared to DLSS 2.x:
    FSR was nice, but it did not match DLSS 2.x in final quality.
    With FSR 2.0, though, most reviews I watched and read said the result is similar to DLSS 2; some say DLSS 2 is still better, but it is really a very tiny difference, if any, and most people don't see it.

    So it is completely fair to compare AMD with FSR 2.0 and Nvidia with DLSS 2.x.

    But as tests of FSR 2.0 on Nvidia hardware show, on the same hardware DLSS 2.x delivers 6-7% more FPS.

    That's because Nvidia uses its AI/matrix/FMA cores to compute DLSS 2, while FSR 2.0 runs on the shaders.

    This means that, because of those AI/matrix/FMA cores, Nvidia still has an advantage.
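    To put the 6-7% figure in perspective, a tiny Python sketch; the 6.5% midpoint and the sample frame rates are illustrative, not measurements:

```python
# What a 6-7% DLSS 2 advantage over FSR 2.0 means at typical frame
# rates; the 6.5% midpoint and the sample FPS values are illustrative.

def dlss_fps(fsr_fps, advantage=0.065):
    """FPS you'd expect from DLSS 2 given FSR 2.0 FPS on the same card."""
    return fsr_fps * (1 + advantage)

for fsr in (60, 90, 120):
    print(f"FSR 2.0: {fsr} fps -> DLSS 2: ~{dlss_fps(fsr):.0f} fps")
```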



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    The most high-end hardware has always used the most power, and I don't care about a TDP of 340 W or 450 W.
    I have a 2000 W PSU in my two Threadripper systems, so an insane 450 W TDP for a card does not matter in my system.
    Do you not watch Tech Jesus? It doesn't matter what PSU you have if the graphics card can spike to double its power draw. Newer graphics cards will spike and momentarily consume a lot of power, and this can crash a system. Tech Jesus calls out Nvidia cards in particular because they are the worst at this.

    I observed a price drop of the 6950 from 1200€ to 958€; if you compare it to the 6900 and 6800, it is not overpriced.
    The 6950 was only overpriced at its release date: at launch the 6900 sold for 950€ and the 6950 for 1200€,
    but now the 6900 is at 885€ and the 6950 is at 958€.
    That's not horribly overpriced... and if you want to save power with a more efficient GPU, you'd maybe choose the 6900 anyway.
    This is Apple logic, where you justify the price of something by comparing it to other overpriced things. Graphics cards shouldn't be that much money, but dummies are more than willing to pay because it's the fastest. This is why the GTX 1060 is still the #1 GPU on Steam; if you go down that list, the GPUs don't get more expensive but cheaper. Since 2016, when the crypto market went nuts, GPU prices haven't gone down, and when the crypto market crashed in 2018 we quickly went into COVID-19 and prices hit insane levels. These prices aren't normal.
    You yourself can do whatever you want: turn off ray tracing and turn off FSR/DLSS; you are also free to buy only a 6800 or 3080.
    But be aware that people like you are a minority, and even people like me, who buy AMD because of the open-source drivers, are a minority...
    You keep showing your ignorance of the world. Yes, we are the minority, but so are people who buy AMD 6800s and 3080s. Look at this list and tell me how many RTX cards, let alone 3000-series cards, you see on it.
    [attached image: steam august 2022.png]

    The majority means the mainstream: they just turn on all the features, no matter what, FSR2/DLSS2 and ray tracing...
    Mainstream means nobody even knows what DLSS/FSR plus ray tracing is, let alone how to turn it on. Most people wouldn't even know if ray tracing was on or off in a blind test.

    I have a 4K 60 Hz TV on my PC, and my Vega 64 really needs FSR to maintain 4K without disabling too many features.

    Native 4K is maybe better if your system is fast enough, but not for cards like the Vega 64, which are too slow without FSR...
    Firstly, TVs aren't good due to latency; you must have horrible input lag, even if you switch the TV to game mode. Secondly, even the most expensive graphics cards like the 3090 Ti struggle to maintain 60 fps with all the graphics features turned on; 4k with everything on is only possible in a fairly old game or one that isn't demanding, like Minecraft. DLSS and FSR were created specifically because games weren't playable with ray tracing turned on. Even at 1080p, you'd still need DLSS or FSR to make some games playable.
    Last edited by Dukenukemx; 07 September 2022, 03:31 PM.



  • Dukenukemx
    replied
    Originally posted by qarium View Post

    If you do not exclude any feature and put all the features in the benchmark, meaning DLSS 2 and also ray tracing, Nvidia still has the efficiency crown. AMD only wins if you turn off ray tracing and do not benchmark with FSR2/DLSS2.
    I'm a pragmatic person. Find me benchmarks that prove this. Don't just say it and expect people to believe you. FIND IT.



  • Dukenukemx
    replied
    Originally posted by coder View Post
    At some level, all computer graphics is faking, right?
    I see what you did there.
    If you really care about "authenticity" in your algorithms, then you should be using only Path Tracing, and nothing else. And insist on no postprocessing effects.
    I just think it's funny how sanctimonious people get about DLSS. If it looks good enough, it is good enough. IMO, it's as simple as that. If you find it to have bothersome artifacts, then it doesn't look good enough and don't use it. However, for those who think it does look good enough, then its efficiency benefits are undeniable.
    The problem with DLSS and FSR is that they don't have consistent results. This makes it harder to do comparisons between GPU brands, as these companies have historically played with image quality to get better benchmark results: ATI had the Quake/Quack thing, and Nvidia has done similar fuckery with 3DMark. I'm not against the feature, but I don't like it being a factor in purchasing a GPU. Just like I don't care for ray tracing, since the technology is extremely early in being usable. It's nice, but don't claim it to be a feature that determines whether I'd buy a card.
    I'm going to start looking a lot more seriously at a RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparable-performing RTX 4000 series card. That's how I ended up with my current GPU, which I got for a little over 50% below MSRP once the GTX 1000-series launched.
    I'm gonna wait for cheaper AMD cards, as I use Linux Mint. It's a lot less of a problem when I don't have to worry about which kernel I'm running upsetting the Nvidia proprietary drivers. I'm looking at RX 6700s or even 6600 XTs. Prices will collapse further as time goes on.



  • qarium
    replied
    Originally posted by coder View Post
    I'm going to start looking a lot more seriously at a RTX 3080, if prices keep going down. I would consider a 12 GB version, if I can get one at a discount relative to the comparable-performing RTX 4000 series card. That's how I ended up with my current GPU, which I got for a little over 50% below MSRP once the GTX 1000-series launched.
    Why not get the full 16 GB of VRAM with the 6950 XT? The price has gone down in recent weeks; it's 958€ now.



  • qarium
    replied
    Originally posted by Dukenukemx View Post
    If the 6950 XT uses less power, then it uses less power. Also, the amount of power needed to feed the 3090 is scary. We can all joke about how power-hungry an R9 390 or a Vega 64 is, but compared to a 3090 they seem efficient, and less likely to start a fire. The 3090s, and also the 3080s, can spike a power supply really hard, to the point that it can do some serious damage.
    The most high-end hardware has always used the most power, and I don't care about a TDP of 340 W or 450 W.
    I have a 2000 W PSU in my two Threadripper systems, so an insane 450 W TDP for a card does not matter in my system.

    Originally posted by Dukenukemx View Post
    Nobody in their right mind should consider buying either. Both are horribly overpriced,

    I observed a price drop of the 6950 from 1200€ to 958€; if you compare it to the 6900 and 6800, it is not overpriced.
    The 6950 was only overpriced at its release date: at launch the 6900 sold for 950€ and the 6950 for 1200€,
    but now the 6900 is at 885€ and the 6950 is at 958€.
    That's not horribly overpriced... and if you want to save power with a more efficient GPU, you'd maybe choose the 6900 anyway.
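    Expressed as percentages of the launch-era prices quoted in the post, the drops look like this; a minimal Python sketch:

```python
# Price drops quoted above, expressed as percentages of the launch-era price.

prices = {                 # card: (launch-era €, current €)
    "6950 XT": (1200, 958),
    "6900 XT": (950, 885),
}

for card, (old, new) in prices.items():
    drop_pct = (old - new) / old * 100
    print(f"{card}: {old}€ -> {new}€ ({drop_pct:.1f}% drop)")
```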

    Originally posted by Dukenukemx View Post

    and both demand a lot more power. The most I would consider buying is a 6800 XT or an RTX 3080. Also, I don't care for FSR or DLSS, as these technologies are faking graphics. They are workarounds for ray tracing, since most games can't run the feature without lowering the resolution. Ray tracing sucks and isn't a feature that games utilize 100%: you may see water or a reflection that's ray-traced, which is not the same as the ray tracing in Minecraft or Quake 2, where everything is ray-traced. And Minecraft and Quake 2 are not particularly demanding games.
    You yourself can do whatever you want: turn off ray tracing and turn off FSR/DLSS; you are also free to buy only a 6800 or 3080.
    But be aware that people like you are a minority, and even people like me, who buy AMD because of the open-source drivers, are a minority...

    The majority means the mainstream: they just turn on all the features, no matter what, FSR2/DLSS2 and ray tracing...

    Originally posted by Dukenukemx View Post

    Link doesn't work. Doesn't matter anyway because nobody cares about Ray-Tracing.
    The link works for me? LOL...

    Originally posted by Dukenukemx View Post

    That's because these technologies aren't really worth using, since they do lower image quality. You are always better off running native 4k than fake 4k.
    I have a 4K 60 Hz TV on my PC, and my Vega 64 really needs FSR to maintain 4K without disabling too many features.

    Native 4K is maybe better if your system is fast enough, but not for cards like the Vega 64, which are too slow without FSR...
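    For context on why FSR helps a Vega 64 at 4K: the upscaler renders internally at a reduced resolution and scales up to the display. A small Python sketch using the per-axis scale factors AMD publishes for FSR 2's quality modes (verify the exact factors and rounding against AMD's documentation):

```python
# Internal render resolutions FSR uses to produce 4K (3840x2160) output.
# Per-axis scale factors follow AMD's published FSR 2 quality modes;
# exact rounding of the render target is an assumption.

TARGET_W, TARGET_H = 3840, 2160
MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

for mode, scale in MODES.items():
    w, h = round(TARGET_W / scale), round(TARGET_H / scale)
    print(f"{mode}: renders {w}x{h}, upscales to {TARGET_W}x{TARGET_H}")
```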

