AMDVLK 2022.Q1.1 Released With Radeon RX 6500 XT Support


  • coder
    replied
    Originally posted by atomsymbol
    Just some notes/scenarios:
    • New fabs most likely aren't going to solve the issue of surplus money going to scalpers instead of to GPU designers and manufacturers
    • If a cryptocurrency mining facility buys 100 GPUs today, then new fabs will simply enable it to buy 1000 GPUs, and consequently there will still be a shortage of GPUs in e-shops
    • If the GPU crypto market collapses after Ethereum switches to proof-of-stake then there will be a very large number of GPUs on the market, and consequently the new fabs will be running below their peak manufacturing capacity
    Scalpers will stop making money when supply reaches a point where they can no longer absorb all of it. Then they have to sell what they bought at a loss, which drives them out of the business.

    Crypto-mining is fundamentally limited by power costs, among other things. Again, you can face a supply/demand problem, where so many miners flock to an area with cheap electricity that the cost per kWh eventually has to rise, or governments have to crack down on miners.
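    To make that power-cost ceiling concrete, here is a minimal back-of-the-envelope sketch (the earnings, wattage, and electricity prices are illustrative assumptions, not real figures):

        # Toy model of the power-cost ceiling on GPU mining.
        # All numbers are illustrative assumptions for the example.

        def daily_profit(revenue_per_day, watts, price_per_kwh):
            """Net profit per GPU per day after electricity."""
            energy_kwh = watts / 1000 * 24  # kWh consumed per day
            return revenue_per_day - energy_kwh * price_per_kwh

        # A hypothetical GPU earning $1.50/day while drawing 300 W:
        for price in (0.05, 0.10, 0.20, 0.40):  # $/kWh
            profit = daily_profit(1.50, 300, price)
            print(f"electricity at ${price:.2f}/kWh -> profit {profit:+.2f} USD/day")

    Past some electricity price the sign flips and mining stops paying, which is exactly the ceiling described above.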

    As for a glut of fab capacity: I think the fear of exactly that is what held semiconductor manufacturers back from ramping up even faster in prior years. They have long memories, and the semiconductor business is typically cyclical.

    The other wildcard was probably China, though US embargoes on semiconductor technology seem to have slowed its ramp-up. But everyone knows you're not going to win a price war with China, given the government support for the sector there.

    However, it could be that we're in the middle of an expansion big enough to absorb those factors. We'll see...

  • coder
    replied
    Originally posted by atomsymbol
    The normal course of action in a free market economy, in the case of a severe shortage, is to [temporarily] increase the product's price on the manufacturer's and designer's side in order to decrease demand and allocate more resources to the supply side. It is unfortunate that AMD & NVIDIA aren't behaving according to this simple principle...
    It takes something like 5 years to build a new semiconductor fab. Lots and lots of new fab capacity has been announced in the past couple of years. Unfortunately, there's only so much you can rush the process. You can't get 9 women to collaboratively make a baby in 1 month.
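
    The baby analogy is really throughput versus latency, which a toy model makes concrete (the 5-year lead time is from above; the start years are made-up assumptions):

        # Starting more fabs raises future throughput, but none of them
        # ship wafers before the build lead time has passed.
        LEAD_TIME = 5  # years from groundbreaking to volume production

        def fabs_online(start_years, t):
            """How many fabs are producing at year t."""
            return sum(1 for start in start_years if t >= start + LEAD_TIME)

        starts = [0, 0, 0, 1, 2]  # a burst of announcements
        for year in range(8):
            print(f"year {year}: {fabs_online(starts, year)} fabs online")

    However many fabs break ground in year 0, the count stays at zero until year 5; parallelism adds capacity later, it doesn't shorten the wait.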

  • smitty3268
    replied
    Originally posted by qarium View Post

    If they limit their GPUs to Intel CPUs, they can control who gets ECC and who doesn't.
    That means if you want ECC, you have to buy a Xeon CPU.

    If they make their GPUs run with AMD CPUs, then people will just buy a Ryzen 5800X3D and put in ECC RAM.
    That means no more chipset sales for Intel and no more Xeon sales for Intel.

    If they do as you said... well, in that case we will see a massive price drop in the hardware market.

    Nobody needs an expensive Intel motherboard chipset anymore; they just buy a cheap AM4 motherboard with a 5800X3D.

    Nobody needs an expensive Xeon CPU anymore; they get ECC RAM with the AMD CPU.

    So you see, I hope you are right, but in that case Intel has just shot itself in the foot.

    But if instead they play the "restricted to Intel CPUs" game, then they can control the market.
    WHAT. THE. F....

    Ok, seriously. I don't think it's worth responding to this any further, because your brain clearly works differently than mine does.

    But for the record, a $300 consumer graphics card isn't going to determine whether people buy Xeon servers. It doesn't even exist yet. By that logic, nobody would be buying Xeon servers now.

  • qarium
    replied
    Originally posted by smitty3268 View Post
    The existing cards are basically the existing integrated GPUs with quick hacks to allow them to run over PCIe. They are nothing more than test hardware, which has nothing to do with Intel's actual plans.
    Huh? What does Xeon or ECC memory have to do with consumer graphics cards?
    Hate to break it to you, but the Intel graphics cards will not be OEM-only or restricted to Intel CPUs. It doesn't matter how many conspiracy theories you spout. Heck, in the past they've put an integrated AMD GPU onto Intel CPUs, and you think they're not going to create a standard discrete card? That's crazy talk.
    This has all been publicly confirmed by Intel. If they backtrack at this point, they'd likely face lawsuits from investors over having been lied to.
    If they limit their GPUs to Intel CPUs, they can control who gets ECC and who doesn't.
    That means if you want ECC, you have to buy a Xeon CPU.

    If they make their GPUs run with AMD CPUs, then people will just buy a Ryzen 5800X3D and put in ECC RAM.
    That means no more chipset sales for Intel and no more Xeon sales for Intel.

    If they do as you said... well, in that case we will see a massive price drop in the hardware market.

    Nobody needs an expensive Intel motherboard chipset anymore; they just buy a cheap AM4 motherboard with a 5800X3D.

    Nobody needs an expensive Xeon CPU anymore; they get ECC RAM with the AMD CPU.

    So you see, I hope you are right, but in that case Intel has just shot itself in the foot.

    But if instead they play the "restricted to Intel CPUs" game, then they can control the market.
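
    Side note: ECC UDIMMs on an AM4 board can silently run in non-ECC mode if the board or firmware doesn't support them, so the setup is worth verifying. A minimal sketch for Linux, assuming the kernel's EDAC driver is loaded and exposes the standard sysfs layout:

        # Check whether Linux has registered an EDAC memory controller,
        # which only happens when ECC is actually active.
        # Paths follow the standard EDAC sysfs layout.
        from pathlib import Path

        EDAC = Path("/sys/devices/system/edac/mc")

        def ecc_status():
            controllers = sorted(EDAC.glob("mc*")) if EDAC.is_dir() else []
            if not controllers:
                return "no EDAC memory controller found (ECC likely inactive)"
            report = []
            for mc in controllers:
                ce = (mc / "ce_count").read_text().strip()  # corrected errors
                ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
                report.append(f"{mc.name}: corrected={ce} uncorrected={ue}")
            return "\n".join(report)

        print(ecc_status())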

  • smitty3268
    replied
    Originally posted by qarium View Post
    Right, I am not convinced, because the existing PCIe cards are made in a way that they only run with Intel CPUs...
    The existing cards are basically the existing integrated GPUs with quick hacks to allow them to run over PCIe. They are nothing more than test hardware, which has nothing to do with Intel's actual plans.

    Also, by this logic, if Intel does not follow this OEM-only strategy to push their chipset and CPU sales with cheap GPUs, then they will lose many Xeon sales, because people will get ECC without buying a Xeon.
    Huh? What does Xeon or ECC memory have to do with consumer graphics cards?

    So by this logic they will make it an OEM-only product and make it run only with Intel CPUs, or else they will hurt themselves.

    And for sure they will not do this.
    Hate to break it to you, but the Intel graphics cards will not be OEM-only or restricted to Intel CPUs. It doesn't matter how many conspiracy theories you spout. Heck, in the past they've put an integrated AMD GPU onto Intel CPUs, and you think they're not going to create a standard discrete card? That's crazy talk.

    This has all been publicly confirmed by Intel. If they backtrack at this point, they'd likely face lawsuits from investors over having been lied to.

  • qarium
    replied
    Originally posted by coder View Post
    We've been down this road before (I mean specifically with Intel's upcoming dGPUs) and it seems there's no convincing him. Luckily, we have only another 2 months or so to wait.
    Right, I am not convinced, because the existing PCIe cards are made in a way that they only run with Intel CPUs...

    Also, by this logic, if Intel does not follow this OEM-only strategy to push their chipset and CPU sales with cheap GPUs, then they will lose many Xeon sales, because people will get ECC without buying a Xeon.

    So by this logic they will make it an OEM-only product and make it run only with Intel CPUs, or else they will hurt themselves.

    And for sure they will not do this.

  • billyswong
    replied
    Originally posted by Teggs View Post
    That's at least half the problem with the 6500 XT, the way it's presented. If AMD had said, 'Hey everyone, here's our new RX 6300. As a converted laptop GPU, it does not include all of the features of our other RX 6000 offerings. However, this does make the die and the board it is placed on inexpensive to produce, and happens to leave the 6300 unattractive for crypto mining at the same time. Check our benchmarks and independent performance reviews to see if the RX 6300 is right for you!' ...it wouldn't make it physically better, but there would be a lot less negativity around it.
    I agree. Call it the RX 6400 XT and market it as such, and customers would be a lot more satisfied. They already branded the OEM sub-75W version as the RX 6400 anyway. A "5"-level card is expected to be full-featured, regardless of its speed. Not this corner-cutting product.

  • smitty3268
    replied
    Originally posted by Teggs View Post
    That's at least half the problem with the 6500 XT, the way it's presented. If AMD had said, 'Hey everyone, here's our new RX 6300. As a converted laptop GPU, it does not include all of the features of our other RX 6000 offerings. However, this does make the die and the board it is placed on inexpensive to produce, and happens to leave the 6300 unattractive for crypto mining at the same time. Check our benchmarks and independent performance reviews to see if the RX 6300 is right for you!' ...it wouldn't make it physically better, but there would be a lot less negativity around it.
    I actually agree with that. Nobody was going to love this card, but if AMD had marketed it the right way I think most people would have shrugged and moved on while talking about how the GPU market still sucks. They also may have just wanted to target the OEM market to avoid all the press anyway, since I'm guessing that's where the majority of sales are going to end up.

  • Teggs
    replied
    Originally posted by bridgman View Post

    As far as I can see the PRO card is 4GB as well:

    No mention of AV1 decode on the PRO version either, sorry.
    It's a good thing I'm not in the target audience for that product. The marketing on that page was cringey. The grammar was atrocious, the word choice screamed 'I'm in marketing and don't know what I'm talking about!', and while I know they had a difficult task trying to sell a laptop GPU as a workstation card, pretending it only has two video outputs because AMD deliberately cut costs on behalf of its workstation customers... *sigh*

    That's at least half the problem with the 6500 XT, the way it's presented. If AMD had said, 'Hey everyone, here's our new RX 6300. As a converted laptop GPU, it does not include all of the features of our other RX 6000 offerings. However, this does make the die and the board it is placed on inexpensive to produce, and happens to leave the 6300 unattractive for crypto mining at the same time. Check our benchmarks and independent performance reviews to see if the RX 6300 is right for you!' ...it wouldn't make it physically better, but there would be a lot less negativity around it.

  • skeevy420
    replied
    Originally posted by coder View Post
    Huh? It should run fine @ PCIe 3.0, but the x4 thing could bottleneck some games.
    It doesn't change the fact that I bought a new 2021 APU within the year and it's not good enough to be paired with their budget GPU. This is a product release ripped directly from the Intel playbook. I can't quite explain why, but this GPU feels like some Intel tick-tock bullshit. Perhaps it's because I haven't even owned my system for 11 months and yet it's already outdated for the crappiest of the new GPUs around -- that does feel like some Intel bullshit. Trust me when I say that if I'd thought artificial PCIe limitations were going to be a thing, I wouldn't have bought my Ryzen Pro.

    There is no amount of AMD Fan Blinders that'll make me say good things about this GPU. Well... the messed-up thing is I'm actually fine with the price. I totally get the current market and was willing to spend that much simply because I've ridden my 580 hard lately. That's as nice as it gets -- I'm in a bind and gotta buy what's available.
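
    For context on the x4 concern, the raw link numbers are easy to work out (standard PCIe per-lane rates; the figures below are per direction and ignore protocol overhead):

        # Theoretical per-direction PCIe bandwidth.
        # The 6500 XT is wired x4: roughly 7.9 GB/s on a PCIe 4.0 slot,
        # but only about 3.9 GB/s on a PCIe 3.0 system like the APU above.
        GENS = {"3.0": 8.0, "4.0": 16.0}  # GT/s per lane
        ENCODING = 128 / 130              # 128b/130b line-code efficiency

        def gbytes_per_s(gen, lanes):
            """GB/s per direction for a PCIe generation and lane count."""
            return GENS[gen] * ENCODING * lanes / 8  # 8 bits per byte

        for gen in GENS:
            for lanes in (4, 8, 16):
                print(f"PCIe {gen} x{lanes}: {gbytes_per_s(gen, lanes):5.1f} GB/s")

    Dropping from roughly 7.9 GB/s to 3.9 GB/s is the halving behind the "could bottleneck some games" caveat above.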
