Linux 6.11 Kernel Features Deliver A Lot For New/Upcoming Intel & AMD Hardware

  • #11
    Originally posted by Errinwright View Post

    According to MLID youtube, the 8800 XT (largest die) will be launched in Q4, but no word on the 8700 XT.
    That 270mm² for a monolithic 64CU chip is super cute though.

    Originally posted by Errinwright View Post
    All RDNA4 cards are monolithic dies, so there should hopefully be fewer driver issues compared to the RDNA3 lineup.
    Drivers are said to be in much better shape than RDNA3. Mesa already has RDNA4 support upstreamed and is constantly receiving tweaks and bugfixes for said architecture, so AMD is already testing and optimizing. As a result, I'd expect RDNA4 support to be quite robust at launch.



    • #12
      Listening to MLID in 2024; do we never learn?



      • #13
        Originally posted by Errinwright View Post

        They are only launching midrange GPUs for the RDNA4 lineup, and the name might be 8700 XT (instead of 8800 XT) as the price range is $499-$599 MSRP. Raster performance is expected to be around the RTX 4080, with ray tracing approximately around the 4070 (Ti). It seems the volume of the two variants (probably 8600/8700 XT) is expected to be high, so the RTX models may see price drops if that is your preference.

        All RDNA4 cards are monolithic dies, so there should hopefully be fewer driver issues compared to the RDNA3 lineup.
        Do you happen to know which country RDNA4 will be produced in? I mean, do they depend on Taiwan or China? Because I recall some US generals expecting a war to start around 2025-2026.



        • #14
          Originally posted by skeevy420 View Post

          I saw that. The 8800 XT ought to have one hell of a price/performance ratio if the rumors around its performance and $499-$599 price are true.

          Related to AMD graphics, I just hope we, meaning PC users in general, get some sort of AI-based upscaling. Given the option, I would almost always rather play at 1440p Medium or High settings than at 720-1080p with Ultra settings upscaled to 1440p. I don't care if the upscaler is NIS, FSR, or XeSS; they all have flaws and limitations that become more and more apparent as you upscale beyond 1080p on a larger screen, especially when you're sitting at a monitor rather than a TV that's across the room.
          As a side note, oddly enough, to some people the expression "hell of a price" carries a positive connotation. In fact, AMD's pricing has been widely criticized by media outlets as a weak spot. $499 to $599 is totally overpriced and not viable for gaining more traction in the market. Nvidia has climbed to 88% market share for dGPUs, so AMD has to lower its prices to become more competitive. AMD can adjust them, and it has had to often enough:

          The MSRP of the RX 6750 XT was $549. The MSRP for the identical RX 6700 XT was $479. Both of them dropped drastically in market price.
          Currently the RX 6750 XT costs $299. For the RX 7700 XT, AMD lowered the MSRP from $449 to $419, and the market price is falling nevertheless, with heavy dips to $353 in between. So the argument of inflation, which is typically put forward by Nvidia fanboys, is definitely not an excuse for AMD here either. Nvidia especially could lower its prices, as we all know.

          We urgently need a third big competitor in the dGPU sector. Intel was a big disappointment in that regard. But maybe Arm or the very experienced Imagination Technologies may bring something competitive to the table soon. The latter's IP in Apple's M-series chips is really impressive.

          By the way, the RX 8700 XT should not come with only 12 GB of VRAM like its predecessors. For this GPU class and price, that is no longer adequate.
          Last edited by M.Bahr; 26 August 2024, 06:04 AM. Reason: some typos



          • #15
            This is also the kernel release where AMD's HDMI audio gets a patch too, right? Because the current state of it is getting pretty irritating and I can't wait for it to be over with. Granted, it's not clear whether my symptom is affected by it.



            • #16
              Originally posted by M.Bahr View Post

              As a side note, oddly enough, to some people the expression "hell of a price" carries a positive connotation. In fact, AMD's pricing has been widely criticized by media outlets as a weak spot. $499 to $599 is totally overpriced and not viable for gaining more traction in the market. Nvidia has climbed to 88% market share for dGPUs, so AMD has to lower its prices to become more competitive. AMD can adjust them, and it has had to often enough:

              The MSRP of the RX 6750 XT was $549. The MSRP for the identical RX 6700 XT was $479. Both of them dropped drastically in market price.
              Currently the RX 6750 XT costs $299. For the RX 7700 XT, AMD lowered the MSRP from $449 to $419, and the market price is falling nevertheless, with heavy dips to $353 in between. So the argument of inflation, which is typically put forward by Nvidia fanboys, is definitely not an excuse for AMD here either. Nvidia especially could lower its prices, as we all know.

              We urgently need a third big competitor in the dGPU sector. Intel was a big disappointment in that regard. But maybe Arm or the very experienced Imagination Technologies may bring something competitive to the table soon. The latter's IP in Apple's M-series chips is really impressive.

              By the way, the RX 8700 XT should not come with only 12 GB of VRAM like its predecessors. For this GPU class and price, that is no longer adequate.
              I'm not one of the people upset with AMD's current prices. All things considered, 2019 to now, I'm a lot more upset at the current cost of food than I am at luxury goods like graphics cards. Moreover, AMD doesn't have to lower their prices until NVIDIA does. As long as NVIDIA has highly inflated prices, AMD just has to be cheaper to be the better value. The fact is, because of how well CUDA works and how dependent the industry as a whole is on them, NVIDIA doesn't have to lower their prices. They basically have a monopoly on the professional market via CUDA and on the gaming market via DLXX. If AMD lowers their prices too much, their half-ass software ecosystem will become three-quarter-ass. I hope buying all these companies increases AMD's manpower so they can move their software up to at least quarter-ass.

              Even if Imagination Technologies could release a GPU as powerful as a 7900 XTX for the price of a 6700 XT, they'd still be in AMD's situation with a software stack that can't compete with NVIDIA's CUDA professionally. They'd have a GPU that could compete for gamers in the consumer market but it wouldn't likely be very usable for professional users.

              I have a 6700 XT. At 3440x1440 with games at mostly high to ultra settings, most of my games sit between 6GB and 10GB of VRAM usage. I don't think I've ever actually used the full 12GB outside of MSFS 2020 at unplayably high settings. The amount of VRAM a person needs becomes resolution-dependent at HD resolutions, and I don't play games above 1440p, so I don't need that much VRAM. If they're designing the 8800 XT/8700 XT as a native 4K GPU, it'll need more VRAM, upwards of 24GB. If it's being designed as a native 1440p GPU or an upscaled-4K GPU, it can get away with 12-16GB.

              Based on how resolution and VRAM work, I wish they'd make one or two GPUs a generation and sell RAM kits: say, a low-powered and a high-powered model, with RAM kits sold around recommended gaming resolutions (or projected AI workloads), such as 2x4GB for 1080p, 2x8GB for 1440p, and 2x16GB for 2160p. If the goal is to lower costs, they don't need to sell unnecessary models when they could turn 2 models into 6+ by selling RAM as an accessory. They'd also get double sales from a lot of people: folks buying the 2x4GB or 2x8GB kit to get started, then upgrading to a 2x16GB kit a year later. If they keep the RAM slots compatible between generations, folks can upgrade their core GPU, swap the RAM over, and get better-performing games... and then upgrade to slightly faster RAM a year later. Done right, it's both lower prices for consumers and more sales and profit for AMD.

