NVIDIA Announces The GeForce RTX 40 Series With Much Better Ray-Tracing Performance


  • oiaohm
    replied
Seeing the RTX 4080 12G vs 16G models could be one of the reasons for EVGA's really bad reaction. Less memory, less memory bandwidth and fewer GPU cores... Also note that Nvidia now claims both the 12G and 16G RTX 4080 are meant to be 2 to 4 times faster than the equivalent older-generation card. It gets better: Nvidia will not be making a reference 4080 12G. This looks like defective silicon that AIBs will be told to make into the RTX 4080 12G, and when it does not perform as well as Nvidia has made out, the AIBs will be left holding the bag.


    https://www.digitaltrends.com/comput...0-vs-rtx-3080/

Note that the 3080 Ti and 3080 of the past generation have the same memory bandwidth. Yes, the 3080 has fewer cores and slightly less RAM than the 3080 Ti, but the same memory bandwidth.

In a lot of ways, given the differences between the RTX 4080 12G and RTX 4080 16G, these should have had different model numbers to prevent consumer confusion. Yes, Nvidia themselves will not have to deal with that consumer confusion because they are not making the RTX 4080 12G version. This is a real lack of respect for what AIBs have to deal with, and it is Nvidia who sets the model numbers.
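The bandwidth gap between the two 4080s is easy to quantify with a back-of-envelope calculation. A sketch, assuming the 192-bit @ 21 Gbps (12G) and 256-bit @ 22.4 Gbps (16G) configurations from Nvidia's announced specs — verify against the official spec sheets:

```python
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gb_s(192, 21.0))   # RTX 4080 12G: 504.0 GB/s
print(mem_bandwidth_gb_s(256, 22.4))   # RTX 4080 16G: 716.8 GB/s
```

That is roughly a 40% bandwidth difference between two cards sharing the "4080" name.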



  • shmerl
    replied
    Originally posted by ryao View Post

    Nvidia previously said that such technology would be needed to reach 1kHz refresh rates.

Why would you need such a refresh rate at the cost of reduced quality, no matter the technology? I don't get the appeal. I'd take whatever refresh rate the GPU can handle natively without any upscaling, as long as it gives better image quality.



  • ryao
    replied
    Originally posted by birdie View Post

It's not just motion interpolation: DLSS 3.0 renders almost-full frames without using any classic GPU/CPU instructions, which will result in an enormous performance increase and power savings. I'm not sure the tech will work for all games.

Why "almost"? Because in motion the image changes and there's new content (not seen before on screen) that has to be rendered anyway using the GPU/CPU. But depending on how good DLSS 3.0 is, you only have to properly render the percentage of the frame that has changed between DLSS interpolations. In games with little motion, that means you can use just DLSS 3.0 instead of computing frames on the GPU/CPU.

DLSS 3.0 looks like alien tech; I've heard about it before, but no one imagined anyone could make it work.

As much as people here hate NVIDIA (for reasons which I personally don't understand at all because they are absolutely illogical; let me repeat once again: NVIDIA does not owe anyone Linux support or open-source Linux drivers), the company has basically invented the modern game graphics pipeline, from programmable shaders through RTX/RTRT.
    Nvidia previously said that such technology would be needed to reach 1kHz refresh rates.
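The partial-frame idea quoted above can be illustrated with a toy sketch: given the last rendered frame and a per-pixel motion-vector field, an intermediate frame is synthesized by sampling each output pixel half a motion vector back. This is only my illustration of the general principle; actual DLSS 3 frame generation relies on dedicated optical-flow hardware and a neural network, and the function below is hypothetical:

```python
import numpy as np

def warp_half(frame, flow):
    """Synthesize an intermediate frame by backward-warping: each output
    pixel samples the source frame half a motion vector back.
    Toy version: nearest-neighbor sampling, no occlusion handling."""
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    for y in range(h):
        for x in range(w):
            dy, dx = flow[y, x]
            sy = min(max(int(round(y - dy / 2)), 0), h - 1)
            sx = min(max(int(round(x - dx / 2)), 0), w - 1)
            out[y, x] = frame[sy, sx]
    return out

# 1x4 "frame" with one bright pixel, everything moving 2 px to the right:
frame = np.array([[0, 255, 0, 0]], dtype=np.uint8)
flow = np.zeros((1, 4, 2))
flow[..., 1] = 2.0  # (dy, dx) motion per pixel
print(warp_half(frame, flow))  # bright pixel appears halfway along its path
```

The hard part, as the quoted post notes, is the newly revealed content a pure warp cannot know about, which is where the neural network comes in.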



  • theriddick
    replied
    Originally posted by superpenguin View Post
I found the launch quite impressive, and I initially planned to upgrade to this gen from a 1080, but I can't find a product in this lineup that fits me perfectly.

END_OF_PERSONAL_RAMBLING
You'll have to wait for a 4050 or maybe a lower-end 4060... Q1 announcements, I suspect.



  • geearf
    replied
    Originally posted by Teggs View Post
    In the fields of mathematics and science in general professional respect is shown by referring to a person by their last name. Kepler, Volta, Maxwell, Pascal, Turing... Nvidia would have people believe that they chose Ada Lovelace to name an architecture after out of respect for her accomplishments, but by concentrating on her gender by using her first name they show disrespect for her work as a scientist. I suppose they don't care about any damage they do to the effort to value scientific work apart from the gender of the scientist doing said work.

    I believe it is correct to refer to this architecture as 'Lovelace' no matter what Nvidia says.
    Maybe it was a hint at ADA from Cardano as well?



  • abu_shawarib
    replied
Overall, Nvidia is making some bold claims about performance this gen, which I'll wait for the press to verify.
    DLSS 3 seems impressive from a technical POV alone, but I'm not sure right now how easy it is to implement, to side-load into an unsupported game, or to use on Linux.



  • abu_shawarib
    replied
    Originally posted by [email protected] View Post
Boy, does Jensen have a narcissistic photo for the video... "Look at me, I'm the best Taiwanese CEO, not that witch at the other company"...
    It's a normal picture, just chill.



  • WannaBeOCer
    replied
    Originally posted by Jabberwocky View Post

Ah yes... I remember now: GTA 6 is going to require lots of compute and machine learning capabilities. Video RAM, power draw and price aren't relevant anymore. Besides, most of the gaming market is dominated by 3090s and 3080s and totally not 1060s or 1650s. /s
Of course you're trolling; I can play too. This is the first generation that surpasses 40 TFlops.

    Tim Sweeney says 40 TFLOPS is required for photo-realistic games

Read more: https://www.tweaktown.com/news/53045.../index.html
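For context on where the 40 TFLOPS threshold sits, theoretical FP32 throughput is just cores × clock × 2 flops per FMA per clock. A sketch using my reading of the announced RTX 4090 figures (16384 CUDA cores, ~2.52 GHz boost — check the spec sheet), with a GTX 1080 for scale:

```python
def fp32_tflops(shader_cores, boost_clock_ghz):
    """Theoretical FP32 throughput: each shader core retires one FMA
    (2 floating-point ops) per clock."""
    return shader_cores * boost_clock_ghz * 2 / 1000

print(round(fp32_tflops(16384, 2.52), 1))   # RTX 4090: ~82.6 TFLOPS
print(round(fp32_tflops(2560, 1.733), 1))   # GTX 1080, for scale: ~8.9 TFLOPS
```

By that measure this generation clears Sweeney's 40 TFLOPS bar roughly twice over, at least on paper.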



  • jrch2k8
    replied
Well, I just bought an RX 6600 for $220; that will be enough for at least two years.



  • Jabberwocky
    replied
    Originally posted by WannaBeOCer View Post

I agree. I was shocked seeing people buy RDNA2 GPUs at Ampere pricing at launch, before the mining craze. AMD stripped out a ton of functionality from RDNA compared to Vega to make a pure gaming architecture, then priced it the same as Ampere; yet gamers were still paying the same price as for a GPGPU card. They're not ridiculously priced considering the amount of compute performance they provide. The RTX 4090 has 100 TFlops of compute, and that's not even talking about the machine learning capabilities. Look up research publications and you'll see a ton of RTX consumer cards used; then try to find an RDNA GPU listed in a publication.
Ah yes... I remember now: GTA 6 is going to require lots of compute and machine learning capabilities. Video RAM, power draw and price aren't relevant anymore. Besides, most of the gaming market is dominated by 3090s and 3080s and totally not 1060s or 1650s. /s

