
NVIDIA Unveils $2,499 USD TITAN RTX


  • skeevy420
    replied
    Originally posted by schmidtbag View Post
    For me personally, 4K is probably all I'm ever going to want/need. I currently have a 40" 1080p display and sit about 1m away when playing games. For some games, it looks good enough, even without AA. But for other games, they look bad even with AA (particularly ones where you need to look farther ahead, like racing games). I figure 4K ought to be plenty sufficient, I'm just waiting for a GPU capable of 4K (without AA) that costs less than $500 and doesn't consume excessive wattage. Preferably AMD, since I want to support their open-source efforts.
    2K, 21:9 myself. It would need that same magical GPU you're waiting for. There are a lot of extra pixels to push for ultrawide; I'd need something capable of comfortably pushing roughly 6K 16:9 to play 4K 21:9.
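    A quick back-of-the-envelope comparison of the raw pixel counts involved (the exact "4K 21:9" and "6K" figures below are assumptions, since those labels get used loosely):

        # Rough pixel counts; resolution picks are illustrative, not official.
        resolutions = {
            "1080p (16:9)":        (1920, 1080),
            "1440p (16:9)":        (2560, 1440),
            "4K UHD (16:9)":       (3840, 2160),
            "4K ultrawide (21:9)": (5120, 2160),
            "6K-ish (16:9)":       (5760, 3240),
        }
        for name, (w, h) in resolutions.items():
            print(f"{name:>22}: {w * h / 1e6:5.1f} megapixels")

    Ultrawide "4K" works out to roughly a third more pixels than regular UHD, so it sits between the 4K and hypothetical 6K workloads.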



  • skeevy420
    replied
    Originally posted by schmidtbag View Post
    Yup, pretty much, though it is worth pointing out that your screen size and your distance from the display make a big difference in all of that. A 24" 2K display where you're sitting maybe 2 ft away isn't going to need AA. But at 36" at the same distance, you're probably going to notice some jaggies.
    You know, I've noticed that I don't really notice the jaggies when I play games on my 1080p 49" TV across the room, yet in the same game on the same PC I will notice them on a 1080p 23" monitor a foot or two away. You'd think it would be the opposite...more DPI, fewer jaggies, but in my experience that's really only true for text.

    I thought about saying the same, but I assumed most of us here are familiar with the size/distance stuff.
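    For what it's worth, the size/distance effect can be put in numbers as pixels per degree of visual angle. A rough sketch (the viewing distances are just guesses for the scenario above):

        import math

        def pixels_per_degree(diag_in, res_w, res_h, distance_in):
            """Horizontal pixels per degree of visual angle for a flat panel."""
            width_in = diag_in * res_w / math.hypot(res_w, res_h)
            fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
            return res_w / fov_deg

        # 49" 1080p TV viewed from about 8 ft vs 23" 1080p monitor at about 2 ft
        print(pixels_per_degree(49, 1920, 1080, 96))   # ~77 px per degree
        print(pixels_per_degree(23, 1920, 1080, 24))   # ~42 px per degree

    The distant TV packs more pixels into each degree of your vision, so every stair-step is smaller to your eye, which lines up with the "less jaggies on the big TV across the room" observation.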



  • skeevy420
    replied
    Originally posted by dungeon View Post

    Just buy a next-gen gaming console when they appear, like the PlayStation 5 / Xbox or whatever
    I really doubt that either will be able to run or replace my current Antergos desktop. I'm gonna get one, but I don't think a gaming console will run a Linux desktop (or run one well enough to replace an actual PC...like the PS3).

    I've always suspected gaming consoles are why we can't get a nice desktop APU. That would just make it too damn easy for a common user to build or buy a system around. No thinking "is my CPU good enough", no "is my GPU good enough"...just "check it out, they released a new APU that upgrades my entire system with one plug and play component".



  • schmidtbag
    replied
    Originally posted by skeevy420 View Post
    For me, once the resolution gets to 1440p, AA starts to be almost unnecessary. The whole point of AA, at least SSAA, is to make a 640x480 game look like it was rendered at 2K or 4K, to make a low resolution appear to be a high resolution. Once you're actually rendering at the resolutions we used to use SSAA to simulate way back when, what's the point? To make a 4K game appear to be a 16K game? That just seems like a waste of resources that could be spent on shadows and other spiffy effects.
    Yup, pretty much, though it is worth pointing out that your screen size and your distance from the display make a big difference in all of that. A 24" 2K display where you're sitting maybe 2 ft away isn't going to need AA. But at 36" at the same distance, you're probably going to notice some jaggies.

    EDIT
    For me personally, 4K is probably all I'm ever going to want/need. I currently have a 40" 1080p display and sit about 1m away when playing games. For some games, it looks good enough, even without AA. But for other games, they look bad even with AA (particularly ones where you need to look farther ahead, like racing games). I figure 4K ought to be plenty sufficient, I'm just waiting for a GPU capable of 4K (without AA) that costs less than $500 and doesn't consume excessive wattage. Preferably AMD, since I want to support their open-source efforts.
    Last edited by schmidtbag; 03 December 2018, 03:08 PM.



  • skeevy420
    replied
    Originally posted by schmidtbag View Post
    I totally agree. I really wish hardware reviewers would compare performance at 4K without AA more often. 4K without AA is better than 1080p with it, but we're all led to believe that $700 is the "entry level" point for 4K.
    For me, once the resolution gets to 1440p, AA starts to be almost unnecessary. The whole point of AA, at least SSAA, is to make a 640x480 game look like it was rendered at 2K or 4K, to make a low resolution appear to be a high resolution. Once you're actually rendering at the resolutions we used to use SSAA to simulate way back when, what's the point? To make a 4K game appear to be a 16K game? That just seems like a waste of resources that could be spent on shadows and other spiffy effects.
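    To put some numbers on that: SSAA just renders at a multiple of the output resolution and downsamples, so the raw pixel cost looks roughly like this (illustrative only):

        # SSAA renders at a multiple of the output resolution, then downsamples.
        def ssaa_render_pixels(out_w, out_h, factor):
            return out_w * out_h * factor  # factor 4 = 2x2 supersampling

        print(ssaa_render_pixels(1920, 1080, 4))   # 8,294,400 - same pixel load as native 4K
        print(3840 * 2160)                         # 8,294,400 - native 4K, no AA
        print(ssaa_render_pixels(3840, 2160, 4))   # 33,177,600 - 4x SSAA at 4K is 8K territory

    So 1080p with 4x SSAA already costs about as much raw fill as native 4K; doing the same on top of 4K is where the "waste of resources" argument kicks in.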



  • dungeon
    replied
    Originally posted by skeevy420 View Post
    Personally, I'd like AMD to use HBM with their APUs. We all know that the ram is the bottleneck with APU graphics performance. Give them 8GB of HBM with DDR4 used like a swap partition; give an APU 16GB of HBM and DDR4 wouldn't be necessary...provided there are actually drivers for using HBM as general system ram.

    Dear AMD,

    I'd like 6-10 of whatever your next-gen cores are, Navi or better for graphics, with 16GB of HBM2 attached for your next desktop APU. We'd probably need a new socket and motherboards to support it. That's fine. A lot of us would be perfectly content with that APU for $400-$600 if it could play most of our games at 1080p120 to 1440p60. Seriously, please combine your next gen $150 CPU & $250 GPU models with some HBM2 to negate DDR4 graphics lag so we can have a real desktop & gaming grade APU on the consumer level.

    Thanks
    Just buy a next-gen gaming console when they appear, like the PlayStation 5 / Xbox or whatever

    When these appear, PCs will have moved to DDR5 memory (around late 2020 or early 2021, I guess), so you won't need it anyway. By then, you will get a classic APU running on the system's DDR5.

    Everything is limited really; we will always have bottlenecks and boundaries
    Last edited by dungeon; 03 December 2018, 03:04 PM.



  • schmidtbag
    replied
    Originally posted by M@GOid View Post
    Well, I personally like, if the card allows it, to keep the textures high while lowering the illumination and shader effects.
    Lowering shader effects a notch is usually a pretty good idea - it tends to offer a noticeable performance improvement for a very minor visual detriment. Although I haven't experienced AAA gaming in 4K, I have found that in 1080p, putting my face right up to a wall yields almost no perceivable difference in texture when going from "ultra" to "high" in some cases. I'm sure that difference could be noticed in 4K, but how often are you pinning your face against the wall?
    But since you talked about 4K/60fps, you know what pisses me off to no end? People taking an expensive top-of-the-line card to play 4K, setting the anti-aliasing to 8x and then claiming there is not a single card on the market capable of 4K/60fps...
    I totally agree. I really wish hardware reviewers would compare performance at 4K without AA more often. 4K without AA is better than 1080p with it, but we're all led to believe that $700 is the "entry level" point for 4K.



  • skeevy420
    replied
    Originally posted by dungeon View Post

    The RTX 2070 is a normal card, there is no die space wasting nor tricks there - the chip is not so big, nor are there Ti gimmicks

    That is nothing crazy, Navi would likely target that. Only your $299 price for a top future AMD GPU sounds crazy
    $299 isn't that crazy. Vega 56 would likely be around that price if it weren't for HBM2. A GDDR5-backed Vega 48 would still compete with the 1070...and make the 56 & 64 pointless from a cost/performance perspective.

    Personally, I'd like AMD to use HBM with their APUs. We all know that the ram is the bottleneck with APU graphics performance. Give them 8GB of HBM with DDR4 used like a swap partition; give an APU 16GB of HBM and DDR4 wouldn't be necessary...provided there are actually drivers for using HBM as general system ram.

    Dear AMD,

    I'd like 6-10 of whatever your next-gen cores are, Navi or better for graphics, with 16GB of HBM2 attached for your next desktop APU. We'd probably need a new socket and motherboards to support it. That's fine. A lot of us would be perfectly content with that APU for $400-$600 if it could play most of our games at 1080p120 to 1440p60. Seriously, please combine your next gen $150 CPU & $250 GPU models with some HBM2 to negate DDR4 graphics lag so we can have a real desktop & gaming grade APU on the consumer level.

    Thanks
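    Rough bandwidth arithmetic behind the "ram is the bottleneck" point, using typical (assumed) speeds rather than any specific product:

        # Peak bandwidth = bus width in bytes * transfer rate (GT/s). Numbers are ballpark.
        def peak_bandwidth_gbs(bus_bits, gt_per_s):
            return bus_bits / 8 * gt_per_s

        print(peak_bandwidth_gbs(128, 3.2))    # dual-channel DDR4-3200: ~51 GB/s, shared by CPU and iGPU
        print(peak_bandwidth_gbs(1024, 2.0))   # a single HBM2 stack: ~256 GB/s
        print(peak_bandwidth_gbs(256, 8.0))    # 256-bit GDDR5 at 8 Gbps (1070-class card): ~256 GB/s

    That roughly five-fold gap is why an APU fed only by DDR4 runs out of graphics headroom long before its shaders do.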



  • tildearrow
    replied
    Originally posted by L_A_G View Post

    Kind of goes to show how much air there is in the price of that Quadro.
    What about double-precision float performance?



  • M@GOid
    replied
    Originally posted by schmidtbag View Post
    You can compete with modern consoles for a similar price. Consoles can't play 4K@60FPS. Hell, many games can barely do 1080p@60FPS with max detail.
    You don't have to buy the best to have a good experience. Nvidia is capitalizing on the fact that people feel the need to one-up each other. Meanwhile, I'm still using an R9 290 that continues to get performance improvements and new features and handles most Linux games at 1080p, 60FPS, max detail. I paid 11% of what the Titan RTX costs for that GPU, got some free games with it too, and it still gives me a better gaming experience than consoles.

    Things aren't so bleak, people just need to lower their expectations. If you're ok with 30FPS, 4K gaming is actually surprisingly affordable. There's more to gaming than having the best FPS.

    EDIT:
    It's worth pointing out that if you are playing modern games in 1080p or 1440p, you can turn the texture detail down a notch from ultra and have little to no visible difference. Those extra high-res textures are only practical for 4K and when looking at something point-blank. Although changing texture detail doesn't have a big impact on overall GPU compute performance, it does have a heavy impact on VRAM usage, and once that gets fully used up, you'll start to get a noticeable performance hit.
    Well, I personally like, if the card allows it, to keep the textures high while lowering the illumination and shader effects.

    But since you talked about 4K/60fps, you know what pisses me off to no end? People taking an expensive top-of-the-line card to play 4K, setting the anti-aliasing to 8x and then claiming there is not a single card on the market capable of 4K/60fps...
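    On the texture/VRAM point quoted above, a rough estimate of what a single texture costs (the sizes and the 1-byte-per-texel compressed case are illustrative assumptions):

        # Uncompressed RGBA8 is 4 bytes per texel; a full mip chain adds about 1/3 on top.
        def texture_mib(width, height, bytes_per_texel, mips=True):
            base = width * height * bytes_per_texel
            return base * (4 / 3 if mips else 1) / 2**20

        print(texture_mib(2048, 2048, 4))   # ~21 MiB "high" texture, uncompressed
        print(texture_mib(4096, 4096, 4))   # ~85 MiB "ultra" texture, uncompressed
        print(texture_mib(4096, 4096, 1))   # ~21 MiB at 1 byte/texel (BC7-style compression)

    Multiply that by the hundreds of textures a modern scene keeps resident and it is easy to see how one notch of texture quality can swing VRAM usage by gigabytes while barely touching shader load.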

