
NVIDIA Announces The GeForce RTX 4060 Series


  • jasonjulius1122
    replied
    Originally posted by RejectModernity View Post

    Oh wow, even Nvidia shill is pissed how shitty new nvidia cards are.

    I understand that you have concerns and frustrations about the performance and pricing of the NVIDIA 4060 Ti graphics card. It can be disappointing when a new generation of hardware doesn't meet expectations or provide a significant improvement over previous models. It's important for companies to listen to consumer feedback and strive for better offerings. As for competition, having strong competition in the market often leads to better options and more affordable prices. It's always good to voice your opinions and concerns to help drive positive changes in the industry.



  • ipkpjersi
    replied
    Originally posted by TheLexMachine View Post

    All the 4060/4060 TI cards that I've looked at are using the standard 8-pin connector.
    Yeah, I don't see any reason why the 4060/4070 Ti would be using a 12VHPWR connector, they aren't going to pull anywhere near enough power for it to be necessary. They are very efficient cards.



  • TheLexMachine
    replied
    Originally posted by Paradigm Shifter View Post
    I'd actually be interested in a pair of 4060Ti 16GB cards if a manufacturer releases a version using a standard 8-pin rather than the 12VHPWR. Why, just why, on a 140W card?
    All the 4060/4060 TI cards that I've looked at are using the standard 8-pin connector.



  • drakonas777
    replied
    Originally posted by avis View Post

    How is this supporting NVIDIA? Stretching sarcasm to fit your agenda? Whoa, didn't know modern politics and their tricks have found their way into the normal people discourse.

    Oh, wait, you didn't even understand it was sarcasm. Sorry, not interested in your pathetic disgusting stupid insinuations and vapid accusations.

    Never ever in my entire life I've said or implied "NVIDIA is good". I don't fucking care.

    God, everything you wrote is so mean and stupid it's just cringe.
Yeah, yeah, I know. You forgot to mention adding me to the ignore list. That is usually what happens when you face facts and logic - you just ignore them. Pretending to be a victim of the "toxic Linux community" after writing some illogical, nonsensical bullshit is your typical behavior, and everyone in this forum is well aware of it.

The only cringe thing is your reasoning. I really can't understand how you lack even basic logic in it. Like measuring the worth of people's opinions based on the number of Linux patches they made. As if the number of patches automatically makes you right or wrong...

Anyway, I have no interest in continuing this off-topic discussion. Answering your shitposts is quite boring, to be frank.
    Last edited by drakonas777; 20 May 2023, 09:40 AM.



  • avis
    replied
    Originally posted by drakonas777 View Post

    For example, one of your latest posts in recent new comments:



This kind of non-argument fanboy shit-posting is practically a calling card of yours. Also, you do this very early in the comment thread and then complain that somebody dared to answer you with another not-serious post, because, you know, people should meet this kind of garbage take with respect and seriousness. Fucking LOL
    How is this supporting NVIDIA? Stretching sarcasm to fit your agenda? Whoa, didn't know modern politics and their tricks have found their way into the normal people discourse.

    Oh, wait, you didn't even understand it was sarcasm. Never ever in my entire life I've said or implied "NVIDIA is good". Sorry, I'll see myself out.
    Last edited by avis; 20 May 2023, 02:52 PM.



  • drakonas777
    replied
    Originally posted by avis View Post
    Citations needed. I've been accused of that a hundred times already without a single proof.
    For example, one of your latest posts in recent new comments:

    Originally posted by avis
You cannot say good things about NVIDIA on Phoronix. It's an evil scummy company and only AMD should be mentioned here.


This kind of non-argument fanboy shit-posting is practically a calling card of yours. Also, you do this very early in the comment thread and then complain that somebody dared to answer you with another not-serious post, because, you know, people should meet this kind of garbage take with respect and seriousness. Fucking LOL



  • oleid
    replied
    Originally posted by avis View Post

    You think NVIDIA and Microsoft care about Linux?
Actually, yes. Most of Nvidia's professional hardware runs on Linux systems. And Microsoft cares so much about Linux that they integrate it directly into their operating system.

    Originally posted by avis View Post
    Tell me how to enable HDR in Linux? Tell me how to enable display scaling with a single command (which works both for graphics output and console)? Tell me how install any random application from the net without compiling/chrooting/virtualization? You know something akin to downloading an exe and launching it?
It is really amusing how clueless you are. It seems as if you last ran Linux a decade ago.
    Last edited by oleid; 19 May 2023, 04:52 PM.



  • rob-tech
    replied
Wow, this really is garbage from Nvidia (or NGreedia, as I call it): such a small uplift with a crippled 8 GB of VRAM, which is not enough in 2023 even on a 1080p card. Just the other day I was playing Forza Horizon 5 on the highest Extreme graphics preset with my 5700 XT (paired with a 5950X) at 1080p, and while the performance was great at a smooth 75 fps even in the most demanding scenes, the VRAM was full with 8000+ MB of usage. This is a 2019 card that I have enjoyed for many years and that still serves my 1080p gaming needs perfectly; however, it is clear that soon I will have to start turning down settings from the highest Ultra presets. I would have to be some deranged lunatic to get a GPU now with only 8 GB - at least 16 GB, and preferably 24 if the monitor is upgraded to 4K.



  • blacknova
    replied
    Originally posted by sophisticles View Post

I would favor a shift in PC gaming, where games started simply focusing on higher-quality textures and cards started using slower RAM that is cheaper and less power-hungry while adding gobs of it.
Might not work, since you still need fast RAM for good effects, ray tracing, multiple rendering passes, etc. BUT finally using unified memory between GPU and CPU, with an on-board GPU "cache" (for the framebuffer and render targets), might work. Current games just underutilize system memory.



  • sophisticles
    replied
    Originally posted by Melcar View Post
    For a 1080p card 8GB is enough these days and probably for the near future. If you want more then you go to a higher tier with more VRAM. 8GB should be the minimum however, and this should be standard on any card of this type. Price will be the big factor here and for these kind of cards the target should really be $250-$300.
Unless you want to play RE4 Remake with all the settings maxed out at 1080p, which then eats up 12 GB of VRAM and still wants more:

    In this video I want to prove that you can't max out the game's graphical settings even if you have a GPU with 12GB of VRAM. Even at 1080p this is not possib...


The game is very impressive to see in action, but the high-quality textures and all the visual effects eat up VRAM like it's going out of style.

To a lesser extent, RE Village is the same way, and there is one other game that also loves VRAM, but I can't remember the name.

This may be a sign of things to come, with games demanding more and more VRAM, and honestly that may be a good thing.

Cards have been focusing on using faster and faster RAM, and game engines have been focusing on using more aggressive AA and AF modes, but the best thing for visual quality is higher-quality textures, and that requires a lot of VRAM.

I would favor a shift in PC gaming, where games started simply focusing on higher-quality textures and cards started using slower RAM that is cheaper and less power-hungry while adding gobs of it.

