Intel Announces Thunderbolt 5 With 120 Gbps Bandwidth Boost


  • #11
    Originally posted by debrouxl View Post
    144 Hz, let alone 540 Hz refresh rate, for displays normally aimed at being watched by humans whose brains can't process images at rates close to those... why, just why ? Oh yeah, newer and bigger is better and more expensive, too. Sorry, I temporarily forgot about that.
    But you can also achieve 4K 144 Hz 12-bit RGB 4:4:4 without DSC over Thunderbolt 5.
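A quick back-of-the-envelope check on that claim. The blanking overhead factor and the usable-payload figures below are ballpark assumptions (rounded approximations of DP 1.4 HBR3 with 8b/10b coding and DP 2.1 UHBR20 with 128b/132b coding), not exact spec values:

```python
def video_gbps(width, height, fps, bits_per_channel, blanking_overhead=1.2):
    """Uncompressed RGB 4:4:4 video data rate in Gbit/s, with a rough
    multiplicative allowance (~20%) for blanking intervals."""
    bits_per_pixel = 3 * bits_per_channel  # three channels at full resolution
    return width * height * fps * bits_per_pixel * blanking_overhead / 1e9

rate = video_gbps(3840, 2160, 144, 12)
print(f"4K 144 Hz 12-bit RGB needs ~{rate:.1f} Gbit/s")  # ~51.6

# Approximate usable display payloads after line coding:
hbr3_payload = 25.92    # DP 1.4 HBR3, 8b/10b (what TB3/TB4 can tunnel)
uhbr20_payload = 77.37  # DP 2.1 UHBR20, 128b/132b (what TB5 can tunnel)

print("fits a TB3/TB4 DP tunnel:", rate <= hbr3_payload)    # False
print("fits a TB5 DP tunnel:", rate <= uhbr20_payload)      # True
```

So the uncompressed stream blows past what TB3/TB4 can tunnel but fits comfortably within TB5's DisplayPort bandwidth, even before the 120 Gbps Bandwidth Boost mode comes into play.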



    • #12
      Originally posted by debrouxl View Post
      144 Hz, let alone 540 Hz refresh rate, for displays normally aimed at being watched by humans whose brains can't process images at rates close to those... why, just why ?
      Oh yeah, newer and bigger is better and more expensive, too. Sorry, I temporarily forgot about that.
      Lol, lmao even. What are you, some console gamer making shitty arguments from the 2010s?

      Just because you can process images as smooth motion at 24 fps does not mean that's the limit of what your brain can handle. Ever notice how 60 fps video feels uncanny? Know why gamers consider anything below 30 fps unplayable (it gets processed as less than real time), yet would much rather have 60 or more? It's because your brain can and does process far faster, and Linus Tech Tips ran a blind experiment showing that, yes, high-refresh-rate monitors do make you a better gamer: https://www.youtube.com/watch?v=OX31kZbAXsA

      More information = better prediction calculations = better gaming. It's that simple. There's going to be a limit and diminishing returns, but try to learn about what you speak of next time before you open your mouth and shove your foot in deep.



      • #13
        Darn, and I just got 40Gbps in iperf3 on my USB4 port and now they moved the goalposts.



        • #14
          Originally posted by edwaleni View Post
          Darn, and I just got 40Gbps in iperf3 on my USB4 port and now they moved the goalposts.
          Is that a laptop?🤔



          • #15
            Originally posted by debrouxl View Post
            144 Hz, let alone 540 Hz refresh rate, for displays normally aimed at being watched by humans whose brains can't process images at rates close to those... why, just why ?
            Oh yeah, newer and bigger is better and more expensive, too. Sorry, I temporarily forgot about that.
            144 Hz is noticeable; beyond that, not really. Unless you're a really competitive gamer, but even then it's only a relatively small benefit.



            • #16
              The most important addition is missing from the article: 240W charging.



              • #17
                Originally posted by Luke_Wolf View Post
                Lol, lmao even. What are you, some console gamer making shitty arguments from the 2010s?
                Nope, I never had a games console aimed at large screens, and I seldom played on games consoles. But I did have some arguments with gamers investing (or having their parents invest) disproportionate amounts of money in contemporary high-end computer equipment that makes a questionable difference.

                Originally posted by Luke_Wolf View Post
                More information = Better Prediction Calculations = Better Gaming. It's that simple.
                And in fact, I'm even aware of that to some extent, as you can read in my second post in this topic. I'm aware that some unconscious image processing mechanisms happen at a rate beyond 24 FPS, and that smartphones' focus on maintaining 60 FPS for screen effects isn't useless.

                Originally posted by Luke_Wolf View Post
                There's going to be a limit and diminishing returns
                "Limit and diminishing returns" is one of the things I'm seemingly unsuccessfully trying to get at, using sarcasm...

                Over the past couple of years, I've spent hundreds of hours in first-person shooters on an old 1600x900 monitor, using my CPU's IGP. The framerate was noticeably below 24 FPS in a number of complex combat scenes, due to lighting, atmospheric effects, etc.
                Sure, compared to the experience I could have had on higher-end equipment, I certainly wasted ammo on opponents and took more damage from them. But investing 500+€ in a dGPU plus a high-refresh-rate monitor, or 1000+€ in a whole new computer and monitor, just so I can use less ammo, repair weapons and armor less often, or burn through less food / water / sleep / chems to heal my character, in games where ammo, weapons, armor and healing are plentiful (provided one spends enough time gathering them instead of doing the bare minimum to progress the main questline, thereby missing a huge part of the games' content), could arguably be considered a questionable use of money. Just saying.



                • #18
                  Originally posted by debrouxl View Post
                  144 Hz, let alone 540 Hz refresh rate, for displays normally aimed at being watched by humans whose brains can't process images at rates close to those... why, just why ?
                  Oh yeah, newer and bigger is better and more expensive, too. Sorry, I temporarily forgot about that.
                  There's a very simple test to prove you wrong: set your display to 60 Hz, open the Start menu, and observe the animation.

                  Now set your display to 30 Hz and do the same. Notice the stutter? If not, then go see a doctor.
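The frame-time arithmetic behind that stutter test (and behind the diminishing returns mentioned earlier) is simple enough to sketch:

```python
# Per-frame time at common refresh rates: the absolute gain per step
# shrinks fast, which is why 30 -> 60 Hz is obvious while
# 144 -> 240 Hz (let alone 540 Hz) is far more subtle.
for hz in (30, 60, 144, 240, 540):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame")
```

Going from 30 Hz to 60 Hz shaves almost 17 ms off every frame; going from 144 Hz to 240 Hz saves under 3 ms.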



                  • #19
                    Originally posted by Doomer View Post

                    So what? We stop all r&d until a certain % of marketshare is reached for a particular technology to start working on the next version?
                    That's not what I had in mind when I penned that post. I was just lamenting the status quo.

                    Originally posted by dragorth View Post

                    This is not actually as true as you might think. For example, the current gen of AMD chipsets include USB4. But they are being marketed as USB 3 because MS has in their contracts that if they want to sell Windows on it, the USB 4 implementation has to include certain of the optional USB 4 parts, such as Thunderbolt 3.

                    I am of 2 minds on this. The first part is MS is trying to fix the mess the USB folks made in the frankly fragmented standard. On the other hand, I don't like MS forcing their ideas on the rest of the world like this.

                    So, you may have USB 4 ports capable of those faster speeds but the manufacturer is forced to promote it as USB 3.
                    AMD chipsets may support USB5 for all I know. That doesn't mean consumer devices have anything faster than standard USB 3.0 (5 Gbps) ports. I have a brand-new AMD Phoenix laptop where, out of four USB ports, two are standard USB 3.0 and two are USB-C with a 40 Gbps signalling rate, but one of the latter is used for charging. I.e. there's essentially a single 40 Gbps USB-C port.



                    • #20
                      Originally posted by Vistaus View Post

                      144 Hz is noticeable, however, beyond that not really. Unless you're a really competitive gamer, but even then it will only be a relatively small benefit.
                      It's highly dependent on the person. I can't tell much past 90 Hz, while my brother can reliably tell the difference between a 144 Hz and a 240 Hz panel, no issues.

