PCI Express 7.0 Specification Announced - Hitting 128 GT/s In 2025


  • #11
    Well well, AMD will be happy, because they are already prepared for a future x400/x500 XT card with 0.5 lanes XXDDD, since 1 lane is too much for users. (Lisa Su asks: who is thinking of our shareholders and of the next Lisa Su Bugatti?)

    Last edited by pinguinpc; 21 June 2022, 07:14 PM.



    • #12
      Originally posted by s_j_newbury View Post
      There really isn't any obvious use-case that I can see for such bandwidth, at least outside of HPC or scientific data acquisition, perhaps
      And the hyperscalers (which you may be including in your HPC group). But those are the customers that drive the industry forward, because they do have a need and they spend a lot of money ("follow the money" works for both law enforcement and system design).
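      For scale, here is a rough back-of-the-envelope sketch (Python) of what 128 GT/s buys an x16 slot. The efficiency figures are simplified assumptions, not spec text: 128b/130b for the NRZ generations, with FLIT/FEC overhead ignored for the PAM4 ones.

      # Approximate per-direction PCIe throughput by generation.
      # Encoding efficiencies are simplifications (assumptions, not spec text).
      GENERATIONS = {  # name: (GT/s per lane, encoding efficiency)
          "PCIe 4.0": (16, 128 / 130),   # NRZ, 128b/130b
          "PCIe 5.0": (32, 128 / 130),   # NRZ, 128b/130b
          "PCIe 6.0": (64, 1.0),         # PAM4 + FLIT, overhead ignored
          "PCIe 7.0": (128, 1.0),        # PAM4 + FLIT, overhead ignored
      }

      LANES = 16
      for gen, (gts, eff) in GENERATIONS.items():
          gb_per_lane = gts * eff / 8  # GB/s per lane, one direction
          print(f"{gen}: {gb_per_lane:5.2f} GB/s/lane, "
                f"x{LANES} ~ {gb_per_lane * LANES:6.1f} GB/s")

      That lands a PCIe 7.0 x16 link at roughly 256 GB/s per direction, a number that today mostly makes sense for hyperscaler NICs and accelerator interconnects.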



      • #13
        Originally posted by zexelon View Post

        LOL yes, I look forward to the Nvidia 6090... comes complete with an NVMe port and an optional CPU socket.
        And a small nuclear reactor to power it!



        • #14
          If only we could get a new version of Thunderbolt with each new version of PCIe.



          • #15
            Originally posted by tildearrow View Post
            Come on, it's too early! I don't think there are any consumer PCIe 5.0 devices on the market...
            But there is a huge need for PCIe 10.0. One word: snaps

            Without huge speedups in single-threaded decoding speed (CPU) and PCIe bandwidth, future Ubuntu versions will launch apps slower and slower. Even today, PCIe 4.0 and a latest-gen Intel i9 are on par with spinning-rust HDDs and a 1st-gen Core i7. There's a desperate need for faster hardware. It won't take long before the apps inside the snaps are 100% JS (Node+Electron) based, slowing down the system even more.
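            The complaint is measurable: launching an app from a compressed image is bounded by single-thread decompression, not the link. A minimal sketch, using stdlib zlib as a stand-in for whatever codec a snap's squashfs actually uses (that substitution is an assumption):

            import time
            import zlib

            # ~63 MiB of highly compressible stand-in payload.
            raw = b"electron app payload " * (3 << 20)
            blob = zlib.compress(raw, 6)

            t0 = time.perf_counter()
            out = zlib.decompress(blob)
            dt = time.perf_counter() - t0
            print(f"{len(out) / 2**20:.0f} MiB in {dt:.3f}s "
                  f"({len(out) / 2**30 / dt:.2f} GiB/s, single thread)")

            On typical hardware that single-thread figure sits well below what even a modest NVMe drive can stream, so a faster link alone would not make snaps launch faster.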



            • #16
              Originally posted by theriddick View Post

              And a small nuclear reactor to power it!
              Oh no no no, you won't need more than a lightweight battery after you get rid of all the other power hogs in the computer... I mean, the 4090 is rumored to consume up to 800W... so clearly it's the CPU and RAM that are pushing the power draw over that value



              • #17
                I think this spec might mostly be used for glue-logic chiplets for accelerators that would normally hit latency bottlenecks in slots: AI, media engines, really big iGPUs that take over Nvidia's xx70 tier.



                • #18
                  Originally posted by theriddick View Post

                  And a small nuclear reactor to power it!
                  Well, fusion has always been 30 years away. If nVidia can deliver it sooner, I'm all for that.



                  • #19
                    Originally posted by Jahimself View Post
                    lol, in 2032 a PC will be a graphics card and maybe some RAM, and that should do it.
                    A GPU, and a DPU for my 100-gig networking and OS. RDMA to the NAS's NVMe SSDs to load the OS and all my files from.



                    • #20
                      Originally posted by caligula View Post

                      But there is a huge need for PCIe 10.0. One word: snaps

                      Without huge speedups in single-threaded decoding speed (CPU) and PCIe bandwidth, future Ubuntu versions will launch apps slower and slower. Even today, PCIe 4.0 and a latest-gen Intel i9 are on par with spinning-rust HDDs and a 1st-gen Core i7. There's a desperate need for faster hardware. It won't take long before the apps inside the snaps are 100% JS (Node+Electron) based, slowing down the system even more.
                      Eventually Canonical will rewrite the Linux kernel in JavaScript, and then...

                      ...off-topic much? ;p

