PCI Express 4.0 Is Ready, PCI Express 5.0 In 2019


#21
Originally posted by waxhead View Post
If it was up to me the entire GPU should just go away and the CPU should be powerful enough so that we don't NEED GPUs. Perhaps when we have enough cores there is no point in a dedicated graphics processor at all.
That's a bit of an unrealistic expectation. GPUs have thousands of tiny and crappy cores; CPUs have fewer than 20 rather powerful ones. You can't turn one into the other; they are pretty much the opposite of each other.

And if you build something in the middle of the two extremes, your system still gets beaten senseless by a CPU + GPU combination.

#22
Originally posted by starshipeleven View Post
Since you still don't get it I'll try breaking it down for you:
Because routing 500+ watts through the motherboard is not very practical, and somewhat dumb when you could just use cables instead. So you won't get ports that can provide any significant amount of power either.
It's also harder to dissipate that much power from the mobo, as there's less room for heatsinks and fans that way. Liquid cooling could be an option, but it's rather expensive as a default.

#23
Originally posted by starshipeleven View Post
That's a bit of an unrealistic expectation. GPUs have thousands of tiny and crappy cores; CPUs have fewer than 20 rather powerful ones. You can't turn one into the other; they are pretty much the opposite of each other.

And if you build something in the middle of the two extremes, your system still gets beaten senseless by a CPU + GPU combination.
Of course! I was not talking about a realistic scenario at all. At least not for the foreseeable future. It was the principle I was trying to get across.

        http://www.dirtcellar.net

#24
Originally posted by waxhead View Post
Of course! I was not talking about a realistic scenario at all. At least not for the foreseeable future. It was the principle I was trying to get across.
I think the previous-gen Intel "iGPUs" (before Sandy Bridge) were located on a chip on the motherboard. The idea is not new. It's just not practical with the current pace of GPU performance development.

#25
Originally posted by M@yeulC View Post
Using a x16 connector, though. But as long as you don't need to send the data back to another graphics chipset, stream textures from memory, or do other interesting stuff, you should be fine, if I understand correctly.
You appear to misunderstand. ExpressCards are hot-swappable expansion cards for laptops.
They aren't common anymore though, probably due to being stuck at PCIe 2.0 and the trend of laptops being thin and stripped of hardware features.

#26
Originally posted by starshipeleven View Post
I thought computing wasn't so bandwidth-intensive, but I'm no expert. I've only seen mining rigs, password-cracking rigs and similar consumer-grade stuff where there isn't much bandwidth usage.
That's mainly because there isn't enough bandwidth there to do anything more complex. One of the biggest issues with GPGPU compute is bandwidth and latency. More complex applications often have to repeatedly copy memory back and forth between system RAM and GPU VRAM, so that some calculations execute on the CPU, get sent to the GPU, and come back to the CPU again, in a very tight loop. Hence, in a lot of applications more complex than crypto-mining, GPU utilization is limited. If you've ever run some BOINC applications on your GPU, you'd easily see what I mean. You can buy an expensive GPU, but there's only enough bandwidth to feed maybe 10-20% of the total power of your card.
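
To make it concrete, here's a toy CUDA sketch of that ping-pong pattern (the kernel, sizes and iteration count are made up, not from any real application). Every pass pays for two PCIe copies:

    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    __global__ void step_kernel(float *data, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] = data[i] * 1.0001f + 0.5f;  // trivial per-element work
    }

    int main(void) {
        const int N = 1 << 20;   // ~4 MB of floats per transfer
        const int ITERS = 1000;
        size_t bytes = N * sizeof(float);

        float *host = (float *)malloc(bytes);
        for (int i = 0; i < N; i++) host[i] = (float)i;

        float *dev;
        cudaMalloc(&dev, bytes);

        for (int it = 0; it < ITERS; it++) {
            // host -> device copy over PCIe
            cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
            step_kernel<<<(N + 255) / 256, 256>>>(dev, N);
            // device -> host copy so the CPU can do its share of the work
            cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
            for (int i = 0; i < N; i++) host[i] += 1.0f;  // the "CPU step"
        }

        printf("host[0] = %f\n", host[0]);
        cudaFree(dev);
        free(host);
        return 0;
    }

Profile a loop like this and the two cudaMemcpy calls typically dominate; the arithmetic itself is almost free. The bus, not the GPU, sets the pace.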

#27
Originally posted by Electric-Gecko View Post
You appear to misunderstand. ExpressCards are hot-swappable expansion cards for laptops.
They aren't common anymore though, probably due to being stuck at PCIe 2.0 and the trend of laptops being thin and stripped of hardware features.
ExpressCards aren't common anymore because Thunderbolt can provide x4 PCIe, USB 3.1, and DisplayPort over the same connector - there is no need to update the ExpressCard standard, it's been superseded.

#28
Originally posted by starshipeleven View Post
I thought computing wasn't so bandwidth-intensive, but I'm no expert. I've only seen mining rigs, password-cracking rigs and similar consumer-grade stuff where there isn't much bandwidth usage.
I've been working on a password cracker. I can transfer gigs to the GPU to process quite fast, even more so if I get Ryzen, but the limiting factor right now is probably RAM; my 1070 only has 8GB and that fills up too fast.

Oddly, the most expensive part is transferring back to CPU land. I'm using a library that JIT-compiles compute kernels for OpenCL or CUDA, which has been great, but extracting only the results I need has been the slowest task so far. My CPU can filter the results much faster on a single core/thread, oddly enough, but then transferring the data to be filtered takes just as long, if not longer.

Pretty sure the GPU could filter this efficiently, so hopefully the devs of the library I'm using can add support for that. I just moved string generation onto the GPU and it's super fast compared to the CPU, since there's no longer any need to send gigs of data to the GPU constantly.
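
The idea, very roughly (a hypothetical CUDA sketch, not my actual code; check_candidate, the charset and the buffer sizes are all made up): each thread decodes its candidate string from a global index and tests it on the device, so only the handful of hits ever crosses the PCIe bus back to the host.

    #include <cuda_runtime.h>
    #include <stdio.h>

    #define MAX_LEN  8
    #define MAX_HITS 1024

    // Stand-in predicate: a real cracker would compare a hash here.
    __device__ bool check_candidate(const char *s, int len) {
        return len == 4 && s[0] == 'a' && s[1] == 'b' && s[2] == 'c' && s[3] == 'd';
    }

    __global__ void gen_and_filter(unsigned long long start, unsigned long long count,
                                   char *hits, unsigned int *hit_count) {
        unsigned long long tid = (unsigned long long)blockIdx.x * blockDim.x + threadIdx.x;
        if (tid >= count) return;

        const char charset[] = "abcdefghijklmnopqrstuvwxyz";

        // Generate the candidate on the device from its index: nothing is uploaded.
        char cand[MAX_LEN];
        int len = 0;
        unsigned long long n = start + tid;
        do {
            cand[len++] = charset[n % 26];
            n /= 26;
        } while (n > 0 && len < MAX_LEN);

        // Filter on the device too: only matches are written out,
        // so the copy back to the host is tiny.
        if (check_candidate(cand, len)) {
            unsigned int slot = atomicAdd(hit_count, 1);
            if (slot < MAX_HITS)
                for (int i = 0; i < len; i++)
                    hits[slot * MAX_LEN + i] = cand[i];
        }
    }

    int main(void) {
        const unsigned long long COUNT = 1ULL << 24;  // ~16M candidates per batch
        char hits[MAX_HITS * MAX_LEN];
        unsigned int hit_count = 0;

        char *d_hits;
        unsigned int *d_hit_count;
        cudaMalloc(&d_hits, sizeof(hits));
        cudaMalloc(&d_hit_count, sizeof(unsigned int));
        cudaMemset(d_hits, 0, sizeof(hits));
        cudaMemset(d_hit_count, 0, sizeof(unsigned int));

        gen_and_filter<<<(unsigned int)((COUNT + 255) / 256), 256>>>(0, COUNT,
                                                                     d_hits, d_hit_count);

        // Only the hit count and the few matching strings cross the bus.
        cudaMemcpy(&hit_count, d_hit_count, sizeof(unsigned int), cudaMemcpyDeviceToHost);
        cudaMemcpy(hits, d_hits, sizeof(hits), cudaMemcpyDeviceToHost);
        printf("hits: %u\n", hit_count);
        if (hit_count > 0)
            printf("first hit: %.8s\n", hits);
        return 0;
    }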

#29
Originally posted by quaz0r View Post
OMG seriously it is time to honestly re-evaluate where we are at and where we are going with regard to computing, and implement a better overall design. This PCI expansion slot business is a throwback to the early days of computing where anything and everything might and did exist as an expansion card. This PCI 5000.0 with x3000 lanes and whatever is for one thing and one thing only: a GPU. Whether you are into compute or gaming or whatever, that is what it is for. Just make motherboards with a GPU socket already, and dispense with this expansion slot nonsense. Or whatever the answer is, surely there must be an answer more appropriate than a x3000 PCI slot that you plug this huge board + processor into and oftentimes have to plug in a bunch of extra power connectors and everything too. You are practically plugging a second motherboard+proc+ram+power into a friggin PCI slot. It is ridiculous.
What about systems that need multiple GPUs? GPUs aren't just used for games. What about having many USB controllers (and thus more ports)? With PCIe 3.0 there aren't enough lanes to support many USB 3.1 / USB-C controllers. The same goes if you want to add Thunderbolt or other bandwidth-intensive features. If anything, we need mobos that can support more x8/x16 slots and fewer x1/x4 slots.

#30
Originally posted by polarathene
What about systems that need multiple GPUs? GPUs aren't just used for games.
Indeed. I'm touched that you wish to inform me that coprocessors are not just gaming tools. If we must speak of my own personal interests, I haven't played a game since quake3. My personal area of interest right now is HPC. I'm sure you were truly dying to know that.

Originally posted by polarathene
If anything, we need mobos that can support more x8/x16 slots and fewer x1/x4 slots.
Indeed again. We've now come full circle back to the intent of my original comment: "There's probably a better way."
