USB 4.0 "USB4" Specification Published


  • tuxd3v
    replied
    Originally posted by ThoreauHD View Post
    So... what's changed between this and the USB 3.2 protocol? This shit's really confusing. ASRock X570 boards have Thunderbolt 3/USB 3.2 already. What does the 4 do?
    USB 3.2, in a nutshell, gives you two new SuperSpeed+ modes (if the data goes through a USB-C connector):
    • 10 Gb/s
    • 20 Gb/s
    USB 4.0, in a nutshell, gives you 40 Gb/s (doubling the bandwidth), again over a USB-C connector/cable.
    It's also compatible with Thunderbolt 3 and with older USB revisions back to USB 2.0.

    If you already have Thunderbolt 3 (compatible with USB 3.1, I believe?), you already have 40 Gb/s.
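    For reference, a rough Python sketch tabulating the nominal link rates mentioned above; the labels follow the usual USB-IF naming and the values are the commonly quoted raw per-direction rates, so treat this as an illustration rather than an authoritative list:

    ```python
    # Nominal link rates of the USB generations discussed above (illustrative only).
    USB_LINK_RATES_GBPS = {
        "USB 2.0 (Hi-Speed)": 0.48,
        "USB 3.2 Gen 1 (SuperSpeed, 5 Gb/s)": 5,
        "USB 3.2 Gen 2 (SuperSpeed+, 10 Gb/s)": 10,
        "USB 3.2 Gen 2x2 (SuperSpeed+, 20 Gb/s, USB-C only)": 20,
        "USB4 Gen 3x2 (40 Gb/s, USB-C only)": 40,
        "Thunderbolt 3 (40 Gb/s)": 40,
    }

    for name, rate in USB_LINK_RATES_GBPS.items():
        print(f"{name:<52} {rate:>5} Gb/s")
    ```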



  • torsionbar28
    replied
    Originally posted by pegasus View Post
    On the other side... in 2010 you only had 40 Gb/s in an HPC cluster; in 2020 you'll have it on your desktop.
    So it takes about a decade for tech to be ready for the everyday consumer.
    This is accurate. When I was building DEC Alpha supercomputers at DEC/Compaq in the late 1990s, those machines all had Gigabit Ethernet cards. They were large cards with a big heatsink on them. It was another decade before Gig-E would be a mass-market consumer technology.

    With NICs, power consumption and heat output are major factors. You can put a big heatsink in a server with screaming fans to cool it. Not so much in a consumer PC, or even worse, a laptop. Semiconductor lithography is a big part of this. As CPUs and GPUs move to smaller process nodes, older fabs are freed up to make things like NIC chips. So yes, it does take many years before the newest technologies can be economically shrunk down to run cool enough for installation in consumer equipment.



  • tuxd3v
    replied
    Originally posted by brent View Post
    So does this support PCIe or not? I think this was one major feature where it still wasn't clear which direction USB would take. I of course hope it does retain PCIe support from Thunderbolt! Everything else sounds great - especially the flexible bandwidth allocation.
    USB 3.1 at least is very close, even in the frequencies used.
    You can see it in the PCIe risers that use a USB 3.1 connector/cable, 1 meter max (due to wavelength, line impedance, etc.).

    What we see done cheaply is sending a single PCIe lane (x1) over a USB 3.1 cable, but I believe no USB protocol is involved; only the physical link (connector/cable) is used, and the protocol is PCIe.

    I believe what you want is some kind of PCIe 'pass-through' over USB, or PCIe encapsulated in USB packets?
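    As an aside, on a Linux box with the kernel's thunderbolt driver loaded you can see which Thunderbolt/USB4 devices have been enumerated (these are the links over which PCIe gets tunneled). A minimal Python sketch, assuming the /sys/bus/thunderbolt hierarchy and attribute names like vendor_name/device_name/authorized are exposed, which may not hold on every kernel:

    ```python
    # List Thunderbolt/USB4 devices enumerated by the Linux 'thunderbolt' driver.
    # Assumes /sys/bus/thunderbolt exists; attribute names may vary by kernel.
    from pathlib import Path

    def read_attr(dev: Path, name: str) -> str:
        try:
            return (dev / name).read_text().strip()
        except OSError:
            return "?"  # attribute missing (e.g. on domain entries)

    devices = sorted(Path("/sys/bus/thunderbolt/devices").glob("*"))
    if not devices:
        print("no Thunderbolt/USB4 devices found (or driver not loaded)")
    for dev in devices:
        print(dev.name,
              read_attr(dev, "vendor_name"),
              read_attr(dev, "device_name"),
              "authorized=" + read_attr(dev, "authorized"))
    ```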



  • anth
    replied
    The spec can be downloaded from https://www.usb.org/document-library...-specification



  • davidbepo
    replied
    Originally posted by AndyChow View Post

    I second this. I don't want random hardware that has direct access to PCIe lanes. I'm pretty sure they showed that external Thunderbolt "hard drives" can take over your computer, aka "Thunderclap".

    I can also imagine a USB Killer-type device that now fries your CPU directly.
    Welp, I guess that's a good reason.



  • ThoreauHD
    replied
    So... what's changed between this and the USB 3.2 protocol? This shit's really confusing. ASRock X570 boards have Thunderbolt 3/USB 3.2 already. What does the 4 do?



  • pegasus
    replied
    On the other side... in 2010 you only had 40 Gb/s in an HPC cluster; in 2020 you'll have it on your desktop.

    So it takes about a decade for tech to be ready for the everyday consumer.



  • AndyChow
    replied
    Originally posted by carewolf View Post

    Security issues.
    I second this. I don't want random hardware that has direct access to PCIe lanes. I'm pretty sure they showed that external Thunderbolt "hard drives" can take over your computer, aka "Thunderclap".

    I can also imagine a USB Killer-type device that now fries your CPU directly.
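    For what it's worth, on Linux you can check what the DMA exposure actually looks like. A minimal Python sketch, assuming the kernel's thunderbolt sysfs 'security' attribute and the /sys/class/iommu directory are present; both are assumptions that depend on the kernel and platform:

    ```python
    # Inspect the Thunderbolt security level and whether an IOMMU is active.
    # 'none' means devices get PCIe tunnels without user approval; 'user'/'secure'
    # require authorization first. An active IOMMU limits what DMA a hostile
    # peripheral can actually do. Paths are assumptions about the Linux sysfs layout.
    from pathlib import Path

    def show(path: Path) -> None:
        print(f"{path}: {path.read_text().strip() if path.exists() else '(not present)'}")

    for domain in sorted(Path("/sys/bus/thunderbolt/devices").glob("domain*")):
        show(domain / "security")

    iommu_dir = Path("/sys/class/iommu")
    iommus = list(iommu_dir.glob("*")) if iommu_dir.exists() else []
    print("IOMMU units visible:", len(iommus))
    ```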



  • carewolf
    replied
    Originally posted by davidbepo View Post
    but why?
    Security issues.



  • davidbepo
    replied
    Originally posted by carewolf View Post

    I hope it doesn't. That's a misfeature.
    but why?

