USB 3.2 Specification Published


  • #31
    If I could network computers with this, it would be useful to me.



    • #32
      So many different connectors, different cables, different standards, alternate modes, optional features: it seems USB has lost the "universal" part of its name. If it's running two channels in parallel to double bandwidth, it has lost the "serial" part as well.



      • #33
        Originally posted by torsionbar28 View Post
        If it's running two channels in parallel to double bandwidth, it has lost the "serial" part as well.
        It is still serial, because the communication over each lane is serial. PCIe, for example, is also serial and has supported multiple lanes since its introduction.



        • #34
          Originally posted by torsionbar28 View Post
          So many different connectors, different cables, different standards, alternate modes, optional features: it seems USB has lost the "universal" part of its name. If it's running two channels in parallel to double bandwidth, it has lost the "serial" part as well.
          All true, but it is still the same 4-wire design. And no matter what you initially intended to create, there will always be a manufacturer who will seek an advantage by introducing their own connectors, cables and modes. Sure, it can create headaches for some, but this isn't something you can control, because technology only keeps advancing. If you stop there because you fear it confuses people, then you're just being foolish. You cannot stop it; you will only get overrun by it. The moment USB stops progressing is also the moment it becomes obsolete, because others will jump in, use the advances in technology to create something better, and they won't hesitate to make it incompatible. You then decide which is the lesser evil.

          If you could invent a zero-wire connection with infinite transfer speed, someone would ask whether this speed is countably or uncountably infinite, and then someone would invent something better, for the sole purpose of connecting an infinite number of infinite data stores together and still having instantaneous transfers.

          And having a standardized 4-wire design does increase your chance of survival in a post-apocalyptic, zombie-infested world, where you really only need to strip off the insulation and twist 4 wires together. It may not give you SuperSpeed, but you should be able to listen to your favourite songs while chopping off heads.



          • #35
            Originally posted by starshipeleven View Post
            As it is, I can get anything from USB 2.0 (yes, I've seen that) up to USB 3.2 behind a Type-C connector. This is completely retarded; there is no way to tell what the hell that port is without specs or testing.
            But I think that's the whole point of USB: physical compatibility in exchange for the user reading the device's manual to find out its capabilities. I mean, the problem you're describing has existed ever since USB 2.0. You could plug something into a USB 2.0 port, but you couldn't tell whether you'd get 2.0 or 1.1 transfer speeds.

            And yes, I do realize that while the above is not that painful when it comes to a single device, it's a whole other matter when you're looking at a motherboard trying to figure out 6-12 ports.
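
            (Side note, and purely a sketch of my own: on Linux at least you don't have to guess what a port actually negotiated, because the kernel exposes the link speed in sysfs. A few lines of Python, assuming a Linux box with sysfs mounted in the usual place; the device names will of course differ per machine.)

            import glob, os

            # List every USB device the kernel currently sees and the speed it
            # actually negotiated, in Mbps (1.5, 12, 480, 5000, 10000, ...).
            for dev in sorted(glob.glob("/sys/bus/usb/devices/*")):
                speed_file = os.path.join(dev, "speed")
                if not os.path.isfile(speed_file):
                    continue  # interface/endpoint entries have no 'speed' attribute
                product_file = os.path.join(dev, "product")
                product = (open(product_file).read().strip()
                           if os.path.isfile(product_file) else "?")
                speed = open(speed_file).read().strip()
                print(f"{os.path.basename(dev)}: {product} @ {speed} Mbps")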



            • #36
              Originally posted by dwagner View Post
              I wonder why they still call it "Universal Serial Bus 3.2" if it isn't really serial anymore (but using two parallel lanes).
              Serial is when you're pushing the data bit after bit, one after the other, i.e. to send an 8-bit byte, you send 8 pulses (or more if error correction is involved) over a wire. The ur-example is the RS-232 serial port on a PC. It has the advantage of being electrically simple (fewer wires) but the disadvantage of requiring circuitry (you need a chip to buffer and send the bits) and being slower (bit by bit).

              Parallel is when the bits of the data all travel together at the same time, each on a separate wire. The ur-example is the IEEE-1284 printer parallel port. It has 8 wires that carry the 8 bits of the characters that need to be printed. It has the advantage of requiring less circuitry (the data byte sits directly on the lines) and being much faster (a whole byte at a time). (E.g. this made it possible to produce digital sound output using nothing but a resistor ladder, not a single chip required.)
              The biggest disadvantage is in the electrical/signal design: you have to make sure that all the pulses reach the destination at exactly the same time (just look at the weird wavy labyrinth of memory traces on motherboards, there purely to make sure the DIMM's 64 bits all arrive at the same time).

              What USB 3, PCIe, SATA Express, HDMI/DVI, DisplayPort, etc. are doing is NOT parallel:
              they are not sending each bit of a byte down its own separate wire.
              Instead, the data is still travelling down the wires serially, but these connectors have several serial lanes, meaning they can send several data packets at the same time.

              I.e. they are not doing what an IEEE-1284 parallel port is doing.
              They are doing the same kind of thing as two Gigabit Ethernet ports being bonded together using something like IEEE 802.3ad link aggregation.

              This keeps the advantage of serial (electrically simpler, as you only use one pair per channel; no need to make sure all the bits arrive at the same time) while giving the advantage of parallel (multiple channels for higher speed) without its disadvantages.
              This comes at the cost of more complexity in the logic (you now need to dispatch packets over several different channels, e.g. in a round-robin fashion, and the receiving end needs to re-order them accordingly), but the modern chips used in USB 3/DP/Thunderbolt devices have more than enough processing capability to handle this transparently.
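
              A toy sketch of that dispatch/re-order idea, in plain Python (this has nothing to do with the actual USB 3.2 link layer, which does its striping in silicon with its own framing and flow control; it only shows the principle):

              from itertools import cycle

              def stripe(packets, num_lanes=2):
                  """Transmit side: tag each packet with a sequence number and
                  deal the packets out round-robin across the lanes."""
                  lanes = [[] for _ in range(num_lanes)]
                  lane_ids = cycle(range(num_lanes))
                  for seq, payload in enumerate(packets):
                      lanes[next(lane_ids)].append((seq, payload))
                  return lanes

              def merge(lanes):
                  """Receive side: collect from all lanes and restore the
                  original order using the sequence numbers."""
                  received = [pkt for lane in lanes for pkt in lane]
                  return [payload for _, payload in sorted(received)]

              packets = [f"pkt{i}" for i in range(8)]
              lanes = stripe(packets, num_lanes=2)  # each lane is still a plain serial stream
              assert merge(lanes) == packets        # the receiver sees the original order

              Each lane on its own is still strictly serial; the speed-up comes purely from having two of them, which is exactly the Ethernet bonding analogy above.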

              Originally posted by sdack View Post
              People have always been plugging things in and then wondering why it's not working. It's not possible to make this absolutely fail-safe, because you just cannot eliminate the physical world.
              Like putting a USB flash drive into a PS/2-to-USB keyboard pin converter and then into a PS/2 port. (I've actually *seen* this.)
              Bonus points if the PS/2 port is actually a mouse/keyboard adapter on some legacy gaming console (like a PS/2-to-Maple-Bus adapter for a SEGA Dreamcast).
              (Remember folks: just because it fits doesn't mean that it will work.)

              Originally posted by patstew View Post
              The fact is that a flash drive is never going to make a video signal when you plug it into a TV, a Raspberry Pi is never going to provide 100W to a downstream device, and a keyboard isn't going to stick 20Gbps transceivers on its circuit board. No amount of interface standardisation is going to fix that, it's just the nature of the devices in question.
              No, but protocol standardisation and signaling are going to help a lot.
              (Apple is doing this already, with Lightning and Thunderbolt cables all requiring a chip in the connector.)
              (USB-C connectors are a step in the right direction, because the legacy USB data channels are still available alongside the alternate modes.)
              I.e. some standard in the future (USB 4.0? USB-D? USB-C 3.2 Gen 4?) could require that, no matter what the alternate mode or whatever the fuck, every single end-point on this kind of link MUST ALWAYS support USB OTG and MUST ALWAYS advertise itself over that USB link.

              Thus the TV will be able to detect that it is not plugged into a DisplayPort source (this is technically already achievable with USB-C ports).
              The Raspberry Pi will politely explain to the laptop that it cannot provide 100W (and in fact the USB Power Delivery (USB-PD) protocol already handles exactly this kind of negotiation).
              And the keyboard doesn't give a fuck (just as today), because it still talks the legacy USB protocol.

              Thus, instead of users not understanding why something doesn't work, they can get a nice informative message.
              (Just think of the kind of charger warnings you get on a modern-day smartphone or tablet, but extended to every weird permutation you could find on USB-C connectors.)
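
              Something along these lines, as a pure illustration (the function and the messages below are made up; real USB-PD negotiates power data objects over the CC wire, not Python dictionaries):

              # Hypothetical sketch: the sink asks for power, the source answers with what
              # it can actually deliver, and the host turns any mismatch into a message
              # for the user instead of silently failing. Not the real USB-PD protocol.

              def negotiate_power(requested_watts, source_capabilities):
                  """Return (granted_watts, message) for a toy power negotiation."""
                  best = max(source_capabilities)      # e.g. a Raspberry Pi port: [2.5, 5.0]
                  if best >= requested_watts:
                      return requested_watts, "Charging at full speed."
                  return best, (f"This port can only supply {best} W, but the device "
                                f"asked for {requested_watts} W. Expect slow or no charging.")

              granted, msg = negotiate_power(100, source_capabilities=[2.5, 5.0])
              print(msg)   # the "nice informative message" instead of a dead screen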

              I.e. this will alleviate starshipeleven's plight, as he won't need to explain to users why they are trying to do stupid shit (on a par with the USB flash drive in a PS/2 port); the laptop will be able to do it on its own.

              I presume Apple will be the first to try to make an idiot-proof, user-friendly warning (although a pretty useless one, given their tendency to oversimplify).
              And at some point in the future, some Linux framework like Solid will do the same.



              • #37
                Originally posted by Zan Lynx View Post

                We don't need any USB 3.0 or 3.1 ports. Make ALL the ports USB 3.2, plus a few USB 2.0 ports for keyboards and headphones. That will eliminate user confusion.

                I'd say remove all the USB 2.0 ports, but then people will waste high-bandwidth ports on a 1.5 Mbps device. Although I do like it on laptops when you just have 2-4 of the best ports and don't have to worry about which ones are which.

                Also, I am not surprised. Alternate modes are already running 40 Gbps over a USB-C port, so obviously the wiring is more than capable of 20 Gbps. All they had to do for this was update the driver chips.
                If we get 6 USB 3.2 ports instead of 4 USB 3.2 + 2 USB 2.0 ports, how is that a waste? Sure, you can plug a mouse or a keyboard into a couple of those ports and not use all the bandwidth, but someone else might actually need those ports/bandwidth...



                • #38
                  Originally posted by sdack View Post
                  Sure, it can create headaches for some, but this isn't something you can control, because technology only keeps advancing. If you stop there because you fear it confuses people, then you're just being foolish. You cannot stop it; you will only get overrun by it. The moment USB stops progressing is also the moment it becomes obsolete
                  Confusing "change" with "progress" is a common mistake. Different != better. Sometimes you end up with a broken hope-n-change mess that nobody has a solution for. Obamacare, for example.



                  • #39
                    Originally posted by torsionbar28 View Post
                    Confusing "change" with "progress" is a common mistake. Different != better. Sometimes you end up with a broken hope-n-change mess that nobody has a solution for. Obamacare, for example.
                    I'm not confusing anything here. It's merely your opinion that it's not getting better.



                    • #40
                      Originally posted by nomadewolf View Post

                      If we get 6 USB 3.2 ports instead of 4 USB 3.2 + 2 USB 2.0 ports, how is that a waste? Sure, you can plug a mouse or a keyboard into a couple of those ports and not use all the bandwidth, but someone else might actually need those ports/bandwidth...
                      On a laptop I'd rather have them all be USB 3.2. On a desktop it is often the case that the motherboard can support 4 USB 3.x ports; any more means adding another semi-expensive interface chip and using more PCIe lanes. The motherboard can often wire out a bunch of USB 2.0 ports just because those are already in the chipset. They're basically free ports!
