USB 3.2 Specification Published


  • #51
    Originally posted by DrYak View Post
    Well, actually it does change.
    No, it doesn't. I'm still only talking about the ability to connect. I explicitly said that you can get more speed, so I don't know why you'd object when I'm talking about the basic ability to connect.

    What matters most is getting a connection and getting two devices talking to each other. That is what is important and what allows the devices to communicate. You may not get the maximum speed out of the devices, but many people are happy just getting a connection, any connection.

    There will always be new devices that are faster and better, but simply being able to connect to them is what people want most. Sure, they want the best in the end, it's what we all want, but most of them aren't stupid.

    People certainly know they have to upgrade their hardware at each iteration to keep up with ever-increasing speeds and sizes. However, most don't want to, because they don't see that much value in upgrading to each new generation. They are content with slower speeds, knowing that when they do upgrade it won't make their connected devices obsolete: they can keep using them, and may even get more speed out of them.

    Comment


    • #52
      Originally posted by DrYak View Post
      Originally posted by DrYak View Post
      So, having no idea, you try to plug your phone into your tablet with a USB-C-to-USB-C cable.
      But we're slowly getting to the point where both devices will detect the situation and issue a warning that charging is impossible.
      But that's the thing: by that point it's too late. I would have bought a USB-C-to-USB-C cable, maybe even a new phone with a USB-C port... just to find out that it doesn't work. And then I still wouldn't be sure whether it doesn't work because the cable isn't capable of it, or the interface on one of the two ends (or both?).

      Comment


      • #53
        Originally posted by starshipeleven View Post
        Yes but with caveats. With 3.0 ports they usually have either a SS symbol near the port or the connector is blue.
        And up until 3.0, the only information you needed to know what a port could do was the USB version.


        Theoretically, yes. In practice, no, as USB 2.0 host ports displaced 1.1 relatively fast and you don't find 1.1 host ports anymore.

        USB 2.0 was nearly an in-place upgrade (still 4-wire) for any USB design, and since USB controllers were integrated in the chipset very early on, they were also "free" for board designers.

        Well, the issue also exists on mobile devices with a single port; the lack of documentation doesn't change.
        Now that a Type-C port can also carry Thunderbolt or DisplayPort, you might want a laptop with such ports, but since spec sheets don't usually tell you the device supports that functionality (they just state USB 3.1 or Type-C)... how do you find out without first buying the (usually expensive) device?
        Hey, I did not say everything was peachy.
        The more revisions, the more caveats. Until at some point we'll need a new, universal (pun intended) standard.

        Comment


        • #54
          Originally posted by GreatEmerald View Post
          And then I still wouldn't be sure whether it doesn't work because the cable is not capable of it, or the interface on one of the two ends (or both?).
          - The devices themselves are supposed to know what they are capable of. Meaning that part of your question (or the interface on one of the two ends) is eventually solvable in software, in the user interface.


          - Regarding the cables, their connections and their expected electrical properties are pretty well defined.
          USB-C-to-USB-C cables are expected to have ALL their pins wired, and when using AWG 26 wire for the data lines, can't exceed 1.3 meters.
          (And in the case of Apple's Lightning and Thunderbolt cables, they even have chips embedded that list their properties.)

          If you got a cable with only the USB 2 contacts wired, or with inappropriate wire gauges, you got ripped off.
          You should RMA the cable (or stop buying $0.99 shit off eBay).

          Probably, over the long term, the situation might evolve to the point that real brands will put some home-grown certifications on cables (like currently with HDMI cables' "guaranteed to work with HDMI 1.4").
          Currently, if "SuperSpeed" or "SuperSpeed+" is written on the cable, there are expected results (10 and 20 Gbit/s respectively on devices that support USB 3.2); if the cable doesn't reach that speed, it isn't following its required specification and should be RMA'd.
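          As an illustration of that "check against the marking" idea, here's a small sketch. The helper and its rating table are my own, not a real tool; the speed values mirror what Linux exposes per device in /sys/bus/usb/devices/*/speed (in Mbit/s), and the ratings are per lane, so a USB 3.2 dual-lane link (Gen 1x2 / Gen 2x2) would double them.

          ```python
          # Hypothetical helper: does the speed a link actually negotiated
          # reach what the cable's printed marking promises (per lane)?
          CABLE_RATINGS_MBPS = {
              "Hi-Speed": 480,        # USB 2.0
              "SuperSpeed": 5_000,    # USB 3.2 Gen 1 per lane (x2 lanes doubles it)
              "SuperSpeed+": 10_000,  # USB 3.2 Gen 2 per lane (x2 lanes doubles it)
          }

          def meets_rating(negotiated_mbps: int, marking: str) -> bool:
              """True if the negotiated link speed reaches the cable's advertised rate."""
              return negotiated_mbps >= CABLE_RATINGS_MBPS[marking]

          # A link that only trains at 480 Mbit/s through a "SuperSpeed" cable
          # suggests the SuperSpeed pairs are missing or bad -- grounds for an RMA.
          print(meets_rating(480, "SuperSpeed"))    # False
          print(meets_rating(5_000, "SuperSpeed"))  # True
          ```

          A link stuck at USB 2.0 speed through a cable marked "SuperSpeed" is exactly the rip-off case described above.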

          Comment


          • #55
            Originally posted by DrYak View Post
            Probably, over the long term, the situation might evolve to the point that real brands will put some home-grown certifications on cables (like currently with HDMI cables' "guaranteed to work with HDMI 1.4").
            HDMI cables have annoyed me for years that way. On many cables, all the technical specs say is "Guaranteed HDMI 1.4. Supports 4K", etc., when what I really want is "HDMI 1.4. Guaranteed 18 Gbps." Because "Supports 4K" is meaningless: all it has to manage is 4K resolution at 24 FPS. If what you really want is 4K at 60 Hz with 10-bit color, you need a lot more than "Supports 4K".

            When I was buying a cable to connect a 4K 60 Hz TV to my computer, the only one I could find with an actual bandwidth number was a Monster cable rated at 22 Gbps. So I probably overpaid, but I was sick of trying cables that weren't up to the job.
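            A back-of-envelope calculation shows why "Supports 4K" tells you so little. Assuming the standard CTA-861 pixel clocks (297 MHz for 4K at 24 Hz, 594 MHz for 4K at 60 Hz 8-bit) and HDMI's TMDS signaling (10-bit characters on 3 data lanes), the on-wire rates differ by a factor of two:

            ```python
            # Rough on-wire TMDS rate: pixel clock x 10-bit TMDS characters x 3 data lanes.
            def hdmi_wire_gbps(pixel_clock_mhz: float) -> float:
                """On-wire data rate in Gbit/s for a given TMDS pixel clock."""
                return pixel_clock_mhz * 10 * 3 / 1000

            print(hdmi_wire_gbps(297.0))  # 8.91  -> 4K @ 24 Hz fits HDMI 1.4's 10.2 Gbps
            print(hdmi_wire_gbps(594.0))  # 17.82 -> 4K @ 60 Hz needs a full 18 Gbps cable
            ```

            So a cable that "supports 4K" at 24 FPS can carry barely half the bandwidth that 4K at 60 Hz requires, which is why a guaranteed Gbps figure is the number that actually matters.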

            Comment


            • #56
              Originally posted by DrYak View Post
              - The devices themselves are supposed to know what they are capable of. Meaning that part of your question (or the interface on one of the two ends) is eventually solvable in software, in the user interface.
              But only after you connect them, I guess? Unless you can write a "USB-Z" app to get those details in order to figure it out yourself.

              Comment


              • #57
                Originally posted by starshipeleven View Post
                Already said many times over.
                I mean that you can find an actual USB 3.0 port wired as a Type-C connector that still runs as USB 3.0 but will look "new and fast" to the average consumer.

                Then you can have a Type-C port that also supports DisplayPort (or other display modes), or Thunderbolt, or both, or neither. How can you know? You don't, as hardware specs rarely mention it directly or at all (most just state Type-C or USB 3.1); you try and see.

                And now this too. USB 3.2 does not change the connector, so the only way to see what the fuck that connector uses is looking up the device's spec sheet.

                And changing power delivery without changing the connector is another dumb choice.

                Cables, goddamn the cables. The advanced versions of Type-C need special (more expensive) cables with the same identical plug, so you may have 2 working devices and the wrong cable (which still looks mostly the same as the others), and that shit ain't working.

                I work in IT support, I know my clients. When this goes live (for now the devices with such ports are few and very expensive) it will be a Windows8-grade clusterfuck with confused and angry consumers.

                Was it so friggin hard to make different connectors for the ports that supported the alternate modes (displayport/thunderbolt)?

                USB has been so great because it was simple and easy to understand/use.
                USB has always been a trainwreck for user support... I'd bet none of my users at work know the difference. We only just got far enough down the road in the last 5 years that "black" ports were USB 2 and "blue" ones were USB 3. Except for Apple and a few other companies that can't be bothered with unfashionable color-coding.

                They should have done a clean break at 3.1. Ditch the "A" connector, jump to high-speed 10 Gb, and make whatever voltage changes too. Just have users get an adapter for legacy support, and have vendors keep the old "A" ports around however long they want.

                Comment


                • #58
                  Originally posted by starshipeleven View Post
                  Already said many times over.
                  I mean that you can find an actual USB 3.0 port wired as a Type-C connector that still runs as USB 3.0 but will look "new and fast" to the average consumer.

                  Then you can have a Type-C port that also supports DisplayPort (or other display modes), or Thunderbolt, or both, or neither. How can you know? You don't, as hardware specs rarely mention it directly or at all (most just state Type-C or USB 3.1); you try and see.

                  And now this too. USB 3.2 does not change the connector, so the only way to see what the fuck that connector uses is looking up the device's spec sheet.

                  And changing power delivery without changing the connector is another dumb choice.

                  Cables, goddamn the cables. The advanced versions of Type-C need special (more expensive) cables with the same identical plug, so you may have 2 working devices and the wrong cable (which still looks mostly the same as the others), and that shit ain't working.

                  I work in IT support, I know my clients. When this goes live (for now the devices with such ports are few and very expensive) it will be a Windows8-grade clusterfuck with confused and angry consumers.

                  Was it so friggin hard to make different connectors for the ports that supported the alternate modes (displayport/thunderbolt)?

                  USB has been so great because it was simple and easy to understand/use.
                  Well, if you use the same plug then it's backwards compatible, so even if a port doesn't support all those features, it's still easier than making a new port every time they release a new version.
                  I think it's good that they use the same port.
                  More modern devices will support later standards anyway, so the problem is mostly with old devices.
                  Much better than having 10 different ports.

                  Comment


                  • #59
                    Originally posted by Zan Lynx View Post

                    On a laptop I'd rather have them all be USB 3.2. On a desktop it is often the case that the motherboard can support 4 USB 3.x ports. Any more ports means adding another semi-expensive interface chip and using more PCIe lanes. The motherboard can often wire out a bunch of USB 2 ports just because those are already in the chipset. They're basically free ports!
                    Bottom line, for a given configuration: give us as many ports as you can, and as fast as they can be...

                    Comment


                    • #60
                      Originally posted by sdack View Post



                      ... He needs a bigger one.
                      But it isn't USB Type-C! Are there going to be USB 3.2 versions?

                      Comment
