> Making USB reversible to begin with would have necessitated twice as many wires and twice as many circuits, and would have doubled the cost.
This is... not correct.
You can make USB reversible with 1 extra pin and 1 extra wire. Grounds on pins 1 and 5, data on pins 2 and 4, and VCC on pin 3. Then have those pins on both sides of the plug and a socket with a single set of contacts on one side.
That's BASICALLY what Apple did with lightning.
Then you implement auto crossover detection (edit: Gah! You don't even have to do that, just flip the flipping wires), which had been around for years and is dirt cheap, in the hub. It would have been like six more transistors in the hub IC.
edit: I completely forgot that reversible USB 2.0 plugs already exist and use a simpler (and cheaper) method. They just tend not to be so reliable because of the thinner materials and the fact that they're not spec-compliant so they tend to be grey market jobs made for the lowest price possible.
There are also reversible USB plugs that are a single "tongue", inheriting that design from USB drives that don't have any plug to speak of but look more like a card-edge connector:
> Incidentally, the late Steve Jobs deserves a tip-of-the-hat for helping USB become a long-lasting standard. It was in 1998 that USB made some real headway, courtesy of the iMac G3, the first computer to ship with only USB ports for external devices (there were no serial or parallel ports). That came three years after USB 1.0 debuted, with a data rate of 12Mbps.
Also PCgamer:
> Given that it was a protocol designed to counter the typically Apple expense of sticking Firewire onto a motherboard, cost was the biggest reason for the eventual success of USB as a thing
Apple started building FireWire 400 into Macs when 12Mbps USB 1.0 was not really an equivalent. USB 2.0 didn't make the cut for the first generation iPod but was supported on subsequent models.
I remember presenting this in a project as an undergraduate and I was challenged by the lecturer that it would be highly unusual for a consumer device to ship without RS232 and I could only say “yeah”
> I remember presenting this in a project as an undergraduate and I was challenged by the lecturer that it would be highly unusual for a consumer device to ship without RS232 and I could only say “yeah”
It's kinda funny that this makes complete sense to me and I wouldn't have argued it either, but at the same time despite having been using computers since the late '80s the only thing I've ever had that required the use of a serial port for normal operation was my 2001-era Palm m100.
PS/2 keyboards/mice, parallel printers, and gameport controllers I had many of each, but all my modems were internal so the first time I ever plugged something in to a serial port was a null-modem cable in late '99 so my friend and I could use Windows 98 Direct Cable Connection to play Counter-Strike against each other without having to buy ethernet cards.
Aside from that one Palm device, everything I've ever plugged into an actual serial port has been a temporary thing like that null-modem connection. Mostly serial console for last-resort access to misbehaving network appliances. Even then, most devices that have had the slightest bit of effort put into them in the last 20 years have at least built in a USB-serial adapter these days, so we don't have to deal with the battle of devices that want full 12V RS232 and USB-DE9 adapters that deliver somewhere between 5 and 8 volts.
The problem with USB-A isn't that it's non-reversible, it's that it's non-reversible _and_ rectangular, so it's not clear at a glance which way round it should go.
All they had to do was make the connector have a non-symmetrical shape so that it's immediately obvious which way round it goes when you pick it up - you could do it without even looking. Think of how much time we'd have collectively saved with this minor design change.
> All they had to do was make the connector have a non-symmetrical shape so that it's immediately obvious which way round it goes when you pick it up - you could do it without even looking. Think of how much time we'd have collectively saved with this minor design change.
I disagree. When connecting an HDMI cable I sometimes have issues, especially at a weird angle.
However, I do concede it is faster to connect an HDMI cable than a USB one.
Yeah, I want the connector that can be attached by only moving the TV a few inches, just enough to get my hand to fit and feel around, not something I have to rearrange the furniture for and break out a headlamp to see which direction the cable is oriented. With BNC cables, I could do it with my eyes closed.
Right, the only thing worse than USB was the PS/2 port. What sort of madness drives someone to invent a round connector that is keyed?
Hyperbole aside, there are some very good reasons to go with round connectors: they can be very strong, and you can put a big nut on them to get an extremely secure lock. However, the PS/2 connector is neither of these things; it would require a much more robust key to work well.
The worst connector I ever encountered was on an old ATI capture card: it was the same barrel as a PS/2 connector but had about 15 pins. Everything had to be perfect for it to insert; with a tiny scratch or a slightly bent pin, it was not going anywhere.
The PS/2 connector was just a mini version of DIN connectors from the mid 1900s. They were sometimes used for printers and floppy drives in place of wider parallel connectors. In general, once you knew where ‘up’ was you could figure out the orientation in a way that is still more intuitive than USB. XLR cables are similar and still widely used, but have more external cues as to their orientation.
I would line up the barrel and then rotate while pushing in until it plugged in. I don’t remember bending any pins. The mark on the top helped until the tower form factor turned the motherboard sideways. Still do this for XLR and MIDI but they’re more robust.
> Right, the only thing worse than USB was the PS/2 port. What sort of madness drives someone to invent a round connector that is keyed?
The Apple desktop bus (ADB) that was used for keyboards and mice in Mac-land was also round. They improved it by using asymmetrical plugs, which marginally helped.
I’ve seen that claim but AFAICT it comes from a single source, which was a presentation by someone involved in the FireWire connector design. So at this point it’s more of an urban legend, until we have some solid confirmation.
It does not really matter though, they are very similar in construction, and are very sturdy, and I really like them both :)
I've seen the same claim made. Apparently it was the sturdiness that led to it being copied. Good enough for kids beating on it and stepping on the ends, more than good enough for professional applications.
No, they both used the same physical form factor, but it seems to have just been a model from a OEM used by both nintendo and apple around that time, rather than one spec deliberately copying the connector from the other.
It doesn't help if you're trying to do it totally blind, but the two open square holes go on top; once I figured that out I've never had any issues plugging them in correctly. The only one I have issues with is my keyboard, because the PCB in the plug is black and it's a little hard to tell which side has the open holes and which side is the black PCB.
With vertically mounted motherboards it's hard to judge what "up" is. Usually up is where the expansion boards are plugged in, but not all computer users know what a computer looks like inside.
Tbf I think the most prevalent issue with USB was from people plugging them in where they cannot see the port properly, i.e. the back of a computer tower (most often) or the back of a TV/console, etc.
Not really sure that that's USB's fault so much as a lack of front USB ports. I think in those cases the consumer can choose to get some sort of hub that's easier to access/can be pulled out from behind their setup.
Now that USB-C exists though, idk if we really need to go much further. A connector much smaller than it is now would become fragile.
I've always thought they could add a single fiber core to USB cables for some "ultra high speed" standard, since USB4 is predicted to top out at 80Gbps, but we could get 50Tbps or so over a single fiber if we really needed to.
But after owning several fiber cables, for networking and HDMI fiber based cables...yeah they're far, far too weak to replace copper. Just look at 'em wrong and they snap/break.
Plus tbf if we get 80Gbps from USB4...the problem then becomes that we don't reaaaally have a use case for that speed. Maybe network adaptors, but beyond that...I mean a modern SSD will saturate that with a sequential read I suppose, but still.
I guess it would be easier with the larger size of the USB-A connector, but I still struggle to plug in micro-USB cables sometimes, as it's still difficult to figure out its orientation at a glance.
You don’t always want the shield coupled to the signal ground. If it’s coupled on both sides then you can have all kinds of fun problems. Ideally, the shield should be connected to the ground on one end
> Ideally, the shield should be connected to the ground on one end
Depends on the signal. With low-frequency stuff (like audio) you want to ground the shield on one side to avoid ground loops, but with high-frequency stuff (USB 3, for example) you definitely want to ground the shield on both sides: if you don't, you end up creating an antenna which is really bad for your EMI performance.
USB 1.x happens to be at an in-between point where neither issue is a big deal. The connector spec literally says "use accepted industry practices", so either approach is fine if it manages to pass emissions testing.
That goes against all the guidance I’ve read, but I’m an amateur so you may be right. I’m confused on why it relates to signal speed though. For instance, I’ve read that this guidance also applies to Ethernet which is quite high frequency and based on differential pairs
> Question: “I have been told that you should only ground the Ethernet cable at one end. Is this true?”
>
> Answer: “Sometimes yes, but only if you are seeking to avoid a ground loop. You likely don’t have to worry about this. If you are running shielded Ethernet between two structures that have their own separate AC systems or separate grounding rods installed, then YES, you should worry about what is known as a ground loop. A ground loop occurs when you have actual conflicting AC ground systems. Merely bonding your Ethernet cable at multiple points to the same AC grounding system in a single structure does not create ground loops. Think in terms of ground rods. How many ground rods are involved? If you have more than one with more than one structure involved then you are wise to bond to ground at a single end only.”
Ethernet needs crossover because it has dedicated transmit/receive pairs, but USB 1.x/2.0 uses a single wire pair for both tx & rx. Ethernet crossover is pretty much identical to serial null modem cables, and those were around for a few decades by then.
USB "crossover" of the single differential wire pair would be an inversion of 0 and 1. Pretty trivial to do once you figure out a way to communicate crossover or not.
That shorts together a lot of the contacts as you're plugging it in.
There are of course various devices which somehow ended up using TRRS for USB, but unplugging/plugging those while powered is definitely not recommended:
Most of this is just cheap design on the part of the ergomech community. It's not that hard to add some protection. E.g. some keyboards use a Schottky diode on VCC for reverse polarity protection and ESD protection diodes to protect the data lines:
Of course the design requirements of USB would be different, since you'd routinely plug/yank the cable on a machine that is on, rather than just protecting against doing so accidentally. Given that, it probably made more sense to completely avoid the issue.
It's more an issue of ESD protection being kinda tricky on DIY boards due to the need to stick with THT parts. When you're building a $200-$300 keeb, a $0.05 diode doesn't really add all that much price-wise.
ESD protection diodes also don't really solve the problem. They are intended for short-duration high-voltage low-current spikes, not long-duration low-voltage high-current ones. Definitely better than nothing, though!
I'm not an expert but couldn't you just make the tip live and not connect the rest until that's connected? Or have a switch for when the plug is fully inserted. I see that it's a problem for sure, but it seems easier than the problems they solved to make USB-C reversible.
> I'm not an expert but couldn't you just make the tip live and not connect the rest until that's connected?
No. You have to deal with disconnects being possible at both the host side and the device side. Making the tip live means you're touching a live wire to each contact in turn while plugging in the device side.
They'd have to make it not fit in a normal TRRS hole or use some fancy power negotiation stuff to avoid blowing up headphones. It would be cool though. But perhaps easier to mess up than USB. Cheap headphone cables aren't great.
TRS/TRRS have very small contact patches since you're connecting to the tangent of a circle. That's not good for anything carrying more than a very small current. And even then, it can be an unreliable contact.
1: GND/sense
2: DATA +/-
3: VCC
4: DATA +/-
5: GND/sense
Have the host side connect only pin 1 to GND and leave pin 5 floating, but have the device side connect both 1&5 to gnd, with individual pullups for 1 & 5 to VCC.
The device will sense LO on 1 and HI on 5 if either both sides are plugged in straight or both sides are plugged in reversed, but it will sense HI on 1 and LO on 5 if exactly one of them is reversed. This allows the device to detect when inversion of D+/D- is needed.
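To make that concrete, here's a minimal firmware-style sketch of the sense logic described above. `read_sense_pin()` and `set_data_polarity()` are made-up helpers, and in practice this would just be a few gates in the device's interface silicon rather than code:

```c
#include <stdbool.h>

/* Hypothetical helpers standing in for the device's hardware:
 * read_sense_pin() returns true if the pin is pulled HI (far end floating),
 * false if it reads LO (grounded by the host side). */
extern bool read_sense_pin(int pin);
extern void set_data_polarity(bool swapped); /* swap D+/D- when needed */

void detect_inversion(void)
{
    bool pin1_hi = read_sense_pin(1);
    bool pin5_hi = read_sense_pin(5);

    if (!pin1_hi && pin5_hi) {
        /* LO on 1, HI on 5: both ends straight or both ends flipped.
         * The two flips cancel out, so D+/D- already line up. */
        set_data_polarity(false);
    } else if (pin1_hi && !pin5_hi) {
        /* HI on 1, LO on 5: exactly one end is flipped, so invert D+/D-. */
        set_data_polarity(true);
    }
    /* Anything else: nothing plugged in, or a bad cable. */
}
```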
Ethernet hardware now has auto-crossover. However, that hasn't always been the case. There was a good decade in the 90's where you needed a special cable wiring if you wanted to connect two computers vs a computer to a switch.
I once used a crossover cable with an old 3Com card and a Linksys switch - whatever it sent down the wires, the Ethernet port on that switch was fried and has never done Tx or Rx since. I bandaged it with electric tape so I don't connect something else, and also in hope it will someday heal.
The 3Com card was unscathed. In fact, it is probably feeling stronger after that exertion.
Now I know I'm old because I just assumed that's how it still was. I'd be like the weird old grandpa at a party talking about a choke or carburetor or a hand crank on a car while all the kids just stare blankly.
Easily going into the early 2000s. I remember trying to setup LAN parties in 2004/2005, where the wrong combination of patch or normal cable with hub/switch/PC made everything a nightmare.
I found a few patch cables mixed in with regular cables. One day, I got so tired of messing with them, I cut the cables and put them in the trash. I could have just trashed them, but that didn't feel right. I mean, what if someone were to go dumpster diving? The curse would then be on them.
Whenever I have a bum cable and confirm it’s the cable, it gets destroyed. Too many cables have failed to get into the trash or have somehow crawled back out for me to do anything else.
Uh? They're trivial to tell apart, just put the RJ45 jacks side by side and look at the colors of the wires going in?
Same sequence -> Straight ethernet cable
Same sequence except for 1 swapped pair -> Probably a crossover Ethernet cable
Inverted sequence -> ISDN cable, burn it.
Also (apart from the ISDN cables) - it hasn't mattered for close to quarter of a century, all switches and NICs are auto-MDI-X nowadays, why throw away the stuff.
I've certainly more than once outright thrown out items that fell just outside the return window, rather than handing them down to friends, because they were so crappy that I didn't want to ever operate them again in any context.
The one I remember the most was a Bluetooth speaker that yapped your ear off for a half-minute combined with an obscene amount of jingle beeping at 100% blast and more LEDs than a Christmas tree as a power-on/off procedure back when they were novel items.
Random anecdote: I worked at a company with an on-prem server room and the building got struck by lightning. It fried some servers and quite a few NICs on employee computers. Our CTO decided that since we'd be in the building doing hardware replacements anyway, now was the time to redo our wiring closet as well.
That was my introduction to wiring up CAT-5 cables and rj45 connectors and, with so many to make, I needed a way to remember the wiring diagram without having to read it over and over or use a visual reference. The mnemonic rhythm I used to remember the pattern was "orange/white, orange... green/white, blue... blue/white, green... brown/white, brown". Of course, said out loud, you couldn't tell if it was "orange/white, orange" or "orange, white/orange" and I kept confusing the rest of the team when I'd say it, but it worked for me.
This was circa 2005/2006 and I still say it like that. It just stuck.
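For anyone who wants the chant written down: it matches the standard T568B order (the stripe colors just get named the other way round). A trivial reference of pins 1 through 8, as a sketch:

```c
/* T568B conductor order, pin 1 through pin 8 -- the pattern the chant above encodes. */
static const char *t568b[8] = {
    "white/orange", "orange",
    "white/green",  "blue",
    "white/blue",   "green",
    "white/brown",  "brown",
};
```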
When was Auto MDI-X introduced? A quick google failed to get me an answer but I remember USB being available while I still needed a crossover cable to hook up 2 switches.
If it makes you feel any better... for some reason there are some currently in production business jets that still require crossover cables for maintenance... so they ARE still required sometimes.
You encounter this in older industrial equipment. Some of the old ethernet chipsets don't support automatically switching the pairs around. It's not really a problem since at least in our local area they need to be installed by licensed cablers who are used to dealing with legacy devices.
Many devices won't get updated until chipsets go EOL or there's some other compelling reason to do so.
Having said that, modern chipsets seem to all support it.
I've also dug further into my parts bin thinking "oh, that yellow cable would've been perfect, but I need a non-crossover cable"... could've used the yellow one instead of digging further. Ah well.
You have to consider that initially it was supposed to be a connector for stuff like mice, keyboards, gamepads, or printers. 12Mbit/s.
But yeah, that was only one of many misdesigns along the way. Like making turning ports on and off an optional feature, mini-USB (seriously, no other USB connector I've seen failed as often as mini-USB), just how overcomplicated USB-PD is, or the recent mess in USB-C cabling.
USB 1 was designed in times when transistors were significantly more expensive, and when laptops, phones, or any other situation where you disconnected peripherals were far less common. Still, it was an improvement over PS/2.
It's also not just a few transistors to flip the lines around; it's also the logic to drive them.
Y'all are talking as if you were in the vacuum tube era.
There were 42 million transistors in a Pentium 4, I'm sure they could have put a few thousand per USB port if they thunk optimistically instead of pessimistically.
When USB came out we already had some very crude 3D realtime rendering available to mass consumers, I'm sure we could have figured out how to flip power on a port if we put our minds to it.
Think "how can we make shit happen" instead of excuses of why it's hard.
It's by doing hard things that we got to the 80 billion transistors of modern GPUs.
> There were 42 million transistors in a Pentium 4
USB 1.0 was in '96.
That's Pentium 1 era. The Pentium 2 was released a year after. The Pentium 1 had about one tenth of that transistor count.
> When USB came out we already had some very crude 3D realtime rendering available to mass consumers, I'm sure we could have figured out how to flip power on a port if we put our minds to it.
Why can't your six-neuron brain imagine saving pennies on chips used in cheap computer peripherals?
How are you going to implement the rectifier, though?
The usual implementation is using diodes, but that means you lose twice the diode forward voltage to the rectifier. Even with a Schottky diode that means you are losing 0.5V-1.0V. That's a 10%-20% loss at 5V!
You could use a transistor-based "ideal diode" solution, but that's not going to be cheap - especially in the 90s.
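The arithmetic behind that 10%-20% figure, as a quick sketch (the per-diode forward drops are assumed Schottky values; a full bridge puts two diodes in the current path):

```c
#include <stdio.h>

int main(void)
{
    const double vbus = 5.0;          /* USB bus voltage */
    const double vf[] = {0.25, 0.5};  /* assumed Schottky forward drops, in volts */

    for (int i = 0; i < 2; i++) {
        double drop = 2.0 * vf[i];    /* two diodes conduct in a full bridge */
        printf("Vf = %.2f V -> lose %.1f V of %.1f V (%.0f%%)\n",
               vf[i], drop, vbus, 100.0 * drop / vbus);
    }
    return 0;                         /* prints roughly 0.5 V (10%) and 1.0 V (20%) */
}
```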
My take on USB-A: aside from audio connectors that have axial symmetry (and have no meaningful orientation), I never used any reversible computer cables prior to USB-C, and that’s not the main issue. Nor is it the axis of symmetry of the casing allowing failed upside down plugin attempts.
The shitty part of USB-A is that the tactile feedback of being slightly misaligned is identical to it being upside down. My experience tells me no amount of vigorous jiggling or extreme self-confidence will ever allow for consistent average 1.5 attempt plugins.
All the standards before USB were, in my memory, even less user-friendly: trying in vain to reach behind a heavy computer and unscrew the two jammed retaining bolts holding in a serial, parallel, SCSI, VGA, or DVI plug with slippery bent plastic-jacketed heads. Almost zero clearance from the plug case, that really did suck. Screw it in slightly looser next time, now you've got a flickering monitor, dummy.
And ps2/mini-din connectors sucked too - having the connector off-center or at the wrong rotation also felt quite indistinguishable.
This is somewhat related as it's the next step of the above.
The floating tongue of the USB connection is in the socket, not on the cable. As in, the most fragile part of the design is on the device side, not the cable side. This means that you can more easily break your $1000 device, not the $2 cable. They repeated this mistake with USB-C as well.
It's not hard to make a port where the cable is the free floating tongue and the device is a more robust socket that wraps around that fragile piece. I know everyone's happy about the iPhone moving to USB-C but the tongue on the cable side that it had was much better. Anyone who's tripped on a cable and broken a USB socket can attest to this.
I broke mine on an S21. It's also getting wobbly on 2 Essential Phones.
As OP said, now those 3 expensive devices are mostly broken or at risk instead of the cheap cable. It's bad design compared to the Lightning connector, and I don't like Apple...
It's not really "bad design", it's an intentional tradeoff.
Doing it the Lightning way makes it significantly more difficult to guarantee good signal integrity, especially once you get into USB 3 speeds and beyond. Not to mention exposing sensitive pins to all kinds of ESD shocks!
Basically had to buy 2 Lenovo X laptops due to the broken USB-C power port. There is a backup one, but for a business notebook you fly with, you really do not want a broken notebook.
I was surprised this mistake was made with USB-C. Surely a committee voted on this and knew what they were doing? Isn't Lightning the opposite way around?
Idk, Lightning has the opposite problem where the contacts are exposed and degrade very quickly on the cable and need to be replaced constantly. Great for Apple. And Lightning has the gripping pins on the phone, which degrade rather quickly too over repeated use. No cable is perfect, I fear.
Unless we go to something radically different then one side or the other is going to have the flimsy bits.
For my money, I’d take a flimsy cable over a flimsy device port. Replacing a cable is a two minute trip to Amazon. Replacing a device port is a pricey and annoying trip to a repair shop or an evening with my soldering iron cursing the universe.
FWIW, as a known klutz I haven't yet broken a USB-C socket; the worst I've done is gummed a socket up with pocket lint. The symmetry helps avoid a lot of issues.
I did break USB-A sockets though. They could easily be jammed in backwards, and with the off-center tongue and the cable perfectly shaped to wedge in the wrong way and snap the tongue, I did it far too often.
The best feature of those little thumb screws was that the computer-side mount could become unscrewed and fall off when you were trying to remove the cable. On some machines, it was held in with a tiny little nut on the inside of the case. The nut was just the right size to fall onto a motherboard and bridge PCB traces or exposed I/O pins.
Of course, there was no way to detect this by feel.
> trying in vain to reach behind a heavy computer and unscrew the two jammed retaining bolts holding in a serial, parallel, SCSI, VGA, or DVI plug with slippery bent plastic-jacketed heads
Oh yes, that brings back very bad memories.
Only gradually did it dawn on me that nothing really bad actually happens if I didn't screw the connectors tight on DVI/VGA etc. :)
(Yes, these are technically not plug-and-play, but the occasional disconnect sure beat all the banging my head on the desk and swearing profusely every time I changed something with my personal setup.)
This is what I do with these (terrible) screw-in connectors.
- Loosely plug in the connector
- Screw in the right side, relatively tight
- Use the right screw as a pivot point by pushing the plug to the left; this will properly seat the connector
- Screw in the left side until it touches, do not tighten
To unscrew, use the left side as a pivot point by pushing the plug to the right; this should loosen the right screw. Unscrew it, then unscrew the left side, which shouldn't be tight if you did it correctly.
Note: you can switch left for right and right for left.
The general idea is to wiggle left and right by using the screws as pivots. Do this to unscrew if it is too tight. If only one side is screwed in (don't do that) and it is too tight, screw in the other side and use the pivot trick.
I don't like not screwing these in as they have a tendency to come loose, especially since they also have poor feedback and chances are that they aren't properly inserted to begin with.
And side note: another thing I hate about these plugs is that when you pull out the cable, the plug tends to grab every other cable in its way. In fact, some of these plugs look suspiciously like boat anchors and seem to be just as effective at grabbing stuff.
The D-sub connectors were pretty intuitive, as far as I can remember. The 9-pin joystick ports never caused much trouble.
Making the USB connectors symmetric in their outer shape but not symmetrically pluggable was really inexcusable. When it came out I remember thinking that it was worse than any of the existing connectors. And this is supposed to be the new “universal” connector?
> The D-sub connectors were pretty intuitive, as far as I can remember.
I never personally had a problem with them, but I can attest that people did. At my first job (a small computer shop), we had multiple customers who tried to jam their D-sub connector on wrong, and broke one or more pins off. Since this was back in the day when the cable was hard wired into the monitor, that generally meant they had to buy a new monitor.
That is malicious; you cut off the D-sub connector and put another one on. I have never heard of someone replacing a monitor because the pins on the connector were damaged.
I've never heard of anyone replacing their hard wired monitor cable. And in any case I certainly wouldn't have had the skills, so I think "malicious" is an uncalled-for term here. Maybe my boss (the owner) knew better and chose to charge people for new monitors, but I definitely didn't.
As a repair shop, it is your job to repair the broken thing, not just sell them a new one. A big shielded molded cable can be fixed, either at the connector or by replacing the whole cable where it terminates into the PCB. If I had known about a shop disposing of $500-700 monitors with bad cables, I would have made a fortune.
As I have said, I had no idea you could fix such a thing. Hell I didn't even know that until this very thread, much less 20 years ago. As such it was not in the least malicious (ignorant perhaps, but that's a separate thing).
In any case, your unfounded criticism of my moral character is really irrelevant to the main point I made: it was totally possible to fuck up plugging in D-sub connectors, and I witnessed multiple people who had done so.
I apologize, my comment was too strong. I am sure many shops did this, probably because they got into retail when the home computer market was booming and weren't yet skilled in repair.
I wired a plug for the first time in my life this past year, and I've never known anyone else who has wired one. It's pretty rare to have to do something like that, in my experience. Certainly at 18 I wouldn't have had any reason to believe that was even possible.
Sure, obviously different people will have different experiences. That's why I was careful to qualify it by saying "in my experience". But we never did that at home (though oddly enough my dad did plenty of other electrical work around the farm), and it definitely wasn't something I got taught in school. Until this year when I replaced my dishwasher, I had no idea one could even replace a power plug or that some things required you to wire a power plug into them.
Wow, that is almost impressive. Especially since you will find power plugs hanging next to the light bulbs in almost any grocery store in the world. But I guess one needs to know about them to see them. Maybe we shouldn't think the users of our software are totally hopeless when they don't find all the functions we put in the menus.
Heh, as a former EE masquerading as a software developer, I would elicit gasps by...taking the case off a computer (do it while it's turned on and jaws hit the floor).
I did this once, it's not trivial. If you don't impedance match correctly, the colors get all wonky. It's a fairly specialized skill and more challenging than your typical solder job. Honestly easier to take the monitor apart and solder on a whole new cable.
At least PS/2 cables were mostly made with "keyed" rubber jackets where one side was flat and the other rounded, so in the dark you could tell which way was up.
(Though, that keyboard and mouse were different ports was stupid, so it loses points there)
Never have I been able to successfully plug in a PS/2 connector by feel alone.
I can totally see how they were designed to theoretically allow for it; practically, the tactile feedback for the correct orientation is just way too subtle.
And regarding the two different ports: I can't remember (or rather, I was never motivated to revisit the plugs once connected to experiment, since it was such a pain) – was there a technical reason for that, or could modern mainboards/BIOSes/OSes detect and correct for that?
> was there a technical reason for that, or could modern mainboards/BIOSes/OSes detect and correct for that?
IIRC, there was a technical reason in that original mainboards lacked the circuitry to detect and correct for it (I think different mouse vs keyboard interrupts were involved?), and it would have been an extra x¢ on the BOM. But modern mainboards just use two of the same circuit and in fact do work fine if you swap them (tested just now, sample size of 1).
PS/2 connectors were a regression from the AT DIN & D-sub 9-pin serial connectors for most practical purposes.
The only upside of PS/2 was a higher mouse data rate, which, particularly with high monitor refresh rates, made mouse movement smoother. You didn't get this for free in Windows though; you needed a registry tweak to tell the mouse driver to bump the sample rate.
The huge downside was that PS/2 peripherals couldn't be inserted while the PC was turned on. They needed to be present at boot time, and weren't detected if plugged in later.
And the keyboard and mouse connectors were physically the same but not generally switchable.
The fact that keyboard and mouse connectors in PS/2 were not interchangeable was really annoying. No matter how annoying USB-A is, this alone made having USB keyboards and mice a win in my book.
Keyboard was always the one closest to the mobo, since it was there first (and was originally a larger, more robust DIN connector); the mouse piggybacked on top of that later.
> unscrew the two jammed retaining bolts holding in a serial, parallel, SCSI, VGA, or DVI
Yes, it was kind of annoying, but OTOH it provided reliable robustness. Today, many connectors are just too prone to unintentional disconnects. And in the case of optional locks, like e.g. on some DP cables, one never knows if the cable has the lock or is just stuck.
I fail to understand what you mean by plugging a mini-DIN plug in "off-center". Please explain!
I find it easy to use by feel alone. You can feel when the sleeve fits the socket. Then you can rotate it in the socket until it can go all the way in.
You're reaching around the back of a computer and you can't see the exact location of the mini-DIN jack; you have to angle the plug normal to the plane of the computer back (more or less), you have to center the plug (translate it horizontally and vertically) to align it with the jack, and you have to rotate it so that the indentation (and of course the pins) align.
But… My old computers all had a slippery steel plate surrounding the jacks, and the jacks had plastic centers aligned flush with the plate, so having the plug overlap the steel plate never felt terribly different from merely having the rotation wrong.
I know everybody is still on the high from USB-C, but I'm still going to go on the record that it will be one of the biggest disasters in cable history.
1) There are going to be video-only cables, low-voltage-only cables, etc., and this time everything is USB-C, so you can literally only tell which cables work by testing all of the dozen cables instead of the 1-2 USB-A cables you have.
2) The connector is symmetrical but the pins aren't. You can see a wiring scheme of how symmetry is handled [1]; manufacturers are literally going to cheap out and not do that, and you'll have one-way USB-C cables without any kind of orientation markers.
An obviously keyed connector like FireWire/Ethernet would've solved all of USB-A's flip-it-thrice problems. And this could've been done in a backwards-compatible way where old USB-A cables were basically a skeleton key and the new USB-A cables had a ward that blocked them from fitting incorrectly.
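For what it's worth, a compliant USB-C port doesn't detect orientation on the data pins at all; it uses the CC pins and a lane mux on the receptacle side. A rough host-side sketch of the idea (CC1/CC2 are real pin names, the helper functions are hypothetical stand-ins for a port controller):

```c
#include <stdbool.h>

typedef enum { ORIENT_UNKNOWN, ORIENT_NORMAL, ORIENT_FLIPPED } orientation_t;

/* Hypothetical helpers: the host pulls both CC pins up; a plugged-in device
 * pulls exactly one of them down through its Rd resistor. */
extern bool cc_pulled_down(int cc_pin);    /* Rd seen on CC1 (1) or CC2 (2)? */
extern void route_lanes(orientation_t o);  /* point the high-speed lane mux the right way */

orientation_t detect_orientation(void)
{
    bool cc1 = cc_pulled_down(1);
    bool cc2 = cc_pulled_down(2);

    if (cc1 && !cc2) return ORIENT_NORMAL;   /* plug inserted "right side up" */
    if (cc2 && !cc1) return ORIENT_FLIPPED;  /* plug inserted the other way */
    return ORIENT_UNKNOWN;                   /* nothing attached, or an odd cable */
}
```

The flip gets resolved on the receptacle side, which is why a compliant cable doesn't need duplicate wires for both orientations.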
I have USB-C cables that look the same, but some of them carry video and some don't.
I have USB-C charging devices that will charge off of some cables, but not off the one that will charge my laptop. I now have a box of weird crappy cheap USB-C cables with taped-on notes that say which toys they'll charge.
I never had this problem with previous USBs. If the cable fit in the port, it would charge and transfer data. Sometimes you needed a different cable or port or something to go faster, but they all basically worked.
Yeah, I don’t know how anyone can say this is not a problem. I’m sure there’s some deep technical reason why each cable does or doesn’t work, but as a user I have no idea why it’s the case or how I can figure out which cable will work where.
I thought I was being smart ordering some decent Anker cables until I found out, when trying to connect a display, that apparently they only support charging? And maybe USB 2.0 transfer speeds?
So I went back looking to try and figure out what I needed instead. Depending on what the various devices involved support, I needed some variation of a thunderbolt cable (2, 3, or 4?) or a USB 3.1 Gen 1 Type C or a USB 3.1 Gen 2 Type C. But then I’m supposed to know something about thunderbolt versus DP alt mode versus god knows what else none of which is clearly labeled or explained anywhere on the devices I’m trying to connect.
But then there’s this handy table from Adafruit that says maybe not? https://learn.adafruit.com/assets/85323 — apparently only Thunderbolt supports video, or DisplayPort? And USB-C “full featured” cables are only for higher speed charging, not displays?
Find 10 sites, get 10 different answers.
Anyway, turns out a USB 3.1 Gen 2 Type C cable worked for this specific display. Why? No idea. Will it work elsewhere? No idea.
I’ve been in the industry professionally for about 25 years now and I have no idea what cable I need to purchase and have no idea how to tell before I purchase whether it will work for a specific use case. I have no idea what non-technical people are expected to do with this besides try every cable they own and pray.
I have a Thunderbolt docking station for my laptop. There's no confusion about the cable, because the cable is permanently attached to the docking station. I just plug it into the laptop in the correct port (designated by the Thunderbolt logo).
Yep, I've got USB-C cables that only operate at USB 2.0 speeds; it drives me nuts. Though USB-A suffered from this same issue: even once we got USB 3.0, it was hard to tell what speeds a port supported (3.0, 3.1, 3.2, 3.2 Gen 2).
I don't own any Apple devices myself and don't even have a MBP through work anymore, but I do have a couple of Apple USB-C cables and a spare laptop charger, still. They conveniently seem to work with every kind of device for any use. Data, video, fast charging.
However, I do have one other cable that won't do data. I think it came with a cheap charger I bought while travelling. I would certainly like it if there was a label saying charge only or something to that effect, on the incomplete cable.
I own a usb-c male to usb-c female extension cable that I connect to a yubikey. The yubikey only works in one orientation. I don't have any other bits to test hypotheses with, but I believe the cable was made with the symmetry problem described above.
AFAIK there's no such thing as a standards compliant USB-C male to USB-C female extension cable. The very concept goes against the spec[1].
You have a point though; I'm sure there are abominations out there that abuse the USB spec in all sorts of interesting ways, they just haven't been something I've encountered in my years of using USB-C for everything.
Ah, good to know. I'm tempted to respond that this means the spec has failed to address an important use case. But I'm certainly not going to learn enough of the spec to back up that claim.
You're not wrong; it's an unfortunate limitation, even though there are solid technical reasons for it. As a sibling commenter pointed out, a standards-compliant alternative would basically just be a USB hub with a single USB-C port.
Usb-c to usb-c extensions are currently explicitly prohibited by standard, both because of issues like this and ways that they defeat safety mechanisms with charging.
It is not broken. It is the best solution to a problem, even if it is imperfect.
It seems that the committee behind the spec was somehow unable to learn from decades of experience, and did not predict that people would want extension cables, and that companies would make them. That's not an actual problem for me. Not being able to reach my yubikey when I'm using my laptop at my desk, is a problem.
I have many USB-C cables that don't behave like other USB-C cables. Just this past week, I had to buy another cable for my son's desk setup because of the many USB-C cables I had, I was out of the ones that would carry 4K video and USB-C 80+W PD to his laptop at the same time. (I had a couple that worked, but they were in use for my wife and I, and the couple handfuls of others I had were capable of zero or one of those two functions.)
Kinda like saying that I have a bunch of very light indoor extension cords and you want to run an outdoor high-amperage device (both 110V in this example). Yeah, the interface for both is the same, but one is going to be used far less often and require a much more expensive cable for a reason.
What cable did you get? I’ve bought a bunch off Amazon and none work as well as the one cable I have that does work. I use a 4K + PD cable to connect my laptop to my monitor. The cable I have that came with the monitor works but I want a shorter cable. All of the ones I’ve bought cause the video to periodically cycle off and on.
I can’t find which of the old ones definitely worked (because I bought 6 at once and didn’t manage to return the duds) and though it arrived on Sat, I haven’t had time to fully test the new one.
It carried 4K/60, but I haven’t tested it across all our devices to be sure it does full power delivery to all. (If it only works on some, I have no problems sending it back until I get one that works well for all.)
I’ll try to remember to come back and report on thread if I found one that works for all my devices (and what it worked for). It’s not something that I’ve time to carefully test during the week.
Thanks. So far I've ordered 3 cables and none work. The first one was a charging cable. I didn't even know that was a thing, and it's when I started to learn about the differences between cables. My second one didn't support DP Alt Mode, which I seem to need. The third one was a 100W 4K USB 3.1 cable that supported Alt Mode, but it just didn't work reliably (that's the one that results in the screen blacking out periodically).
I’m thinking I should just spend $70 for the Apple cable because it will probably work.
Your last paragraph annoys my frugal Yankee nature enough that it's a last resort for me.
On my personal desk setup, I don’t charge my Macbook in the same cable that carries video. My wife’s setup is a single cable connection (but she has an Air, so only a single external monitor).
Working from home, I undock rarely, so two cables is not a problem for me, but I want a one cable for the kids who will dock/undock almost daily.
> USB-C has been out for years now and neither of these predictions have come to pass. Your comment aged poorly before you even finished writing it.
Number 1 absolutely has. I've been somewhat in the market for a tester that I can plug both ends of a USB-C cable into and see what voltage, speed, and other specs it supports since most of my cables have zero markings on them.
The new iPhone ships with a beautiful braided USB-C cable that's charge only. I'm going to cannibalize the USB-C cable from my external NVMe drive if I ever want to plug the new phone in. Some USB-C cables are 100W compliant, many are not. None of them are clearly labeled.
I used a Bluetooth analyser once that had a USB-C connector that only plugged in one way. It had a dedicated "wrong way around" LED to tell you to rotate it.
Kind of hilarious. It's the only example I've ever seen though so not a real problem IMO.
The cables issue is real, but I don't really see the alternative. Would everyone really be happier if all USB-C cables had to be expensive thick 40 Gb/s 100W cables? No.
Could they not have done… something else besides make everything visually identical? At least give us a fighting chance?
In a perfect world require every port to have the supported modes displayed beside it, and every cable to have the supported modes etched on the metal connector or something? Lightning bolt for PD, little screen icon for video, something?
It will fall apart as soon as cheap unlicensed electronics hit the market, but at least in general use “hook this brand name laptop up to this brand name monitor with this brand name cable” would not be a seemingly completely inscrutable trial-and-error problem.
> In a perfect world require every port to have the supported modes displayed beside it, and every cable to have the supported modes etched on the metal connector or something? Lightning bolt for PD, little screen icon for video, something?
Guess what? That's already the case! Manufacturers just choose to completely ignore that part of the specification.
They do try to get manufacturers to print symbols next to the ports, but that isn't really a full solution anyway. There are too many modes, and you have things like hubs to deal with.
I suspect the real answer is better software support to tell you why something isn't working.
Wait, there already are PD-specific cables, video-specific cables, and the like. If your cable is short enough, it's not an issue, but if you want one longer than 1 meter, you will likely have problems with 4k video on a cable that isn't specifically designed for high data rates.
The high speed and power delivery cables have chips on them, so the connected devices won't use high speed or PD if the cable can't handle it, regardless of length.
Data vs power-only is very much a problem. See this Daring Fireball note about an iPhone review: Becca Farsace plugged her iPhone into an external drive for shooting video, only to discover later that it was only useful for charging, and the recording was very low quality.
(This is something Apple should fix at the software level by alerting the user, but it highlights the problem that cables are not readily distinguishable.)
Also I have to say that the situation of USB-C is much better now than in the past.
At the beginning of USB-C you had devices that were not implementing the standard correctly, and using the wrong cable could fry a device.
Now the devices are better protected, you can charge any device with any charger. Of course the charger needs to be powerful enough but if it's too powerful it's still safe to use.
We still see quite a few new products being released which fail to implement the spec.
A common offender is devices which leave out the CC resistors, making them only work with A-C cables. But the soon-to-be-released Raspberry Pi 5 also violates the specs, making it basically only work with their proprietary charger.
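For context on why missing CC resistors only bite with C-to-C cables: a legacy A port supplies 5V unconditionally, while a compliant USB-C source keeps VBUS off until it sees the sink's Rd pull-down on a CC pin. A tiny sketch of that gate (helper names are made up):

```c
#include <stdbool.h>

/* Hypothetical stand-ins for a USB-C source's port controller. */
extern bool rd_detected_on_cc(void);   /* sink's Rd pull-down seen on CC1 or CC2? */
extern void enable_vbus(bool on);

void source_attach_poll(void)
{
    /* No Rd on either CC pin means nothing compliant is attached, so VBUS
     * stays off. A board that omitted its CC resistors never passes this
     * check -- but an A-to-C cable sidesteps it, because the A-side host
     * just puts out 5V regardless. */
    enable_vbus(rd_detected_on_cc());
}
```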
You’ve been fortunate (or far more clued in to usb standards than just about everyone) if you don’t already have multiple usb c cables with different capabilities.
It's possible for a noncompliant USB-C cable to fry electronics.
I don't know of any other connector standard where the connector doesn't clearly indicate or imply the cable's capabilities. I'm sure they exist, but generally for consumer electronics, if a cable fits, it works. Except for USB-C.
USB-C problems surely exist, but I have never heard of single-side-only plugs, and video-only cables simply don't exist, yet these are the GP's specific concerns.
I have one USB-C cable that I know can be used to charge my laptop. I can't grab any old USB-C cable and use it to charge from the exact same power brick. I have other cables from USB-C to DisplayPort, and I know of only one that does 2160p@60Hz.
USB-C has been fucking fantastic for me. I plug one USB-C cable into my personal laptop or my work laptop, and I get a USB hub for my external keyboard and mouse, my external monitor, and power, all with a single plug. Makes it so easy to swap everything in and out.
This is the dream indeed. However, to make this dream happen I had to trial-and-error my way through 6 USB-C cables before I accidentally purchased one that carried enough wattage, supported the correct video mode, had enough bandwidth for a 100 Hz video signal, and actually worked. The market for USB-C cables is a jungle.
Then it is broken. I think you have both a broken cable and a broken device, or one not up to spec. Some devices only connect to one side, which will work on a good cable, since it is connected to both sides. One of the PC manufacturers did this early on with USB-C. I can't remember who it was.
3) The most fragile part (the wafer in the center of the receptacle) is on the expensive-to-repair device, while the least fragile part is on the cheap-to-replace cable. HDMI makes this same mistake. With USB-A, at least the plastic part on the device is relatively robust, and with Lightning, both plug and socket were rather robust (albeit prone to debris incursion). With older pin-type connectors like VGA and DVI, the fragile pins are completely on the replaceable cable.
I'm not sure this is true; the part in the cable is just tiny pins and far more vulnerable than the wafer. For years now I've dug woodchips and other junk out of my phone's USB-C socket with paperclips and never came close to damaging it. It seems sufficiently recessed to prevent all but the most aggressive attack.
Edit: in fact, trying now, it seems to be so perfectly recessed that I can't even push on it at an angle with the connector; it has to go straight in a bit before it even touches, on any of my devices.
I mean, I'm referring to things I've had to deal with. Failure isn't obvious until the user complains of random disconnects, and then you find that although everything looks fine from the outside, you can wiggle the wafer inside like a loose tooth. The plastic of the wafer has cracked and is being held in place only by the metal in the contacts.
I'll grant that it's robust in that it continues to work more than I'd expect, for example continuing to charge the laptop but not working reliably for the external display. And once connected, sure, it's secure.
I suspect it's not the cable mating that causes the failure, but other foreign objects… keys in the bag or something… that manage to dislodge it. And some laptops have more play in the connector than others, which I think allows the connector on the cable end to hit the wafer with an angle of attack sufficient to dislodge it. Some of my users handle heavy bulk materials all day and are anything but gentle with computers.
In any case, I'm sure the designers considered this, and far be it from me to second-guess their compromises. It is what it is. But if the port has failure modes, this is one of them.
It's almost as if the designers have considered this potential failure mode for devices that often get dropped even while plugged into a charger :)
I can empathize with many of the gripes people have with USB-C cables and supported modes being confusing, but the port is really, really well thought out: make the cable the part that fails more easily [1]; make it so that the plug only minimally acts as a lever while partially or fully plugged in; cover the pins on the plug (because otherwise people will inevitably touch them, and these cables can carry up to 48 V).
[1] That part has actually worked well for everything but my Yubikey :( USB-A and an adapter it is for the next one.
A significant issue probably affecting your Yubikey is that there are very few male USB-C plugs available. The ones that do exist are mostly designed for USB-C cables, which use a tiny 0.8mm-thick "paddle card" fully embedded in the plastic cover.
Alternatively, there are "brick nogging" connectors which stand up straight from the PCB and are therefore unusable for a Yubikey, or "sink board" ones which require a custom jig for assembly.
Meanwhile with USB-A you don't even need a male connector at all - just have a bit of your PCB stuck out in a vaguely USB-A-looking shape and it'll work just fine.
> The most fragile part (the wafer in the center of the receptacle) is on the expensive-to-repair device, while the least fragile part is on the cheap-to-replace cable.
Why do you think that the wafer is the most fragile part?
One might argue that the latch mechanism would be the part most exposed to wear and tear due to mechanical stress incurred by plugging and removing. Those side latches are located in the plug, not in the receptacle.
Personally my last two phones have had this. It starts slowly - the cables occasionally don't work, then some cables won't work at all, then no cables work without 2 minutes of gentle manipulation, eventually it becomes almost impossible to charge. You waste so much time on this before it gets so bad that you can't live another day with the phone.
I've had this happen about three times now, but every time the actual problem is pocket lint inside the connector. After I clean it out it is back to working like new. It's not always obvious that it's lint until you start scraping back there since it gets jammed pretty hard when plugging in the cable.
I was told this exact opposite argument about Lightning being worse than USB-C. The line of reasoning was that the springy part breaks and in USB-C the springy bit is in the cable, while in Lightning the pins are sturdy.
I broke the ring on an M.2 Wi-Fi card, but installed the cable onto it anyway. I think the part I bent off will probably rip off and get stuck in the cable the next time I take the connector off the card. Here's hoping I'll never have to do that...
I can certainly see (1) happening, especially with cables that come with the device they are expected to plug into. It already widely happens with fallback to slower bit rate or lower power charging.
But (2) seems very unlikely. Most consumers just won't even consider that it has an orientation, and after it fails a few times they'll chuck it in the bin or send it back. If it comes with a device (e.g. a printer or monitor comes with a non-reversible cable), they're likely to send the whole device back. That would be a ridiculous false economy for the manufacturer.
(1) as in Video only, as in they won't work to top up the battery in random Bluetooth gadgets and the like? Amongst compliant cables, you might occasionally find some that are rated 40 Gbps but do not identify as high voltage capable. But I suspect that once manufacturers stomach the wire cost of those fat 40+ cables (display!), skimping out on PD capability just won't make enough of a difference to abandon efficiencies of scale. PD cables that are no/low bandwidth? Sure, those will be with us forever. But "display only" will be a rarity.
What would be the alternative? An entirely new physical connector every time manufacturers feel the desire to make this year's model slightly more capable than last year's? Some form of "physical semver" where "not all new features are possible with all old cables" mandates a new plug shape?
If we were still limited to domain-specific connectors (one for storage, one for networking, one for audio, one for video, one for each type of user input) we'd still run into the same (non)issues: unless every revision of, say, DisplayPort came with its own unique connector, without digging a little deeper (checking some version/capability symbols) you don't know whether a given cable supports the latest feature set or not. The universal in USB is not the problem.
I had a USB-C Xiaomi phone that at some point only allowed being plugged in in one orientation. I guess one side failed, but then again, that wasn't the only failing of that phone, nor even close to the most annoying.
That might not have been a symmetry problem per se.
It could be that the phone correctly determined the orientation, but a pin was dirty/broken so it couldn't be used. When flipping the cable around, the pin in use changes to one that isn't dirty/broken, and so it works fine. USB-C may have a ton of pins, but that doesn't mean all of them are needed to charge, so flipping it can move a broken pin into the unused pins.
Oh, yes, very likely it was a broken or dirty pin (that's what I meant by "failed"). I wouldn't be surprised if it was somehow a software problem as well; apparently, for simplicity, some cheap USB-C port setups are wired as two ports.
Not ideal for sure, but I don't see the cable disaster you claim; specifically, what clear improvement would you get from a multiplication of connectors?
Cheap, reliable connectors are hard to design, and there are only so many solutions that work, and economies of scale work wonderfully with this kind of mass production (the good-enough converges to the reliable and cheap). Also, the backward compatibility is real, and not to be dismissed too lightly. Say, for a USB-A connector, would you really go the route where a USB 3 cable and device should NOT work with your older computer?
There will be waste, there will be disgruntlement, but at least a real diagnostic will be possible (instead of the vague indications like «use provided cable» we have for HDMI and DisplayPort). For power, you can already find cables with a mini LED display on the plug to show max wattage once plugged in.
Limited cables you buy will quickly get labels showing their specific capability (4K/8K, PD only, 100W/60W). The general case for cables provided by the manufacturer is that you keep the cable with the device, so no real confusion there.
I know many have a cable drawer full of just-in-case cables for a case that never comes, and would dream of having it blissfully replaced by unlabeled USB-C cables, but this is just not how most people handle the problem. They keep the cable with the device, and learn what they need before or after buying a replacement, or better, by asking someone in a shop; those still exist, as far as I know.
USB-C is actually surprisingly tricky to screw up.
For example, video-only cables are only possible if they have active electronics - so that means you are buying a very expensive cable which somehow doesn't follow the specs. I know Monoprice does make one, but it starts at $115 and it is explicitly marketed and labelled as video-only.
Low-voltage cables are the same thing. All passive cables can do 100W using 20V, and the 240W ones using 48V are explicitly marketed as such. There are some optical cables which do not transfer power at all, but again you're looking at specialized >$100 stuff. You don't just accidentally grab one of these out of your Box Of Cables.
As far as I am aware, no cable manufacturer cheaped out on the wiring because there is zero reason to. You can't simplify USB 2 cables like that, and anyone selling USB 3 cables wants all pins for it to actually be usable as a data cable.
The biggest problem with USB-C is that the USB Implementers Forum created a very clear design to indicate exactly what each device and cable is capable of - but then manufacturers decided it was ugly and just completely ignored it. If everyone just followed the damn spec there wouldn't be nearly as much confusion.
> You don't just accidentally grab one of these out of your Box Of Cables.
Yes you do. You set up the studio and wire it all up, with all of the devices, and with the right $100 wires and $30 wires and $2 wires. (Along with $50k in equipment.) But then you need to move out for some reason, so you tear it all down and it all gets boxed up and loaded into a truck and then you have nowhere to set that studio back up so then you end up with a box of cables in your living room that then gets shoved into your closet. Now you have a box of cables, some more expensive than others and you are just gonna randomly grab one until you find one with the right ends on but now also the right innards as well which means you'll have to test it.
I am not looking forward to staring at a tangle of USB-C cables and extracting the one that actually meets my needs. Still better than trying to pull a DB9 through, but still.
I think this is a bit hyperbolic. Once you try to implement any standard in practice to billions of people & devices you're going to find the same constraints and similar trade-offs everywhere.
Manufacturers are going to cheap out, always. There's no way around the fact that goods can get on the market claiming a thing that they don't do (perfectly) in practice. Tech is littered with semi-compliance. I hate to kick the can down to 'let consumers figure it out', but there's hardly a realistic alternative; you just can't police ~10 billion cables sold a year across ~10 million retailers. I haven't heard of your first example actually happening anywhere; it sounds more theoretical than widespread in practice in 2023, 8 years into this USB-C thing.
As far as cables with different capabilities, what is the alternative here?
1. Physics of signaling demand certain capabilities require higher quality / more expensive cables. Zero chance you get people or manufacturers lined up around the idea of highest common denominator (e.g. 40gbps capable $30 cable when consumer wants a $3 charging cable), and that's sorta moot anyway because...
2. Standards evolve, and we're all tired of playing the game we've played for 50 years in tech of new physical connectors every few years.
It's either a) you have the same physical connector and different cables, or b) different physical connectors that briefly use the same cables before they become differentiated when standards evolve.
The reality is 'it works but not as good' (speed or charge-wise) is a superior tradeoff to 'I have to buy a totally different cable' in the high % of use cases where 'not as good' is possible, which is stuff like 'charge my phone' or 'plug my printer/audio device/whatever in'.
Honestly the only unforced errors I see with USB-C are consumer education related. USB versioning should be simple (3, 4, 5), not intentionally obtuse ("3.2 gen 2x2"), and something like iconography should be added to indicate premium cable capability (specific higher voltage & bandwidth capabilities) to avoid playing the plug-it-in-and-try-it game.
"I have to buy a totally different cable" is something that most of us with over 20 devices with USB-C ports on them have probably experienced, only it's worse than that as it's more like "I have to buy several different totally different cables and try to remember to return the ones that also don't work".
For a time at work, our desks had different docking stations for employees with Mac laptops or HP laptops. USB-C "standard" notwithstanding.
So whats your solution? We all get that this is imperfect, the question is can the solution be improved upon within the constraints, and if not, what constraints should change?
Because USB-C does a lot and touches a lot of things, I think there's a tendency to place the blame on it (and specifically its cables) for any circumstance where 'plug does not work as expected', which I agree is a frustrating problem that isn't fully worked out.
I bet your HP docks are like Dell power adapters, which only provide specific volts/amps and maybe use some proprietary negotiation and aren't great device-agnostic chargers. I've experienced this firsthand, and unfortunately the blame lies squarely with HP/Dell/your IT department. It's also a sort of 'stage 1' problem in USB-C's rollout, in that manufacturers are used to doing things their way (e.g. proprietary power bricks), and the feedback loop of customer expectations hasn't yet reached a point to pressure them to change. When customers say 'this sucks, it doesn't charge my iPad', HP/Dell/whoever begrudgingly realize they need to up their charger game, and little by little the problem is solved.
I think politics might be the best lens to see USB-C through, because USB-C will only work as expected when consumers are providing that pressure. Hence my point about consumer education and clearer labeling being the unforced error of USB-C: moving up the learning curve quicker will minimize these problems and shake manufacturers into offering things that meet user expectations and the promise of USB-C. Ultimately someone has to make these things though, and manufacturers all have their own misaligned incentives. It's the job of standards to meet them where they are and provide a path to aligning their incentives with better user experience. The reality is it's herding cats.
I haven't seen anything in this thread or other USB-C threads that has offered a solution that improves upon the spec itself that is realistically possible within the constraints USB-C is operating in. That's unfortunate but I think represents the idea that USB-C is, to paraphrase, 'the worst type of connector, except for all the others'.
There isn't a particularly Pareto-better solution (other than better labeling, as you suggest).
One of the things I find particularly aggravating though is the discussion that goes approximately "it's outrageous that I should have to buy a new cable for my device <glares menacingly at Apple pre-USB-C>" followed almost immediately by "yeah, it turns out I have to buy a new cable for my device <because reasons>..."
USB-IF should publish a table of every possible cable/adapter/receptacle type, with a short alphanumeric code that can be printed or molded onto the connector.
The set of permutations is too large to fit into an icon, but a lookup table allows for unlimited detail.
Power-wise it is 60W, 100W (deprecated), and 240W. Data-wise it is 480Mbps, 5Gbps, 20Gbps, 40Gbps, 80Gbps. A simple power/data rating like already specified[0] is enough for >99% of cables.
Totally, it's a very solvable problem with not many variants.
It's unfortunate the USB group chose these particular labels. They're so ugly that you'd never want one on a cable, and they don't reduce to small sizes well. It's like they were never designed to be put on cables, or they never consulted anyone who ever used or sold a cable as to whether they were going to use it.
I'd have preferred something that could be depressed into the plastic mold of the connector with two icons, a bolt and a dot:
- default (USB2, whatever min watts) is blank cable
- 60w has one bolt, 100w+ has two bolts
- 5gbps has one dot, 20 has two, 40 three, 80 four
That way you avoid the uncleanness of labels and their eventual scratching off, while maintaining at-a-glance (and even physical feel) understanding of cable capability. It also scales to future improvements. Honestly I think this would solve like 90% of people's complaints with USB-C.
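For what it's worth, the whole scheme fits in a couple of branches. Here's a minimal sketch in C, using the tiers proposed above (the thresholds are this comment's proposal, not anything from USB-IF):

```c
#include <stdio.h>

/* Sketch of the proposed embossed-marking scheme: bolts for power,
 * dots for data rate. Tiers follow the comment above; they are
 * illustrative, not an official USB-IF labelling rule. */
static int bolts_for_watts(int watts) {
    if (watts >= 100) return 2;   /* 100 W+ : two bolts   */
    if (watts >= 60)  return 1;   /* 60 W   : one bolt    */
    return 0;                     /* default: blank cable */
}

static int dots_for_gbps(int gbps) {
    if (gbps >= 80) return 4;
    if (gbps >= 40) return 3;
    if (gbps >= 20) return 2;
    if (gbps >= 5)  return 1;
    return 0;                     /* USB 2.0 only: blank */
}

int main(void) {
    /* Example: a 100 W, 40 Gbps cable would carry two bolts and three dots. */
    printf("bolts=%d dots=%d\n", bolts_for_watts(100), dots_for_gbps(40));
    return 0;
}
```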
I am not aware of this problem existing in practice.
Chargers and devices of course have a wide variety of power ratings, but those are generally all compatible with each other. Just get a charger with a higher rating than the device and it works.
When it comes to cables, pretty much everyone just uses the standard 20V 3A / 20V 5A / 48V 5A cables. The only exception I am aware of is Oppo's SuperVOOC.
A 140W MacBook USB-C charger cannot charge a OnePlus 8.
A 67W Poco (Xiaomi) USB-C charger can barely charge a Pixel 6 (it basically runs out of battery while charging; 200-300mA are transmitted according to an app).
The standard is also just terrible even if followed to the letter. The connectors are just too fragile for a day-to-day interconnect, and a fire risk when higher currents are involved.
I've now had to replace 2 USB-C host ports (on expensive Apple devices, where an "official" repair would cost ~50% of the value of the device) despite taking reasonable care of my devices (under which care no other connectors ever failed).
Examining them under a magnifying glass showed that the metal pin delaminated from the plastic middle part of the host connector and was ever so slightly skewed towards the adjacent pin, presumably making enough contact/interference to make the whole thing fail.
In a sane design, a misalignment/skew of <1mm would be well within tolerance and would be a non-issue. Worse, despite the machine being from the same brand that ushered all this crap onto the world, there is no software support or notification to say that something is wrong - the machine just silently stopped charging after a random delay. Very annoying when you plug your machine in to charge and only realize (at the most inconvenient time) that it silently stopped charging and you've now drained whatever battery was remaining.
Even with the new ports, I can make it lose Thunderbolt/DisplayPort connection by bumping the connector. It's probably a mismatch of tolerances between connector and port (maybe my cables are out of spec), but it's never been a problem with any other connector.
Are you using Apple, or major-vendor cables? I'm curious because delamination of a pin on 2 separate ports implies something catching/dragging on it; maybe there is something either poorly designed, or simply mangled, in the end of one of your cables.
Major vendor cables, and the delamination was one connector's "finger" so not really cable-specific. The second connector looked fine at casual glance but there was something wrong with it as it wouldn't even hold connection with a USB2 device - too much slack.
I don't recall having this issue with any of the connectors it replaced. I have devices that are close to a decade old and all their USB/Ethernet/HDMI ports are still working just fine.
It's a terrible design that prioritizes form over function/reliability.
From my understanding I think that is not technically possible because to pass video they need the data lines but maybe I'm mistaken?
IIUC again even if it did work I think these would be non-compliant. People hell-bent on doing non-compliant things would do it irrespective of any design.
> low-voltage only cables
That's a given due to physical constraints (length, diameter: would you accept all cables being 0.8 m long, thick, and unbendable?). It was so with USB-A too, as well as non-standard power delivery, at the risk of setting things on fire or destroying devices. USB-PD makes it so that you basically can't fry anything or melt a cable; it falls back to the best possible option through negotiation. But then again, people hell-bent on doing non-compliant things would do it irrespective of any design.
> literally manufactures are going to cheap out and not do that and you have 1-way USB-C cables without any kind of orientation markers.
Non-compliant for sure. Ah, yes, people hell-bent on doing non-compliant things would do it irrespective of any design.
At least with USB-C+USB-4+USB-PD we get a fighting chance.
> An obviously key'd connector like firewire / ethernet would've solved all of USB-A's flip it thrice problems
Rumour has it that USB-B, which is keyed, has five positions due to its squareness. FireWire and Ethernet have the same state superpositions as USB-A. Hell, I've seen people shove an Ethernet cable in the wrong way and have it fit (breaking the infamous clip in the process). USB-C? I lay it basically flat and clip it in, barely looking at the connector and not even taking a glance at the port. Worst case, it needs a few degrees of rotation.
My APC UPS is connected to my computer with a USB-A to rj45 cable. It is the most cursed cable I own, narrowly beating out the type-A to type-A usb2 cable I needed to buy to get a HTC vive to work. (I later "upgraded" to a type-c cable with A-to-C adapters on either end.)
IIRC, the Ethernet standard means ports should tolerate 5V, but I'm still spooked someone's going to use the cable thinking it's something else.
Not every cable. Not if you make the connector a closed standard that can't be used without ruinous licencing fees and keep a squad of patent attorneys on retainer. Thankfully USB is not one of those.
Maybe it’s news to iPhone people (and I say this as someone who's only used iPhones since he got a smartphone over a decade ago, other than for 6 months where I used a Windows phone and 3 months of an Android phone), but USB-C wasn’t invented with the iPhone 15.
It’s been around a long time and most people have managed just fine, and absolutely loved the common cable. And everyone hated the iPhone user because you had to pull out a new cable to charge it.
These would be absolutely trivial to answer if the manufacturer followed the "cable logos" guidelines[0] - but pretty much every manufacturer conveniently seems to forget those.
4. Yes, all USB-C to USB-C cables can charge at a minimum of 60W, which should be sufficient to charge almost anything
5. It depends, but the output capacity is printed on the power supply[0]. (Compare to USB-A, where the most you could do is 12W.)
6. The minimum of the cable (60W for any cable, 100W for some cables, 240W for a very few[1]) and the charger – see the sketch after the footnotes below
[0] Annoyingly, this problem is compounded when using a power supply that can output (say) 100W on one USB-C port, but when both are used, the limit on the two ports is 60W / 40W.
[1] There are vanishingly few products (power supplies, cables, and devices) that support USB EPR. If you have one of these, just use the power supply and cable the device came with; this is no worse a situation than with devices that used proprietary power cables.
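If it helps, point 6 above really just boils down to taking a minimum; a toy sketch (illustrative only – the device may of course draw less than this bound, and 5A/240W operation additionally needs an e-marked cable):

```c
#include <stdio.h>

/* Illustration only: the power a device can negotiate is bounded by both
 * the charger's advertised maximum and the cable's rating (60 W for any
 * compliant C-to-C cable, more only if the cable is rated/e-marked for it).
 * The device may of course request less than this bound. */
static int max_negotiable_watts(int charger_watts, int cable_watts) {
    return charger_watts < cable_watts ? charger_watts : cable_watts;
}

int main(void) {
    /* 100 W charger + plain 60 W cable -> at most 60 W. */
    printf("%d W\n", max_negotiable_watts(100, 60));
    return 0;
}
```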
> Some “low wattage” specification USB-C devices are compliant with USB-C but are only compatible with USB-A power bricks because they require 5V:
This is because they are not USB-C compliant, despite physically featuring a USB-C port. The USB-C spec calls for any USB-C sink (i.e. device that consumes power) to include a pull-down resistor (Rd in the diagram on page 4: https://www.ti.com/lit/wp/slyy109b/slyy109b.pdf?ts=169690039...)
Devices that do not include this resistor, and are therefore unable to be powered with a USB-C to USB-C cable, are noncompliant and should not exist.
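Roughly how that plays out on the source side (a simplified sketch, not the real Type-C connection state machine; the Rd/Ra handling is reduced to the bare minimum):

```c
#include <stdbool.h>
#include <stdio.h>

/* Very simplified model of a USB-C source port controller.
 * A compliant source only drives VBUS after it sees the sink's Rd
 * pull-down on exactly one CC pin (the other pin may show Ra if an
 * e-marked/active cable is attached). A "USB-C" gadget that omits Rd
 * looks like an empty receptacle, so VBUS is never applied. */

typedef enum { CC_OPEN, CC_RD, CC_RA } cc_termination;

static bool should_enable_vbus(cc_termination cc1, cc_termination cc2) {
    bool rd1 = (cc1 == CC_RD), rd2 = (cc2 == CC_RD);
    return rd1 != rd2;           /* Rd on exactly one CC pin -> sink attached */
}

int main(void) {
    /* Compliant sink: powered. Resistor-less "USB-C" gadget: ignored. */
    printf("compliant sink: %d\n", should_enable_vbus(CC_RD, CC_OPEN));
    printf("no Rd fitted:   %d\n", should_enable_vbus(CC_OPEN, CC_OPEN));
    return 0;
}
```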
> Some USB-C cables are only 31 watts, and some devices can’t charge with less than 60W
The USB-PD handshake is performed independently of the cable, and uses the CC wire that is mandatory even in USB 2.0 C-to-C cables (page 76 of [0]): https://host.zlsadesign.com/Bk62vSMZp.png
You're right about the 60W point. Anecdotally, every device I've ever tried to charge has had no problems at 60W (although of course they'll charge more slowly than if they had 100W), but this is definitely not guaranteed.
There is a certain combination of cable+adapter+device that can ruin your device. I don’t remember the details. I heard about it on the Accidental Tech Podcast
According to the USB-C spec page 37[0], even USB 2.0 C-to-C cables will support USB-PD at 3A (negotiated with the CC wire which is mandatory even on 2.0 C-to-C cables.)
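For anyone curious what that negotiation amounts to, here's a toy model (a sketch only – real USB-PD messages are binary PDOs with timing and protocol rules this ignores, and the numbers are just typical examples, not taken from any particular charger):

```c
#include <stdio.h>

/* Toy model of a USB-PD contract: the source advertises a list of
 * fixed-voltage offers over the CC wire, the sink requests one it can
 * use, and the source switches VBUS to the agreed level. */

typedef struct { int millivolts; int milliamps; } pdo;

static pdo pick_contract(const pdo *offers, int n, int sink_max_mv) {
    pdo best = offers[0];                     /* the 5 V offer is the fallback */
    for (int i = 1; i < n; i++) {
        long best_mw  = (long)best.millivolts * best.milliamps / 1000;
        long offer_mw = (long)offers[i].millivolts * offers[i].milliamps / 1000;
        if (offers[i].millivolts <= sink_max_mv && offer_mw > best_mw)
            best = offers[i];
    }
    return best;
}

int main(void) {
    pdo offers[] = { {5000, 3000}, {9000, 3000}, {15000, 3000}, {20000, 3000} };
    pdo c = pick_contract(offers, 4, 9000);   /* a phone that tops out at 9 V */
    printf("contract: %d mV @ %d mA\n", c.millivolts, c.milliamps);
    return 0;
}
```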
> if HN commenters can’t figure this stuff out, what are the chances that normal people can?
Plugging it in would probably be a good start. I can't name a single time where I've ever mistakenly identified a USB-C cable. Sometimes I've plugged it in when it wasn't connected to the wall, but I think worrying about non-compliant USB-C cables is no more relevant than caring about non-MFI certified Lightning cables.
Until you take a USB C cable with you in your laptop bag to connect with your portable USB C external monitor that gets power and video from one USB C cable and find out you have the wrong USB C cable with you.
Hate to say it, but it's entirely PEBCAK to not select a Thunderbolt cable when they want DisplayPort alt-mode. It's why the damn cable and marketing even exists in the first place.
99% of people only care about power (and optionally data), so that's what the spec focuses on.
Technically you could pipe compressed video over USB 2.0. It's not a part of the base specification though, so I don't see why either of these features are worth bringing up. iOS didn't even support class-compliant USB hardware until last year, so... it's interesting that you'd drill down on that.
Regardless, the fact that DP-Alt-Mode exists is not a refutation of the base USB-C spec. You're either spec compliant or you're not; same as Lightning without the license fee.
As far as I know, only the amperage is negotiated; every compliant cable needs to support at least 20 V.
And even then, the minimum is also 3 A, which allows for 60 W to be carried on even the cheapest cables – enough for many use cases. I actually like having lighter and more flexible cables for most of my devices.
> The connector is symmetrical but the pins aren't. You can see a wiring scheme of how symmetry is handled [1]; literally manufactures are going to cheap out and not do that and you have 1-way USB-C cables without any kind of orientation markers.
With all the many USB-C headaches I’ve heard of (and only very rarely encountered myself), I’ve never seen that happen.
The most common problem must be using a USB 2 only cable for a use case that requires USB 3 speeds and/or video.
I wish there was a requirement that USB C cables must have sort of ID chip that can be easily read that tells what the cable supports, so that we could have simple testers that you can plug a cable into and be told what speed and power it supports.
That exists. It's called an e-marker and you can buy testers for about $60 that will read it for you. If a cable doesn't have one of those markers, then odds are it can't handle more than 3 amps or more than USB 2.0 speeds.
It is kind of annoying that that functionality isn't built into phones and PCs though. There was some talk about building support into the Linux kernel[1] but it doesn't seem like that went anywhere.
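For the curious: the e-marker answers a Discover Identity request addressed to the cable itself (SOP') with data objects describing its current and speed rating. A rough sketch of the decode step – note the bit layout here is invented for illustration; the real Cable VDO fields are defined in the USB-PD spec:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative decode of a cable's capability word. The field layout
 * below is made up for the example; a real e-marker reply follows the
 * Cable VDO format from the USB Power Delivery specification. */

struct cable_caps { int max_amps; int max_gbps; };

static struct cable_caps decode_cable_word(uint32_t w) {
    struct cable_caps c;
    switch (w & 0x3) {              /* pretend bits [1:0] = current rating */
        case 2:  c.max_amps = 5; break;
        default: c.max_amps = 3; break;
    }
    switch ((w >> 2) & 0x7) {       /* pretend bits [4:2] = signalling */
        case 0:  c.max_gbps = 0;  break;   /* USB 2.0 only */
        case 1:  c.max_gbps = 10; break;
        case 2:  c.max_gbps = 20; break;
        default: c.max_gbps = 40; break;
    }
    return c;
}

int main(void) {
    struct cable_caps c = decode_cable_word(0x0E);  /* example: 5 A, 40 Gbps */
    printf("cable: %d A, %d Gbps\n", c.max_amps, c.max_gbps);
    return 0;
}
```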
Except for power, that is all determinable from the cable itself. Every USB-C cable supports power and USB 2 data. A resistor determines the power: legacy USB, 1.5A, or 3A; every USB-C cable supports that. If the USB 3 pins are connected, then it supports USB 3 data and alternate modes. I think USB 3 has a negotiation protocol to figure out the data rate. There is also negotiation about power delivery on the USB-C specific pins.
Everyone here constantly complains about USB-C any time it comes up, but my experience has been that it's fucking awesome that so many of my devices have the same cable type: phone, tablet, laptops, USB battery packs, VR headset, Switch, Deck, various game controllers, various retro game handhelds, portable monitor, antigen COVID-19 detector, car battery starter, all kinds of stuff really.
And now that lightning is going the way of the dodo, soon I'll finally be able to consolidate to basically just the one cable type for trips.
Up until this year I've had exactly 3 USB-C cables: Nintendo Switch, which is always plugged into its dock; laptop charger, also always plugged in at my desk; and phone charger. No real room for the mistakes they're describing.
Nintendo Switch is non-compliant [1] so you've already seen (2) it just hasn't been a big enough problem to be visible to you. I suspect a big reason for that is the Switch is a wall<->USB-C charger so you don't try to connect it between many USB-C devices.
I was responding to the "out for over a decade" part; how long they've been available doesn't mean much if they haven't been very common for consumer electronics for that long. Up until this year I've had only 1 such cable, my phone charger, that could get mixed up in any way... and that one couldn't be mixed up because I didn't have a second unused cable to mix it up with.
I've been using USB-C on basically every consumer electronics device I've used over nearly the past decade. Over the years I have accumulated:
1. A laptop with a USB-C power/data/DisplayPort port
2. A phone with a USB-C power/data port
3. Multiple USB power banks which use a single USB-C port for both charging and discharging
4. A smartwatch with a dock that uses USB-C
5. Multiple flashlights with USB-C charging
6. Multiple bluetooth speakers with USB-C charging
7. A pair of earbuds with a case that uses USB-C charging
8. Dozens of various USB-C cables and power bricks, some of which came included with the products above
So far I've had zero issues with any of them. Sure, not all cables support full charging speeds/data rates, but generally speaking I can just plug things in and expect them to work in some fashion.
Maybe to iPhone users USB-C feels like a new thing? To everyone else it's been a mature ecosystem for a long time now, which is why a lot of the comments in this thread commenting about it as if it's a brand new, unproven technology seem very strange to me.
Edit: Just remembered I also have an air duster and an electric toothbrush that use USB-C. Those didn't immediately occur to me because I don't have to charge them often. There might be more I'm forgetting...
I'm an Android user, and for your points 5, 6, 7, and 8 all of mine are USB-A to Micro USB (except for a flashlight bought this year which came with a USB-A to USB-C cable, and I use wraparound bluetooth headphones instead of earbuds and get the feeling those designs are just extremely rarely updated).
No displayport or smartwatch here, and my power banks all came with A-to-micro or A-to-C cables (mostly the former) - there's a convention in that ecosystem that power flows from the A connector to the other end.
> there's a convention in that ecosystem that power flows from the A connector to the other end
That's not just a convention, it's a vital part of the specification :)
USB-C devices go to significant lengths to make it hard for users to violate that invariant using an unsafe combination of otherwise safe adapters – that's why there is no USB-C-to-A adapter, for example (because that would let you build an A-to-A cable, and USB-A hosts are not required to detect possible loops/short circuits; USB-C hosts are).
I mean, I feel like they prefer to keep using A-to-C or A-to-Micro because of power flow instead of packaging C-to-C (and from some quick Amazon searches, C output still looks rare, the majority of what I'm seeing is C input and A output on the power bank itself).
Almost always, as I've noticed only after grabbing a friend's power bank and a C-to-lightning cable for a day trip...
I believe a USB-C device port is easier to implement than a host/power supply port, since the host is only allowed to supply Vbus after ensuring that it's connected to a device (by actively probing the resistor connected to the CC pins etc.), whereas a device port only needs to present that resistor. After all, even a passive C-to-micro-B cable can do that.
Yes, I have seen some products still using micro-B. I've generally just avoided them because I don't want to have to carry another type of cable, so maybe that's colored my perception a bit of how widespread USB-C usage is.
For the products you've purchased that came with an A-to-C cable, have you tried using them with a plain C-to-C cable? In my experience usually that will "just work". (Maybe there are some power banks out there that don't support outputting power through their USB-C port. If those exist I've managed to avoid them so far.)
Just tried plugging my phone into one of those power banks like that: The phone recognized it was plugged into USB, but didn't start charging. The light on the power bank did turn on though, so I redid it with a USB power meter: The power bank was sucking power from my phone at about 3.4 watts.
> The power bank was sucking power from my phone at about 3.4 watts.
At least on Android, there's an option to change the power direction (IIRC, it's in the same place where you choose the data connection mode). I have to use that option to switch the direction whenever I use the USB-C cable to charge my phone from my laptop, otherwise it tries to charge the laptop from the phone.
I know I have that option for wireless charging, but at least for this power bank I can't switch the direction. It says on that screen that the connection is controlled by the other device and fails to switch.
The comment you're referencing is claiming that the Switch is non-compliant at the power delivery protocol level.
Assuming that that's true, this has nothing to do with being wired incorrectly, nor is it an implementation mistake that's facilitated by the design of the USB-C plug being reversible. It could literally happen with USB-PD over USB-A/B.
USB-C plugs are supposed to have 2.0 data pins only on one side. It is only the sockets that should have them on both sides. The other data pins are negotiated.
When USB-C was brand new, I got a flawed breakout board where only the socket's data pins on one side were connected. I did not notice it until it was soldered together and the device got power but no connection when the cable was connected in one orientation. But then I had already built a metal case to fit that breakout board perfectly, so I just left it in.
Bonus: Buy 5 phones from 2023 with their chargers. Buy 5 random USB-C cables. Use 5 cables that were shipped with random stuff (rechargeable lamp, etc).
In my experience you can barely get the phones to survive without their "own" wall charger + cable combination. It's really terrible.
I don't understand how a €120, 140W MacBook USB-C charger can't charge my OnePlus 8 at all. Not even at 5V 1A.
> There's going to be video-only cables; low-voltage only cables; etc and this time everything is usb-c so you can literally only tell which ables work by testing all of the dozen cables instead of the 1-2 USB-As you have.
Whether or not this happens, it will have nothing to do with USB-C's merits as a connector. You could (theoretically) have the same mess happen with USB-A connectors as well.
But we tried the port for each use case thing in the 90s and it sucks. VGA, Printer port, ps/2 for mouse etc. It's pretty easy for me to tell which usb cable works with my monitor dock because it's a really thick one.
If the standard included specs on the enclosure, then maybe we wouldn't need three tries. Especially when plugging these in blind, like the back of a monitor, I wish the case had a funnel shaped shroud so you know the male end was aligned correctly even though it might be reversed.
Another trick is that the USB standard says the USB logo should be facing up, so you should always try it first with the logo facing up. Not every device follows that standard and not every device has a clear top/bottom, but this really cuts down on the proportion of failed attempts.
The pattern on the aluminum of the plug is also helpful in cases of predictable slot orientation (like a laptop, except a Dell Latitude from back in the day that had upright connectors on the back, unless I'm misremembering). The side without the seam down the middle, and with a tiny rectangular cutout centered but slightly lower than the other two cutouts, should face up. In the dark, the seam can be felt by lightly scraping with a fingernail.
One trick used to be that the USB trident logo (embossed into the cable) always faced upward when plugged into a USB port on a computer.
Sideways ports may be a crapshoot; I'm not sure if their orientation is standardized. And I'm not sure whether the logo-faces-up rule still holds for modern USB versions.
Counterpoint: I knew a few people a while back who wrecked FireWire audio gear because they forced the connector in backward and powered it on.
Even now I run a small repair shop as a side business. Broken HDMI connectors because someone forced the connector in upside down aren't exactly common, but they're not exactly rare either. Some people think stuff really needs to be forced together, apparently.
> Using electronics needs a drivers license like equivalent.
I bet everyone seriously wishes there were a "seriously honestly just not a fucking idiot" license. The fact that we know which way a connector goes puts us in a whole different league from the rest of the population; most people are so braindead stupid that they can't even fathom being smarter. Sometimes I wonder if being neurodivergent is even worth it. Do I even want the privilege of knowing enough of the bare minimum to outclass half the Earth, if it means that I'm constantly depressed by how stupid everyone else is?
BTW I wouldn't pass a graduation exam if my life depended on it, because my brain is not a textbook database of things that I don't care about. But that just means my ADHD has gotten the better of me.
The issue isn't knowing which way a connector goes; the problem is that some people act like gorillas and try to use brute force to force connectors into place, and then don't understand why it's broken.
Normal people will try to push a plug in with a normal amount of force, find it doesn't fit, then realize they're doing it wrong and try something different. (Or in the case of USB-A, turn it over and try again, find that's wrong too, then turn it over yet again and find it fits perfectly.) But there's some sector of the populace that just doesn't seem to understand how much force is appropriate for things like delicate electronic connectors. I'll bet that if you took a tour of their homes, you'd find many other broken and beat-up things too.
My Mac Mini that I use as a server is positioned where I can't tell which port I'm plugging into with any accuracy without unplugging it, which I don't want to do.
I suppose the joke went over your head, sorry about that. I found it comical given the HDMI connector shape and how it would be nearly impossible to get wrong without being oblivious.
I mean, there already is an integrated element of punishment – if you don't manage to plug in the HDMI connector correctly, you don't get to watch any TV :)
I agree that there was more that could have been done to signal correct orientation.
But an additional wrinkle is that the orientation of the port itself was not static. On laptops and desktops, it was horizontal. On certain other devices, it was vertical. Often these ports were in hard-to-reach or hard-to-see places, and having to reconcile the orientation in each case probably added to the frustration.
Not really, HDMI ports are in weird orientation and mounted in hard-to-reach places as well, but the shape makes it very obvious (even by feel) which way it should (or can't) go. This is much harder with the rectangular USB plug.
I have definitely been frustrated by an hdmi port I couldn't see. I almost always have to gain visibility on the port to get the cable inserted correctly. If I happen to already know the orientation of the port on a device I can do it by feel, but I could also do this with the microusb-b to my old cell phone.
It’s still better than nothing at all! If you can physically see the connector, that’s a huge help. For USB-A, you often had to look into the cable to figure out which way it went.
I agree that HDMI can be heads or tails based on if it’s tucked behind a TV, but sometimes I can feel enough to get it right. That was never so with the original USB.
The "by feel" part is critical. Lots of ports on various devices are deliberately in the back of the device, or otherwise hidden, so that the wiring can be kept neatly out of the way.
I think it's worse than USB. It's so damn narrow and has a requirement for precision. It's hard to know if you've got it upside-down or you're just not well-aligned.
I'll often use my phone and get a picture of the plugs on the device before I dive in with the cable. This is especially useful when working with gear I'd rather not have to move.
When I was a teenager, I had a Windows machine I used for 3D animation and an iPod. I needed to charge the iPod and my mom was calling "dinner!" so I plugged it in quickly and ran off to eat.
When I came back, I realized the Windows machine had a cheap FireWire socket that didn't enforce the orientation. I had plugged in my iPod upside down.
I don't remember what it fried. I don't remember the iPod dying, so I'm guessing that socket never worked again.
Did they not get a mechanical engineer to design the thing? And if not, why not? I wouldn't expect a mechanical engineer to know anything about electrical protocol design, so I wouldn't expect an electrical engineer or software engineer to know anything about designing a mechanical assembly (like a connector housing).
Some part of the design process here was managed very badly.
Hey, they could have made it not rotationally symmetrical but mandated an orientation opposite of the one that makes the socket look like a cute little surprised face, thus ensuring a long, drawn out, losing battle between the correct orientation and the looks-like-a-little-face orientation.
The tl;dw is that there is no standard and never has been, so the idea that there's a "correct" orientation for these outlets that's been warring for dominance with the cute one is a myth.
The rest of the video digs into the claimed benefits of turning them upside down, and finds that they're quite small and probably outweighed by the inconvenience of the sheer number of devices that assume that you're using the cute orientation.
Female (host) USB-A orientation is not always consistent, though, which adds to the confusion. Though the orientation you describe is the most common one, I can recall at least two devices (laptop and desktop) where it was reversed.
I used to "bedazzle" my USB cables. Bedazzling is when you decorate something with little plastic reflective jewel stickers. I always put the jewel in the same side (the "top") so I could feel which way the cable should be oriented. Then I could be confident I had the orientation correct without looking.
I called this USBedazzling. The other folks in the office didn't think it was a funny as I did.
One of my earliest experiences with USB was back in the day where I had a PC without any USB front ports that were free. So I grabbed my USB thumb drive and reached around the back of the PC tower and plugged it in without looking.
USB didn't show up in the O/S. Thought maybe it wasn't formatted correctly and went through some diagnostics. Eventually went for the remove it / plug it back in technique. Had a look at the back of the PC and noticed I'd managed to plug it into an empty Ethernet port.... yeah they're about the same width give or take some tolerance. Also usually placed right next to each other.
Back on topic, I do find those bare/exposed usb keys (like a yubikey) to be quite annoying.
I think people really need to think about USB Type-A in the context of the connectors it was 'competing' with: DE-15 (VGA), DE-9 (RS-232 serial), DB-25 (parallel), PS/2 (keyboard/mouse). These were, by modern standards, complete garbage connectors. Huge. Didn't stay plugged in securely unless the connector had screws. Only went one way.
USB 1 stayed plugged in, required no screws, had higher bandwidth than any data port you'd find on a typical PC, the wires were much thinner, and it could even power low power devices! Yes, a design that doesn't take three tries (my average) to get right would be nice, but it exceeded every existing port by miles. Audio jacks were the only ports that didn't have a direction back then.
Also, those older plugs were more or less not hot-pluggable. The computer had to be off (really off, not standby) before you plugged/unplugged a cable, or you'd risk damaging the port and/or the device.
The only exception I know of is that the Apple Desktop Bus (mini-DIN) had been designed (by Woz!) to be hot-pluggable, but Apple cheaped out on overcurrent protection so it never really was.
Edit: I forgot: in a MIDI connection there is an opto-coupler behind the receiving end, to electrically separate devices.
> Audio jacks were the only ports that didn't have a direction back then.
On computers perhaps. There were plenty of round connectors in other uses though, e.g. Edison screw, Belling-Lee (TV/FM connectors), RCA, BNC, type C and type N RF, TOSLINK, etc.
Most of those wouldn't have been much use for a computer serial port though. Something like TOSLINK, with power and optic fibre, would have been great. Too expensive at the time, I think. But a round TRRS-style jack would have worked and been cheap.
Agree that USB-A was much better than the existing alternatives though.
If you have a reversible connector, then you might be entangled in a dilemma of which way is the "better" or "optimal" or "canonical" way, while with a non-reversible one there is only one way and thus no dilemma.
And in fact USB-C is not really physically reversible, because the hardware detects which way it's plugged in and permutes the signals, as opposed to mirroring all the wires, so there is indeed a "right" way of inserting a USB-C plug, except it's impossible to tell which it is without a dedicated hardware tester.
> the hardware detects which way it's plugged in and permutes the signals
For USB3/4 signals, yes. But the hardware needed to interact with those signals is complex enough that making it also support swapping pins is a minor detail.
The USB2 and power lines, on the other hand, are all present on both sides. This means that simple devices don't need to detect orientation; they can connect the duplicated pins together and everything works.
A USB 2.0 device is usually supposed[1] to just short the two possible positions of each data pin in its USB-C port (the four power pins are always shorted, of course, as are the four ground ones). Orientation sensing only comes into play when you start using the SuperSpeed lanes (for signal interference reasons).
No, there are dedicated high-speed "mux chips" for this[0]. You can get them with integrated redrivers to compensate for the inevitable signal quality degradation, if needed.
Once you get into the USB 4 range you're going to need all four lanes anyways, so you can just do the permutation in software.
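To make the orientation part concrete: the "which way is it in" question is answered by which CC pin ends up connected through the cable. A hedged sketch from the sink's point of view (threshold and voltages here are illustrative, not spec values):

```c
#include <stdio.h>

/* Simplified orientation detection on a sink (device) port: the cable
 * only wires through one CC line, so whichever of CC1/CC2 sees the
 * source's Rp pull-up tells you how the plug is flipped. That result
 * is then used to steer the SuperSpeed mux; the USB 2.0 and power pins
 * are duplicated on both sides and don't care. */

enum orientation { UNATTACHED, NORMAL, FLIPPED };

static enum orientation detect(int mv_cc1, int mv_cc2) {
    const int thresh = 200;                 /* illustrative threshold, mV */
    if (mv_cc1 > thresh && mv_cc2 <= thresh) return NORMAL;
    if (mv_cc2 > thresh && mv_cc1 <= thresh) return FLIPPED;
    return UNATTACHED;
}

int main(void) {
    printf("%d\n", detect(1600, 0));   /* pulled up on CC1 -> NORMAL  (1) */
    printf("%d\n", detect(0, 1600));   /* pulled up on CC2 -> FLIPPED (2) */
    return 0;
}
```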
I'm reluctant to consider shoddy or noncompliant implementations as counting against a technology. If I did, then it'd follow that there's no such thing as a good idea, and I could only agree with that if I were having a bad episode of ennui.
> I'm reluctant to consider shoddy or noncompliant implementations as counting against a technology.
I find it interesting that it is typically done with programming languages. "This programming language allows people to write complicated, unreadable code, hence the language is bad."
In fairness there are plenty of people who think one should not consider it a language flaw if you can shoot yourself in the foot. See: basically every C programmer.
That's my point: where do you put the limit? A gun allows you to shoot yourself in the foot. Does that make it a bad gun? I think most of us would agree here: if you are at high risk of shooting yourself in the foot, you should not be given a gun.
Nobody ever says "oh, this laptop is bad, because if you take it in the bath you will probably break it". But for some reason, for professional programmers, we consider it is absolutely unbearable to request that they know their tool.
Well, my point is that not everybody thinks that it's unbearable to require programmers to be careful with their tool. So I'm not sure if your point really works as a result.
But that said, I personally am one who thinks we should try to do better with programming languages, so I'll give you my $.02 for what little it's worth. The difference between something like a gun being able to blow your foot off, and the C programming language being able to metaphorically blow your foot off, is the complexity of avoiding the failure state. With a gun, even the most beginner gun handler understands that whatever the gun is aimed at will have a bad time when you fire it. They therefore can very easily know the answer to "when I fire this, will I shoot my own foot" is "no, because I'm not aiming at my foot".
With C, it's more like "the gun will shoot your foot if you aim directly at it - but also it'll shoot your foot without aiming at it if it's the third Tuesday of the month, or if Venus is in conjunction with Saturn, or if your aim is tilted exactly 1.4 degrees off target". That is, the ways you can hurt yourself in C are vastly more complicated, and include plenty of situations where even an experienced practitioner may not realize that they're in for a bad time.
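A concrete, innocuous-looking example of that kind of distant foot-gun (what actually happens depends on the compiler and optimization level, which is rather the point):

```c
#include <stdio.h>
#include <limits.h>

/* Looks like a harmless "is x + 100 still in range" check, but signed
 * overflow is undefined behaviour in C, so a compiler is allowed to
 * assume `x + 100 > x` is always true and delete the guard entirely.
 * The same program may "catch" the overflow at one optimization level
 * and not at another - the foot-gun fires far from the trigger. */
static int fits_with_headroom(int x) {
    return x + 100 > x;     /* undefined behaviour when x + 100 overflows INT_MAX */
}

int main(void) {
    printf("%d -> %s\n", INT_MAX,
           fits_with_headroom(INT_MAX) ? "in range (?)" : "overflow caught");
    return 0;
}
```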
I think that is why you see a push to eliminate sharp edges with programming languages that you don't see elsewhere. If it were readily obvious when you're in for some pain with programming languages, people wouldn't object so much.
> Well, my point is that not everybody thinks that it's unbearable to require programmers to be careful with their tool. So I'm not sure if your point really works as a result.
Oh, I had misunderstood it, sorry (I thought it was a criticism of "those weird C people").
> With a gun, even the most beginner gun handler understands that whatever the gun is aimed at will have a bad time when you fire it. They therefore can very easily know the answer to "when I fire this, will I shoot my own foot" is "no, because I'm not aiming at my foot".
I tend to disagree: usually people do spend some time learning how to use their gun before we give them the responsibility of carrying one. There are many rules to follow, and still there are accidents (even from professionals). I wouldn't say that a gun is "fool-proof". I wouldn't give an assault rifle to somebody, let them fire a few rounds, and then bring the rifle in my car while I'm driving without checking anything. Actually I would pay attention to everything they do and I would check that the gun is unloaded myself.
Sure, getting a loaded gun, removing the safety and firing is easy. Carrying and using a gun correctly is not. To compare that to C, copy-pasting a `main` function and running it is easy. Writing good C code is not. But writing good Rust or Kotlin code is not easy either. That's what it takes to be a professional: you have to learn to use your tools.
Given those microcontrollers are 0.05 USD at worst, meh, though I acknowledge it sounds a bit crazy if you don’t recognize how cheap micros can be nowadays. Laptop chargers have used similar schemes (perhaps with bare serial EEPROMs instead of MCUs) since forever, FWIW.
The absurdity to me is not the cost, but the complexity. Our (or at least my own) intuition of an extension cord is as a dumb cord just relaying signals, but they are more complex than that, now.
I can swear I've experienced that: when I plugged my USB-C into the docking station the wrong way around, I had some crazy issues, e.g. related to Bluetooth. I tried restarts, adapter unplugs, etc.
But when I unplugged the cable and reversed it, the issue disappeared.
I've seen people try to connect ps/2 and vga cables upside down, and mash all the pins in the process. I'm fine with flipping a USB cable over a couple times until it goes in. /shrug
Back in the days when 200 MB was a big drive for a personal computer, IBM was working on a 1 GB drive. I worked at a place that was doing firmware for a company that was going to use those 1 GB drives in their product, and that company had access to some of the earliest test units. We were writing the firmware for that product, so IBM allowed them to loan us a couple of the drives.
I once put one of the drives into my test system, and plugged in the power connector. It used a standard Molex connector [1].
The way the drive was mounted I couldn't actually see the plug or socket. It's a keyed connector so that should not be a problem. Yet somehow I managed to plug it in upside down which fried the drive.
Afterwards I did some tests and found that the plastic on the connector on the drive was very soft. If you tried to plug it in the wrong way, the parts that were supposed to get in the way due to the keying would just deform out of the way.
It took more force to insert it the wrong way, but not more than was often needed when you happened across a plug and socket combination that were a tight fit so that didn't tip you off.
I reported that to the person who had loaned us the drive and he told me he'd fried two of them that way. He said that when he told IBM about that his contact there said that they were losing something like 10% of the drives during testing right after manufacturing due to their technicians getting it backward, and the parts list had already been revised to switch to a connector hard enough for the keying to actually work.
You can also insert the USB-a connector upside down, if you use enough force. It won’t really break anything but it will bend the plastic part so it becomes hard to insert the right way.
I just touch the connector to find the void. It always goes down and this trick never failed me. I have no problem plugging in USB on the first try even in the dark.
> Someday we'll look back and laugh (or cry) at our early USB struggles.
Hah, sounds like the editor is someone too young to have used serial/parallel/VGA ports. Early business PCs, for whatever reason[1], were built like tanks. They weighed dozens of kgs/pounds. And the connectors were often screwed in with a screwdriver for good measure.
Yes, you could wing it when being sloppy, but they weren't always very tight and cables heavy so you'd want them screwed in for a long-term installation to avoid issues.
When USB came out it was a revelation that you just needed to pop it in and were done. Yes, you'd have to look at it first, but that was the same with every other port of before that. I remember PS/2 being perhaps hardest to line up correctly.
So USB 1.0 fixed one problem, but not every problem. Not exactly a reason to cry.
[1] Probably inertia from main/minicomputers which were serious installations and needed to keep running through wartime. :-/
From the article: “Bhatt's idea for the USB was inspired by his own experience as a user dealing with tech frustrations far beyond the scope of a get-it-wrong-the-first-time cable.”
Exactly what made USB so awesome when it first came out. I guess you had to be there.
D-sub and the like were a lot worse. USB is robust and doesn't need things like screws.
DIN and mini-DIN also have similar issues. Compared to either family, USB is clearly a more usable design.
>But in an effort to keep it as cheap as possible, the decision was made to go with a design that, in theory, would give users a 50/50 chance of plugging it in correctly (you can up the odds by looking at the inside first, or identifying the logo).
I despise this "design decision." Literally bakes in the 50% of the time you plug it in the wrong orientation and lose seconds of your life. Who cares if the sands of time are moving more rapidly than asteroids? We can save money and make more of these badboys to fill the landfills with once they're obsolete. Hard to argue that logic that clearly won via market dynamics. But where are we now?
All I can say is that before USB 1.0, the myriad of connections on a PC motherboard was ridiculous. USB was leaps and bounds better than PS/2, and VGA and the various serial and parallel ports required you to physically screw in the plug. Just look at: https://www.electronicshub.org/wp-content/uploads/2016/01/Po...
It was a nightmare. And virtually none of those ports would self-install drivers when a device was plugged into it. USB 1.0 was magic when it came out.
Agreed. Everyone forgets how awful it was and even though USB wasn't perfect, it was a huge improvement over the status quo. For that, I can forgive the inventors for making a mistake. Nobody's perfect. The travesty is that it took so long to correct the issue.
If they were going to do that they at least could have used a D shaped connector or something so the orientation of both the plug and the port are obvious, even in bad lighting / from an awkward angle.
I think people underestimate how cheaply products are made.
> Making USB reversible to begin with would have necessitated twice as many wires and twice as many circuits, and would have doubled the cost.
Adding more wires, even just a few, is not the cheapest option and therefore not the one that wins. The relative convenience of a feature is always trumped in the early days by cost-to-produce.
USB-C recently introducing a symmetric connector does not imply that manufacturers will implement it properly... i.e. you can end up with a symmetric connector that performs differently based on how it was plugged in.
> But in an effort to keep it as cheap as possible, the decision was made to go with a design that, in theory, would give users a 50/50 chance of plugging it in correctly
They overlooked the statistically significant case where it takes more than two tries to plug it in correctly. Knowing you might have been wrong makes you prematurely abandon an attempt where you had it set the right way.
I don't really get why people hate USB-A so much. I almost always plug my USB-A in the first try, because most of my devices have a clear up/down (e.g. a laptop or a docking station), and the plastic part always goes down.
Of course the vertical ports behind a machine are a bit harder to access, but... well anyway they are. And usually those are not the ones I unplug/replug often.
They didn't have to double the wiring, just add wiring to the other side of the connector. That is barely 2 cm of wiring, at most. Also, manufacturers could have made the connectors in a way that would clearly show which side is the top and which is the bottom. So none of the issues had to happen but lazy manufacturers and gullible inventors caused all of this on their own.
I, genuinely, have a USB-C cable which will only charge my phone in one orientation.
That's worse than Type-A, since I can't even look at the end to see which side has the plastic bar, I just have to try it and obviously it fits, but see if it charges. Which means it takes about 94 attempts because I only realise the other end's not plugged in, or it's turned off at the wall, on the 92nd.
I have an A-C cable which has a reversible Type-A plug (basically, it looks like a huge ugly Lightning plug) and charges both ways, but data only works one way. The first time I wanted to use it for CarPlay I lost some hair.
I bought a JBL headset whose USB power cable is reversible: instead of a big plastic piece filling half the plug, it has a thin strip of plastic, so the plastic block on the port side can go on either side of the plug: https://images.nexusapp.co/assets/4a/75/a0/5733575.jpg
But I guess it might just have two wires (+5V and ground) and no data wires (I can't check it now because I'm at work).
I had a USB cable with a reversible A plug, but after a few months/years it would stop charging (because the pins either wouldn't make contact with the A socket, or the wires snapped off the pins).
If we're not going back to circular slip-ring connectors that allow rotation (like TRS or DC plugs,) then the step after the +-90-reversible USB-C is a triangular one. So you only need to rotate it +-30 if you got it wrong. Now I really want a triangular connector. Like in alien-tech movies.
This is not so crazy - this was exactly the progression in screw drives. First came slotted screws (2-fold rotational symmetry), then Phillips/Pozidriv/Robertson (4-fold), and now Torx (6-fold). Going back to slotted now is actually irritating.
Of course the constraints and trade-offs are very different... still it would be a piece of cake to plug them in on the backside of a box with your eyes closed.
Nothing to me, for the same reason you gave. I'd love to see them making a comeback.
My guess is they won't come back because modern connectors are so thin that if you made them equally narrow, they'd break/bend easily. (And yes, 3.5mm really is too thick to be a standard size, e.g. in wearables. The outer thickness of the USB-C socket is 3.3mm.)
Hear me out. The original USB only had 4 pins. I always thought a round ringed TRS style plug like headphones use would have been great. No way to orient it incorrectly.
I think the problem these connectors have is that each ring has to slide over the other contacts to get to its correct location.
USB Type-A has longer contacts for pins 1 and 4, the power pins, so they contact first and remain contacted before the data pins make contact. You can see that in this picture https://www.electroschematics.com/usb-how-things-work/.
That's not going to save you when you try to plug in the device-side connector with the computer-side already connected (and therefore already applying power to the cable).
One problem with HDMI is that it has no clips or screws or similar holding it in, unlike DP, DVI, VGA, BNC... I remember I used to knock the HDMI out of one of my monitors with my foot all the time. It plugged in going straight up and barely could resist a bit more force than gravity, and the cable dangled behind the desk since the tower was on the floor.
Yes, I hear this a lot, and it's not like it happens to me all the time these days, but it's a clear flaw in the design that there's nothing extra to hold it in like those other things.
Because when you have to plug hdmi on a wall mounted tv or monitor that is close to wall and hard to rotate - you end up not only guessing orientation but having a hard time to find that port... :)
In the same vein, a USB-A receptacle has an opening barely wider than a headphone jack – there have been a few times when, without looking, I've plugged a headphone plug into an adjacent USB receptacle. It happened surprisingly often when MacBooks had USB-A.
As annoying as this has been when it happened to me, I also have wondered if they did it on purpose to make it easier for assemblers. If the tooling needed for USB, Ethernet, and eSATA are all the same size you can save cost through reuse.
I think there is value in remembering that this was an era where peripherals were plugged in with a view of semi-permanence, most plugs even had screws. I also remember thinking at the time how much easier USB was to use than trying to align a PS/2 connector.
Despite the obvious design drawback, I can't blame Intel entirely; a big contributor to the problem was that the ports were still being positioned in hard-to-access areas by computer makers. The first iMac design was a bit more forward-thinking in this regard: USB ports were prominently positioned on either side of the keyboard, which made it very easy to plug in the USB mouse and use the other port for something like a USB stick. The problem with the Type-A design isn't just the orientation issue; the rectangular port design provides very little tolerance for off-angle insertion, and having the port in full view helped a lot with insertion.
> Making USB reversible to begin with would have necessitated twice as many wires and twice as many circuits, and would have doubled the cost
Not really. I've got a cable that is USB 2.0 Type-A on one side, Micro-B on the other, and both ends are perfectly reversible. It's been many years since I bought it and it still serves me reliably for both charging and data. The manufacturer didn't even have to change the standard to produce a reversible cable that works great with all the old devices whose manufacturers never heard that reversible USB cables exist.
Rubbish. These are just excuses for terrible & thoughtless design.
USB is by far the worst designed thing in the history of humankind, in terms of the cumulative amount of frustration it has caused globally.
All common-sense usability principles were ignored, and these are not new principles. Take the old RS-232C or RJ45 jacks: you just looked at one and you knew which way to plug it in. Hell, you could do it blindfolded, by feel.
Anyone bitching about USB does not remember the horrors of serial ports. My favorite cables were the ones where the connector housing metal was so soft it was actually indeed possible to plug them in upside down and bend all your pins. Spent lots of time in the 90s with tweezers straightening pins.
All he had to do was make one side of the connector rounded. But still a vast improvement.
I used to struggle with USB-A plugs until I made it a point to actually look and pay attention. 99% of all plugs are labeled and logo side faces up. For devices that have a natural orientation (e.g. laptop), it always goes one way. For mating two cables, just look at where the male PCB in one is.
Turns out that plugging in USB-A cables is a skill you can learn in 5 seconds.
> It was in 1998 that USB made some real headway, courtesy of the iMac G3, the first computer to ship with only USB ports for external devices (there were no serial or parallel ports).
Um, no? The iMac had an Ethernet port, a phone jack for an internal modem, and TWO FIREWIRE PORTS. And Microphone and Speaker jacks.
The Original Bondi Blue 233 MHz iMac Rev A (tray-loading CD-ROM) had:
- 2x USB
- 1x 10/100 Ethernet
- 1x phone jack
- Infrared port
- 1x audio input
- 1x audio output†
- 2x headphone ports
†I thought it had two front headphone ports so that two students could share a computer in school labs, but googling for that suggests it was probably a later revision feature.
It is genuinely surprising that out of all companies Apple was the one to pioneer USB usage.
Quite the stark contrast with their current modus operandi of always making something custom so it can be as incompatible as possible with anything non-Apple (and so they can sell more dongles). Well until they were dragged kicking and screaming to USB-C by the EU anyway.
It was really only the iPhone that lagged behind, and that’s because they switched to Lightning just a couple of years before USB-C was finalized, pissing off everyone who had invested in the 30-pin connector ecosystem.
That always annoyed me. I get that people had to change cables for the first time since they got their first iPod, but lightning was such a MASSIVE improvement over the 30 pin dock connector… I can’t imagine anyone would want to go back.
I was really expecting something similar with the USB-C switch but it doesn’t seem to have happened.
I don't know if this is common, but I've had an issue with two different devices with the newer reversible USB connectors where they would only charge if I plugged the cable in one way and not the other. I would plug in my phone and have to check the screen to see if it was charging; if not, I had to unplug, turn the connector around, and plug it back in the other way. I remember longing for the older micro USB where the right orientation was obvious. Obviously my phone's port must have been damaged; but since this happened to me with two different phones, I suspect it may be related to the fragility of the new ports due to smaller wires or something... Anyone else had this issue?
Yep, probably damage to either the connector on the cable, or the port on the device. Does the connector also feel a bit looser when you plug it in the non-working way? (Not required, but it's a dead giveaway.)
> you can up the odds by looking at the inside first, or identifying the logo
No, this really only increases your frustration once you find out that some USB manufacturers can't tell their head from their *** when it comes to making the connector.
I was always hoping for a 3.5mm jack for a connector.
Radially symmetric, it can go in any direction. I've heard arguments this could cause connection issues since it could rotate while connected, causing small contact misses, but I think that could be designed around.
USB A to Micro USB cables with reversible connectors on both ends have been available for a few years - much too late, but still useful. I bought some on AliExpress, and while probably violating some specs, they have been working nicely for me so far.
Reversible is overrated. What I always liked were genderless connectors. Actual engineers probably don't like them, because making a long cable by chaining a bunch of short cables together is almost always a bad idea, so they are very rare. The only two I have ever seen in the wild were the IBM bus-and-tag cables (a strong contender for the most manly data connector I have ever seen) and some unknown telephone headset cord with a seriously over-engineered connector.
Backwards rationalization / cognitive dissonance for a poor engineering tradeoff.
The cost of zillions of the same panel mount connector will converge to dirt cheap. Although cents matter, user experience matters more if everyone hates your craptastic cheap design. Thankfully, Apple came along and gave the world USB-C which simply increased the bandwidth and current with the convenience of rotational symmetry. A USB-C panel-mount connector costs $0.492 in bulk. Fancy rugged and waterproof ones are $2.18. Micro USB B costs $0.45 in bulk. 10% increase in cost: big whoop for everything except Arduinos and portable kids' night lights.
Looking at the logo on the plug is enough to get it right at the first attempt, but only for horizontal ports on a laptop or desktop. For vertical ports or ports on the top (chargers have both types) it's not so easy.
Ajay Bhatt: Good question. We had looked at it, but the whole goal here was to make it very inexpensive, and at that point, we were trying to solve all the USB problems with two wires. At that point, if you added wires to make things flippable, you have to add wires, and you also have to add a lot of silicon. Wires and pins cost real money, so we decided to keep it as cheap as possible. With serial port and parallel port, there were versions that were 25 pins and 36 pins and so on and so forth. The cables were very thick and expensive. We were trying to solve all the problems. We went in favor of fewer wires. In hindsight, a flippable connector would have been better.
Our goal was to say that this interface should be such that it should work on a mouse and it should also work on a high-end printer or on digital cameras. That’s what we were looking at, the range of products. At one end, we wanted it simple enough, so there could be very low costs. At the other end, we wanted to make sure that it could be scaled and, just as we speak today, we’re running the USB at tens of gigs. The original one was running at 12 megs. We’ve come a long way in scaling.
That's of course nonsense, as others have pointed out, but I still decry the insane electrical implementation: bidirectional differential[1] pairs. Having a dedicated pair in each direction instead (like FireWire) is so much simpler and leads to much better performance. Instead we got an ever more convoluted and complicated USB, which has long surpassed FireWire in the cost of cables etc.
[1] of course USB isn't always truly differential which is part of the problem.
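To put rough numbers on that argument, here's a toy model (all figures below are illustrative assumptions, nothing measured) of how bus turnaround eats into the effective throughput of a single time-shared pair compared with a dedicated pair per direction:

```python
# Toy back-of-the-envelope model: effective throughput of a half-duplex,
# polled bus (one shared pair, USB 1.x-style) versus a dedicated pair per
# direction (FireWire-style). All numbers are illustrative assumptions.

LINE_RATE_BPS = 12_000_000      # raw signalling rate of one pair (12 Mbps)
PAYLOAD_BITS = 64 * 8           # assumed payload per transaction (64 bytes)
OVERHEAD_BITS = (3 + 2) * 8     # assumed token + handshake framing per transaction
TURNAROUND_S = 2e-6             # assumed bus turnaround per direction change

def half_duplex_mbps():
    """Shared pair: token, turnaround, data, turnaround, handshake."""
    t = (PAYLOAD_BITS + OVERHEAD_BITS) / LINE_RATE_BPS + 2 * TURNAROUND_S
    return PAYLOAD_BITS / t / 1e6

def full_duplex_mbps():
    """Dedicated pair per direction: no turnaround on the data path.
    (Dedicated pairs also allow simultaneous traffic both ways, which
    this simple model doesn't even count.)"""
    t = (PAYLOAD_BITS + OVERHEAD_BITS) / LINE_RATE_BPS
    return PAYLOAD_BITS / t / 1e6

print(f"half-duplex polled pair: {half_duplex_mbps():.2f} Mbps effective")
print(f"dedicated pairs:         {full_duplex_mbps():.2f} Mbps effective")
```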
More interesting question for me is what exactly made USB possible? I understand that parallel communication wasn't possible because even for small differences in wire length, data on each wire will arrive at different times and that limits speed. So, people went with serial communication. But what exactly enabled high speed bit transfer? Why wasn't it possible before? Similarly, why USB2 speeds weren't possible before? What technical advances made it possible?
Well, high-speed serial comms existed before USB, so the premise of the question is a bit wrong. RS-422 officially supports 10 Mbps over short distances, for example, and various serial WAN protocols supported >100 Mbps over copper before USB launched.
I would argue that what USB (1.1-2.0) does differently from previous serial peripheral ports is mostly software and standardization, and really relate to making it cheap and simple for "normal people" to use:
1. USB 1.1 supports only two bit rates (1.5 and 12 Mbps), which is autoconfigured before device enumeration. 12 = 8 * 1.5, so the clock divider is cheap and easy.
2. USB limits cables to fairly short lengths (<=5m) compared to earlier serial ports (RS-422 supports 10 Mbps at 15m)
3. USB (pre-OTG extensions) rigorously enforced the idea of a "host" (upstream) and "device" (downstream), at both the protocol and physical connector level (which greatly simplifies things--for example, you can't create a loop, and don't need STP to detect it). A child can easily see that a B (device) socket doesn't fit an A (host) plug.
4. USB device enumeration and configuration are extensively software based, and USB defines a number of standard device classes so that many common types of devices (e.g., keyboards, mice) don't need specific drivers.
5. One thing that USB does that's less common among serial peripheral interfaces is to use a single differential pair for data going in both directions (it's time-shared: so the host polls the device then the device responds). The pin-count is less than a "regular" RS-422 port, but you still get the advantages of differential signaling.
6. USB carries power with only one more pin than needed to create a minimal bidirectional serial data connection, so "lite" devices don't need a separate power connector (ignoring very slow protocols like "1-wire").
7. USB does some funny things with packet framing (like NRZI encoding and bit-stuffing) and some things to help reduce device cost (like the JKJKJK packet preamble to sync the device baud-rate generator and using SE1/SE0 states for device disconnect and bus reset signaling), but none of that is really fundamental to making a 10Mbps-class serial interface.
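Since point 7 mentions NRZI and bit-stuffing, here's a minimal sketch of that encoding step (simplified: real USB also prepends a SYNC pattern and ends packets with an EOP, which this ignores):

```python
# USB 1.x/2.0 low/full-speed line coding, simplified:
# 1) bit-stuffing: insert a 0 after every six consecutive 1s, so the
#    line never goes too long without a transition;
# 2) NRZI: a 0 is sent as a level transition, a 1 as "no transition".

def bit_stuff(bits):
    """Insert a 0 after every run of six consecutive 1s."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 6:
            out.append(0)   # forced transition keeps the receiver clock locked
            run = 0
    return out

def nrzi_encode(bits, level=1):
    """NRZI: toggle the line level for a 0, hold it for a 1."""
    out = []
    for b in bits:
        if b == 0:
            level ^= 1
        out.append(level)
    return out

data = [1] * 8 + [0, 1, 0]           # eight consecutive 1s force one stuffed bit
stuffed = bit_stuff(data)
print("stuffed:", stuffed)           # a 0 appears after the sixth 1
print("on wire:", nrzi_encode(stuffed))
```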
I used to know one of the people involved in the spec at Intel and he long claimed to regret they couldn’t find a (presumably politically acceptable) way to make it reversible.
My take: round connectors are the best connectors — checking the plug orientation is a waste of bloody time. Every time you plug something in you waste a few seconds checking the orientation and flipping the plug over. Even reversible plugs are inferior to round ones.
USB-A has only 4 wires: they could have easily used a co-axial connector. Hell, they could have used a 3.5 mm TRRS phono cable and it would have been better.
Seriously? An overcomplicated protocol, made deliberately hard to bit-bang so you pay the USB-IF for every chip and every logo. There was already 100 Mbit Ethernet at that time; just add power wires and use the existing network stack in software, but then how would they squeeze money out of it?! Of course they needed something new and proprietary, overcomplicated, with terrible reliability. USB is the curse of modern hardware.
> Sometime later, I also learned that "three" is usually the magic number for correctly plugging in a USB Type-A device. It's a maddening dance and it begs the question, why wasn't the Universal Serial Bus designed with a reversible connector from the outset?
Honestly when this is the biggest problem you face on a daily basis, your life must be relatively easy.
Seems like it always took me 3 steps to plug in a usb-a cable.
1. Doesn't easily insert so assume I've got it upside down. Flip.
2. Still doesn't easily insert so inspect, and realize it had it right the first time.
3. Insert for real this time.
I don't miss usb-a, but it was a vast improvement over the rs-232 and parallel port cables that preceded it.
As an added bit of genius, it’s exactly the right width to fit snugly into an Ethernet port. (Which most laptops keep right next to the usb ports)
So on the rare occasion that you get it right by feel the first try, you’re not actually plugged in to anything and get to spend 10 minutes debugging your printer before noticing.
I just realized, all this time, I should have known which way the USB ports were on the back of the PC.
All I had to do was visualize the motherboard inside the case and I'd immediately and easily know which way to plug the cables in without looking.
Fun fact: the Nokia 2780 Flip, despite using USB-C, only accepts the cable in one orientation. Put a cable in upside down, and nothing happens. Confirmed with the charger it comes with and one C–C cable that I already possessed.
(Additional fun fact: the box says it uses Micro-USB.)
There is a piece of arcane knowledge that you should hold your USB cable with the logo facing up. Unfortunately, lots of cable and even device manufacturers are just not aware of it. It also gets quirky when the socket is vertically oriented.
If the plug actually complies with the spec, and has the metal shroud as it is supposed to, it works fine, and will not go in backwards. It's making USB connectors which are just a piece of PC board that's the problem. Looking at you, Yubikey.
I thought the tongue should have a little bump on the right or left, so if you're upside down you can tell straight away because the connector will want to angle over, rather than wiggling and wondering and doing the standard 3 tries thing.
Making the part that is PCB on cheap USB sticks protrude a little from the metal rectangle? Would have required slightly deeper sockets, but I like the idea, would have saved some frustration.
My USB hub will allow me to plug in a USB-A connector the wrong way. When I do, it has some sort of fault and resets all the ports. I don't know how this is physically possible, but it's somehow worse than requiring 3 attempts.
USB was a scam from Intel (and Microsoft) to put the PC at center of "your digital life". They both feared that Firewire would be able to send digital streams around and no PC would be necessary.
Has no one told the one about the USB guy's funeral?
They put him in the ground, but realized he was the wrong way 'round so they pulled him out and put him in the other way.
And from such heroic beginnings, they've become a despicable cartel that shakes down small businesses for thousands of dollars by selling artificially scarce numerals.
my ILEC camera, PS5 game controller, iPhone, laptop, work laptop donglebook, all charge from the same cord and power block when they need to
my portable speaker is still microusb on one side and usb 2.0 or whatever nomenclature on the other side
airpods are primarily inductive charging, but it has a lightning cable port, along with my apple keyboard and mouse when I need to connect them to a different computer and go wireless again
Reversibility for USB 2.0 data signals is mechanical: there are data pins on both sides, and only one set of them gets connected; the socket side has both sets wired together.
There is only one more wire in a USB-C 2.0 cable: it is used to signal orientation and whether an end is a power source, a sink, or a headphone adaptor.
USB-3 signals are more complicated though. There can be up to two bidirectional high-speed channels. The aforementioned sensing pin is used to figure out if those channels need to be swapped or not.
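As a rough illustration of that sensing (a simplified sketch, not the actual Type-C state machine; the voltage thresholds are assumed for illustration):

```python
# Rough sketch of how a USB-C host/source might infer plug orientation:
# the cable wires only one CC line through to the far end, so only one of
# CC1/CC2 on the receptacle sees the sink's Rd pull-down. Thresholds below
# are illustrative assumptions, not the spec values.

from enum import Enum

class Orientation(Enum):
    UNATTACHED = 0
    NORMAL = 1      # CC1 pulled down -> "unflipped"
    FLIPPED = 2     # CC2 pulled down -> plug inserted the other way up

RD_MIN_V, RD_MAX_V = 0.25, 1.50   # assumed detection window for an Rd pull-down

def detect_orientation(cc1_volts: float, cc2_volts: float) -> Orientation:
    cc1_rd = RD_MIN_V <= cc1_volts <= RD_MAX_V
    cc2_rd = RD_MIN_V <= cc2_volts <= RD_MAX_V
    if cc1_rd and not cc2_rd:
        return Orientation.NORMAL
    if cc2_rd and not cc1_rd:
        return Orientation.FLIPPED
    return Orientation.UNATTACHED

# Once orientation is known, a USB 3.x port routes the correct SuperSpeed
# lanes through a mux; the USB 2.0 D+/D- pins need no switching because the
# receptacle has them on both sides.
print(detect_orientation(0.6, 1.7))   # Orientation.NORMAL
print(detect_orientation(1.7, 0.6))   # Orientation.FLIPPED
```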
From other comments it sounds like no. Instead, every port must detect the orientation and switch to using the correct lines in software. IIUC, each cable also needs a small IC to assist with this.
Here is an example of a reversible USB 2.0 Type-A plug: https://www.amazon.com/Tripp-Lite-Universal-Reversible-UR050...
No doubling of wires or circuits required, just a thin double-sided PCB.
Was the connector form-factor inherited from an earlier project and the players didn't want to design a new one?