The thing that has been bothering me for a while is that the USB spec allows for software detection of capabilities. You can read the eMarker data and see the supported protocols, speeds, voltages, etc.
But there is no standard for USB controllers to present this data to the OS. So it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
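For context, the eMarker data is just a handful of 32-bit VDOs (Vendor Defined Objects), so decoding the interesting fields would be trivial if the OS ever saw them. Here's a minimal sketch of decoding two fields from a passive cable VDO — the bit positions follow the USB PD passive-cable VDO layout but should be treated as illustrative, since they differ between PD spec revisions:

```python
def decode_passive_cable_vdo(vdo: int) -> dict:
    """Decode two fields from a USB PD passive cable eMarker VDO.

    Bit positions are illustrative (roughly the PD R3.x passive cable
    VDO layout); they vary between PD spec revisions.
    """
    # Lowest bits: highest supported USB signaling rate
    speeds = {
        0b00: "USB 2.0 only",
        0b01: "USB 3.2 Gen1 (5 Gbps)",
        0b10: "USB 3.2 Gen2 (10 Gbps)",
        0b11: "USB4 Gen3 (40 Gbps)",
    }
    # Bits 6:5: VBUS current-handling capability
    currents = {
        0b01: "3 A",
        0b10: "5 A",
    }
    return {
        "max_speed": speeds.get(vdo & 0b11, "reserved"),
        "vbus_current": currents.get((vdo >> 5) & 0b11, "reserved/default"),
    }

# Example: a cable advertising 5 Gbps data and 5 A VBUS
info = decode_passive_cable_vdo((0b10 << 5) | 0b01)
```

Linux actually does expose the raw identity VDOs under `/sys/class/typec/` on recent kernels when the port controller driver supports it, so the plumbing partly exists; what's missing is anything in the UI layer that consumes it.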
Apple seems best able to do this since they control the hardware and OS, yet they aren’t doing it either. Users are just left to be confused about why things are slow.
Perhaps someday it will earn the same level of importance as charging; iOS 26 calls out slow chargers on their iPhones, so you can run to the Apple Store and buy a fast one!
They probably have to weigh potential new hardware sales against added complexity. I have counterpoints too, but I believe they try to protect users' mental models of their ecosystem (which I appreciate when I don't notice it, and can't stand when something is uncustomizable). There are enough variables they don't trust us with as it is.
> In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
I'm pretty sure my old Dell XPS laptop with Windows 10 had pop-ups just like this.
"This device can run faster" or something.
AFAIK that's just when plugging in a USB 3 device into a USB 2 port or using a USB 2 cable.
> that's just when plugging in a USB 3 device into a USB 2 port
Dell XPS laptops (and some others) can also warn if the charger isn't providing the full wattage the laptop is rated for. This warning is an option that can be turned off in the BIOS settings.
I usually turn it off because I sometimes intentionally do day trips with a smaller/lighter portable charger that delivers 45w, while my laptop can need up to 65w due to having a discrete GPU. However, 45w is more than sufficient to charge the laptop during normal use on the Balanced power plan with the iGPU. I only need more than 45w when gaming with the discrete GPU active.
Just this morning, my old Latitude failed to boot with a “this charger is only giving 20W and that’s not enough to boot this laptop” error. (I was testing a new USB-C charger that’s obviously going back.)
The weirdest part was that it was 100% charged, so it could have booted with 0 watts from the charger, yet it refused to boot with 20 watts more.
Oh, refusing to boot at all is evil. I've never seen that.
Sure, you or I would just unplug the charger and run on battery, but bad UX decisions like that generate a support call to me from my 95-year-old mom. It should not only warn and continue to boot, it should use whatever power is on offer to reduce the rate of battery drain.
My wife's work laptop gives this stupid warning any time any USB-C charger other than the Dell brick is plugged in. So even a dock delivering 100w gets a complaint. The Dell brick offers non-standard charging at 140w, which can't be replaced by smaller, standards-compliant chargers.
I wonder if it's possible for a regular machine with two high speed ports to do a cable test by itself. Maybe it can't test all the attributes but could it at least verify speed claims in software?
Apparently the USB driver stack doesn't report the cable's eMarker chip data back to the OS. However benchmarking actual transfer throughput is the ultimate test for data connections (vs charging use cases). Unfortunately, TFA doesn't really go into this aspect of cable testing as the tester seems to only report eMarker data, which pins are connected and copper resistance.
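That said, one thing the OS does expose is the negotiated link speed per device, which at least catches a link that trained down to USB 2. A minimal sketch for Linux (the sysfs path is Linux-specific; on other OSes you'd have to query the platform's USB stack instead):

```python
from pathlib import Path


def usb_link_speeds(sysfs_root: str = "/sys/bus/usb/devices") -> dict:
    """Report the negotiated speed (in Mbps) of each connected USB device.

    Reads the Linux sysfs 'speed' attribute, which reflects the link
    that was actually trained -- not what the device or cable could do.
    """
    speeds = {}
    for dev in Path(sysfs_root).glob("*"):
        speed_file = dev / "speed"
        if speed_file.is_file():
            speeds[dev.name] = float(speed_file.read_text().strip())
    return speeds


# Example: {'1-1': 480.0} would mean device 1-1 trained at USB 2.0 speed
```

A device reporting 480 when you expected 5000+ is a strong hint that the cable (or port) is the bottleneck, even though it doesn't tell you which.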
Since a >$1,000 automated lab cable throughput tester is overkill, my thumbnail test for high-speed USB-C data cables is to run a disk speed benchmark to a very fast, well-characterized external NVMe enclosure with a known-fast NVMe drive. I know what the throughput should be based on prior tests with an $80 active 1M Thunderbolt cable made for high-end USB-C docks and confirmed by online benchmark reviews from credible sources.
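That thumbnail test can be sketched in a few lines — a rough illustration only, since this goes through the OS page cache (for trustworthy numbers use a dedicated tool like fio with direct I/O), and the path and sizes are placeholders:

```python
import os
import time


def sequential_write_mbps(path: str, size_mb: int = 256, chunk_mb: int = 4) -> float:
    """Rough sequential-write throughput to `path`, in MB/s.

    A thumbnail test only: writes go through the page cache, so fsync
    is used to force data to the device before stopping the clock.
    """
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed
```

Run it against a file on the external enclosure and compare to the number you got with a known-good cable; a big gap points at the cable or the negotiated link, not the drive.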
There would be too many factors involved for a proper test. Many laptop USB controllers would probably not even have the capacity to run two ports at full speed simultaneously.
Even Apple now has one of those, when you plug something into the USB 2 port on the MacBook Neo.
There’s still nothing when you plug a USB 3 device in using a USB 2 cable.
I strongly suspected my old XPS had nonstandard things going on with its USB-C charger.
> But there is no standard for USB controllers to present this data to the OS. So it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
There is. I used to use a KVM with USB 2 ports connected to my PC's USB 3 port, to which I connected a monitor with integrated USB 3 hub to drive my keyboard and mouse. Windows would show a popup every time telling me that I should use a faster cable.
There are also popups telling me that my laptop is connected to a "slow" USB-C charger.
That’s quite a simplistic one, unfortunately: USB 2 and 3 use different controllers in the PC, which it can indeed detect. The sub-flavours of 3/4, less so.
I've used all manner of archaic USB cables for data transfer when in a pinch, and Windows has never shown me anything at all. Could the external device you were connecting have triggered the Windows notification?
I have seen these kinds of notifications on occasion but they are far from the norm.
On iPhone, when connecting an external MIDI device via USB, the phone told me that the device was drawing too much power and would be disabled.
I don’t know if they check that via USB protocol, or if they are measuring the actual power draw on the USB port.
In order to use the device, I had to connect it via an externally powered USB hub.
I suspect most users do not even realise things are slow.
Oh, they very much do. But like with everything in technology, they can do fuck all about it, so they resign and maybe complain to you occasionally if you're the designated (in)voluntary tech support person for your family and friends.
Regular people hate technology, both for how magical and how badly broken it is, but they've long learned they're powerless to change it - nobody listens to their complaints, and the whole market is supply-driven, i.e. you get to choose from what vendors graciously put on the market, not from what the space of possible devices.
They also tend to hate technology because we nerds are often unbearable.
They hate having to go through people that get them upset, in order to use their kit.
Not just tech (although it’s more prevalent). People who are “handy” can also be that way (but, for some reason, techies tend to be more abrasive).
I’ve learned the utility of being patient, and not showing the exasperation that is often boiling inside of me.
Amen. I couldn’t have said it better.
In general, for the 40+ years I’ve been a programmer, I have detested the practice of not surfacing diagnostic information to users when technology makes it possible to do so in a clear and unambiguous way.
Most users tend to ignore diagnostic information.
"What did the error message say"
"I don't know."
This is because error messages have historically been bad, unintelligible, un-actionable, and hard to separate from soft errors that don't actually matter.
'Segmentation fault. Core dumped.'
'Non-fatal error detected. Contact support.'
'An error occurred.'
'An illegal operation was performed.'
'Error 92: Insufficient marmalade.'
'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?'
'Saving as .docx is not recommended because blah-blah-blah never gonna give you up nor let you down.'
I can't blame any normal user for not understanding or not giving a shit about any of these. If we'd given users actionable information from day 1, we'd be in a very different world. Even just 'Error 852: Couldn't reach the network. Check your connection to the internet.' helps those who haven't turned off their brains entirely yet.
I had a programmer pushing multi-gig packages to a Meta Quest 3, and it was taking around a minute. He didn’t even think it could be faster; he assumed the Quest or the software was slow and didn’t check.
I implored him to try a different cable (after checking cables with the Treedix mentioned in TFA), and the copy went from taking over a minute to about 13s.
It’s not just normal people who are confused.
I find some programmers (and this is presumably true of any industry) very narrow in their expertise within technology.
I think you are right, but I think what I said is also true.
People will notice some things. For example, with USB if they are using it for local backup they might notice, but with a lot of devices they will not. When they do notice, they will feel powerless.
Even if we had a wider choice, they are not well placed to pick products. There is no way they will know the details of things such as USB issues (a cable may be slow, and the device will not tell you) at the time of purchase.
I think any of us just have to look at how many people ask us for recommendations on basic things like docks and adaptors to see how common this is. On top of that you can’t even trust what’s on the tin sometimes.
This is true of basically everything. Even with trivial home maintenance, people will mostly just put up with things being broken rather than learn how to fix them.