The thing that has been bothering me for a while is that the USB spec allows for software detection of capabilities. You can read the e-marker data and see the supported protocols, speeds, voltages, etc.
But there is no standard for USB controllers to present this data to the OS, so it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and the other device support higher speeds/more power, but your cable is limiting it.
Apple seems best able to do this since they control the hardware and OS, yet they aren’t doing it either. Users are just left to be confused about why things are slow.
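For illustration, here's a minimal sketch of what "reading the e-marker data" amounts to: decoding the passive-cable VDO that an e-marked cable returns in a USB PD Discover Identity exchange. The bit positions follow my reading of the PD spec's Passive Cable VDO, but treat the exact field layout as an assumption and check the spec revision you target.

```python
# Hypothetical sketch: decode the speed and current fields of a
# USB PD Passive Cable VDO (the e-marker's self-description).
# Bit positions are assumed from the USB PD spec; verify before use.

SPEEDS = {
    0b00: "USB 2.0 (480 Mbit/s)",
    0b01: "USB 3.2 Gen 1 (5 Gbit/s)",
    0b10: "USB 3.2 Gen 2 (10 Gbit/s)",
    0b11: "USB4 Gen 3 (40 Gbit/s)",
}
CURRENTS = {0b01: "3 A", 0b10: "5 A"}

def decode_passive_cable_vdo(vdo: int) -> dict:
    """Extract max signalling speed (bits 1:0) and VBUS current
    capability (bits 6:5) from a 32-bit passive cable VDO."""
    return {
        "max_speed": SPEEDS.get(vdo & 0b11, "reserved"),
        "vbus_current": CURRENTS.get((vdo >> 5) & 0b11, "reserved/default"),
    }
```

The point is that this information already sits in the cable; an OS that got handed the raw VDO could render exactly the popup described above.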
> In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
I'm pretty sure my old Dell XPS laptop with Windows 10 had pop-ups just like this.
Oh, they very much do. But like with everything in technology, they can do fuck all about it, so they resign themselves to it and maybe complain to you occasionally if you're the designated (in)voluntary tech support person for your family and friends.
Regular people hate technology, both for how magical and how badly broken it is, but they've long learned they're powerless to change it - nobody listens to their complaints, and the whole market is supply-driven, i.e. you get to choose from what vendors graciously put on the market, not from the space of possible devices.
I wasn't surprised to learn that when Linus Tech Tips released those new USB-C cables, they all sold out almost instantly. They put their entire reputation on the line to claim (and label) the exact capabilities of their USB cables. Isn't that all we really want?
One thing to realize is that, especially for high-resolution video cables, these cheap testers can't really deliver. The way to test them is an eye diagram (see: https://incompliancemag.com/eye-diagram-part2/ ) and testers with that capability cost upwards of 10,000 Eurodollars.
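To make the eye-diagram idea concrete, here's a toy sketch: fold a sampled waveform into one-unit-interval traces and measure the vertical gap between the lowest "1" and the highest "0" at the sampling instant. All the numbers (samples per UI, noise level) are made up for illustration; a real tester does this on GHz-range hardware, which is where the cost comes from.

```python
# Toy eye-diagram measurement: fold a noisy NRZ signal into
# one-unit-interval segments and compute the vertical eye opening.
import numpy as np

rng = np.random.default_rng(0)
samples_per_ui = 20   # made-up oversampling factor
n_bits = 500

bits = rng.integers(0, 2, n_bits)
signal = np.repeat(bits * 2.0 - 1.0, samples_per_ui)  # ideal +/-1 NRZ levels
signal += rng.normal(0.0, 0.1, signal.size)           # additive noise

# Each row is one sweep of the "eye"; overlaying rows gives the classic plot.
eye = signal.reshape(n_bits, samples_per_ui)

# Vertical eye opening at the centre sample: gap between the lowest "1"
# trace and the highest "0" trace. A worse cable closes this gap.
centre = eye[:, samples_per_ui // 2]
opening = centre[bits == 1].min() - centre[bits == 0].max()
print(f"vertical eye opening: {opening:.2f}")
```

A cheap continuity tester tells you nothing about this opening, which is exactly the quantity that decides whether a marginal cable drops bits at high rates.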
No. What it can affect though is the bandwidth of the cable, meaning e.g. for HDMI cables, they might not support higher resolutions or framerates. If it's on the border you might see random disconnects or screen blanks.
Quality degradation is not something you will see, as it's a digital protocol.
"Audiophile grade" HDMI cables are likely to just be a Shenzhen bargain-bin special with some fancy looking sheathing and connectors. I would trust them less than an Amazon Basics cable.
Indeed. If I want super high quality cables, I get them from Blue Jeans Cable, who tell you exactly what Belden or Canare cable stock and what connectors they use, as well as the assembly methodology.
No. What I am saying is that it is hard to test the quality of an 8K 240 Hz 4:4:4 video cable without having a device that can send and receive this or even higher.
If you send bits across a line fast enough you're getting into the territory of RF electronics; with the wrong connector or conductor geometry you will get echoes on the line and all kinds of signal loss. A good digital protocol should keep this at bay with error correction and similar mechanisms, but if you want to rate a cable on a better than binary scale of works/doesn't work, you need to look at these things.
I wanted a model which tells me which modes are supported and which is actually selected, for a reasonable price, and which I can order from a reasonable trader. This model seems to do the trick.
It seems to be a more comprehensive "Make sure the lines go where they are supposed to" tester. Looks pretty good.
But the devices that test things like transmission speed are a lot more expensive.
I think that many of the issues this device tests for can be mitigated by simply buying cables from reputable sources.
> But there is no standard for USB controllers to present this data to the OS, so it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and the other device support higher speeds/more power, but your cable is limiting it.
> Apple seems best able to do this since they control the hardware and OS, yet they aren't doing it either. Users are just left to be confused about why things are slow.
I don’t know if they check that via USB protocol, or if they are measuring the actual power draw on the USB port.
In order to use the device, I had to connect it via an externally powered USB hub.
> I'm pretty sure my old Dell XPS laptop with Windows 10 had pop-ups just like this.
"This device can run faster" or something.