In earlier installments of this ongoing series, I wrote mostly about hi-fi cables and the metal they're made of. I did touch on cable dielectrics, though, and even mentioned the truly mind-boggling fact that, according to such big guns as Maxwell, Faraday and Gauss, it's not so much the metal, or even the movement of the electrons, that carries the signal "through" a cable, but an "electric field" that surrounds, but is not part of, the conductors. When no less a light than Dick Olsher -- a former Stereophile writer and a physicist in real life -- confirms that the transmitted signal moves more quickly than the electrons that supposedly carry it by a goodly number of orders of magnitude, then not only the wires but the very electrons we are accustomed to think in terms of assume an entirely new relationship to the signal. At that point, a great many of the cable industry's conventional arguments and explanations about how and why cables work start looking like they may require re-examination and reinterpretation.
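The "orders of magnitude" claim is easy to check with textbook physics. The sketch below compares the drift velocity of electrons in a copper conductor against the propagation speed of the signal's electric field; all the figures (1 A of current, a 1 mm² conductor, a 0.66 velocity factor) are my own illustrative assumptions, not numbers from this article.

```python
# Back-of-the-envelope check: how fast do electrons actually drift in a
# copper wire, versus how fast the signal itself propagates along it?

import math

# Assumed, illustrative values:
n = 8.5e28        # free-electron density of copper, electrons per m^3
q = 1.602e-19     # elementary charge, coulombs
area = 1.0e-6     # conductor cross-section: 1 mm^2, in m^2
current = 1.0     # a generous 1 A of signal current

# Drift velocity from the current relation I = n * q * A * v
drift_velocity = current / (n * q * area)   # in m/s

# Signal propagation: a fraction of the speed of light, set by the dielectric
c = 2.998e8                  # speed of light in vacuum, m/s
velocity_factor = 0.66       # typical for solid-polyethylene coax
signal_speed = velocity_factor * c

print(f"electron drift velocity:  {drift_velocity:.2e} m/s")
print(f"signal propagation speed: {signal_speed:.2e} m/s")
print(f"ratio: roughly 10^{math.log10(signal_speed / drift_velocity):.0f}")
```

Under these assumptions the electrons creep along at a small fraction of a millimeter per second while the field moves at an appreciable fraction of the speed of light -- a gap of around twelve orders of magnitude, which is very much in the spirit of the claim above.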
It's not only the cables, though, that become stranger the more you get to know about them, it's also the connectors that they're fitted with. One of the strangest -- the RCA connector -- is also probably the most common connector for high-fidelity audio, video and digital applications.
The RCA connector was developed by that company sometime in the early 1940s as a cheap and semi-permanent way of connecting the phonograph and the electronics chassis inside that company's home radio-phonograph consoles. Originally consisting of just a stamped, tulip-shaped metal "shell" and a hollow metal pin separated by a cardboard (or later sometimes phenolic) spacer, the RCA connector was ideal for use with coaxial cables -- just solder the inner lead to the pin and the outer shield (which was also the return conductor for the circuit) to the shell, push it into a matching jack, and a solid and reliable contact was formed that could, and was intended to, last for years without service.
As hi-fi came along, the RCA connector was cheap and handy and, for the single-conductor shielded coaxial cables and unbalanced circuits used with the overwhelming majority of early hi-fi gear, it was just perfect -- especially later, when transistors allowed for physically smaller equipment. The connector's compact size and high allowable packing density were distinct assets for receivers, integrated amplifiers and, as they became available, modern multi-input preamps.
Now here's where it starts to get a little strange: Although the common wisdom has it that ordinary RCA connectors have a designed characteristic impedance of 50 ohms, that certainly wasn't what they were originally used for. For phono circuits, until the late 1950s, virtually all (meaning "I don't know of any that weren't, but there might have been some") phono cartridges were either "crystal" or "ceramic," and had output impedances in the multi-thousands of ohms. Similarly, with the tube circuits of the time, the input impedances -- whether for phono or line-level circuits -- were 47,000 ohms or higher. (Just as a side note, a great many of the crystal and ceramic cartridges then in use had outputs of 1 full volt or even more, so that even a phono circuit was likely to be at "line" level.) So given the impedances that RCA connectors were actually used for, how were they -- or better yet, if they were, WHY were they designed as 50-ohm connectors?
By the time video and then digital audio hit the home market, the RCA connector, which had originally been known as a "phono plug," was solidly established (at least in the United States) as the connector of choice for consumer electronics. At the high end, some English companies like Naim still used the DIN plug, and Mark Levinson's pricey solid-state gear used the CAMAC (Fischer) connector (originally developed for [Woo! Woo!] use in European atomic power plants), but virtually everything else that ran unbalanced lines ran RCAs -- including video and digital.
The problem there was that while the RCA connector was supposedly designed as a 50-ohm connector, digital and video circuitry was uniformly standardized as impedance-matched (not "loaded") 75 ohms.
For most home applications, where the cabling is usually of only moderate length, that's -- as its ubiquitous use clearly demonstrates -- probably of little consequence. But for more critical applications or more critical viewer/listeners, it represents a problem that, to my knowledge, only one company, Canare, has yet sought to solve.
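The consequence of that mismatch can be put in numbers. At any impedance discontinuity, part of the signal reflects back toward the source, and the reflected fraction is given by the standard transmission-line reflection coefficient. The sketch below is ordinary textbook arithmetic, not a measurement of any particular connector, and it takes the "50-ohm RCA" figure above at face value purely for illustration.

```python
# How bad is a nominally 50-ohm connector in a 75-ohm video/digital line?
# The voltage reflection coefficient at a junction between two impedances
# is gamma = (Z_load - Z_line) / (Z_load + Z_line).

import math

def reflection_coefficient(z_line, z_load):
    """Fraction of the signal voltage reflected at an impedance step."""
    return (z_load - z_line) / (z_load + z_line)

def return_loss_db(gamma):
    """Return loss in dB; a bigger number means less energy reflected."""
    return -20 * math.log10(abs(gamma))

# A 50-ohm discontinuity in a 75-ohm system:
gamma = reflection_coefficient(75.0, 50.0)
print(f"reflection coefficient: {gamma:+.2f}")           # -0.20
print(f"return loss: {return_loss_db(gamma):.1f} dB")    # 14.0 dB
```

Twenty percent of the voltage bounced back sounds alarming, but it only matters when the cable is an appreciable fraction of a wavelength long. At audio frequencies a home interconnect is electrically tiny, which is why the mismatch passes unnoticed there; at video and digital frequencies the same reflection can show up as ghosting or data jitter, which is why the 75-ohm question gets serious.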
Canare has offered what it claims to be a "True 75-ohm" RCA connector. And it actually may be just that, but only, I suspect, if just the right cable is used with it. About 15 years ago, I had occasion to speak with the head of the connector design department for one of the world's most sophisticated military and aerospace electronics companies. That company had bought a considerable unterminated length of one of XLO's Reference Series cables (probably, we guessed, but they would neither confirm nor deny, for use in one of its super-secret surface-to-air missiles) and we were discussing the right way to terminate it.
In the course of that conversation, I mentioned to her (yes, her, you sexist brute!) that although we obviously knew how to calculate the characteristic impedance (Z0) of a cable, we didn't know how to do it for a connector. "How," we asked, for example, "can you even measure the R, C and L of pieces of metal and dielectric material that small? Do you have some special kind of equipment?" Her answer was surprising: "What we do," she said (I'm paraphrasing, of course) "is to make up some connectors that we think ought to be close to what we want; use them to terminate 100-foot lengths of cable of a previously-tested known Z0, measure the now-terminated cable to see if the reading has changed and, if so, how, and to what degree; and then go back, make some more connectors, and try again."
In short, what she said was that they do it by trial and error. But one thing she said made a whole lot of sense: with something like an RCA connector, the easiest way to get close is to use a dielectric in the connector of essentially the same dielectric constant as the cable's insulation, and of about the same thickness, so that the conductor spacing within the connector is about the same as the spacing within the cable. Because the distance between the conductors has a direct effect on both the capacitance and the inductance, that made sense to me. But it also meant that, because different cables (RG-8 and RG-59, for example) have completely different spacing, a so-called "75-ohm" connector might not be 75 ohms at all if used with the wrong cable.
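Her advice lines up with the standard formula for coaxial characteristic impedance, which depends only on the ratio of shield diameter to center-conductor diameter and on the dielectric constant between them. The sketch below plugs in typical published dimensions for RG-59 and RG-8 style cables (my own round numbers, for illustration only) to show why two cables with the same dielectric but different spacing land at different impedances -- and why a connector built around one geometry won't match the other.

```python
# Characteristic impedance of coaxial cable:
#   Z0 = (138 / sqrt(er)) * log10(D / d)
# where er is the dielectric constant, D the inner diameter of the shield,
# and d the diameter of the center conductor.

import math

def coax_z0(dielectric_constant, shield_diameter, center_diameter):
    """Characteristic impedance in ohms; diameters in any consistent unit."""
    return (138.0 / math.sqrt(dielectric_constant)) * \
           math.log10(shield_diameter / center_diameter)

# RG-59-like geometry: solid polyethylene (er ~ 2.25), ~3.7 mm over ~0.58 mm
print(f"RG-59-like geometry: {coax_z0(2.25, 3.7, 0.58):.0f} ohms")   # ~75

# RG-8-like geometry: same dielectric, ~7.2 mm over ~2.17 mm
print(f"RG-8-like geometry:  {coax_z0(2.25, 7.2, 2.17):.0f} ohms")   # ~50
```

Same dielectric, different spacing, completely different impedance -- which is exactly why a connector whose internal spacing and dielectric were chosen to continue one cable's geometry can't be "true 75 ohms" on a cable with another.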
Oh well, what difference can it make, anyway? Cables (and connectors) all sound the same, don't they? More on this next time.
See you then!