The "Cryotron" super conducting tube

The book "The Cryotron Files" was reviewed in this weekend's Wall Street Journal:
‘The Cryotron Files’ Review: Taking the Cold War to Subzero - WSJ

It's behind their paywall, but here's a snippet of the review:
Our lives are secretly ruled by transistors. They power our computers and, with them, modern society. Today’s most advanced microchips pack more than 21 billion transistors onto a single piece of silicon slightly larger than your thumbnail.

But in the 1950s, when computers were still in their infancy, the transistor’s dominance was far from settled. In an age of bulky and power-hungry vacuum tubes, the U.S. military and American industry searched desperately for some way to speed up and shrink computers for defense and commercial applications. In “The Cryotron Files,” Iain Dey and Douglas Buck tell the story of a little-known invention called the cryotron—a liquid-helium-cooled superconducting competitor to the transistor that was, for a time, the front-running technology in the quest to build the fastest, smallest computers. Their interesting book weaves together the biography of the cryotron’s inventor, Dudley Buck of MIT (the father of one of the book’s authors), with a history of key aspects of Cold War defense programs. Along the way we are given an insider’s look at the 1950s military-industrial complex and the ease and informality with which academia, the military, intelligence agencies and industry collaborated.
 
OK, I'm a physics guy. The Wikipedia article ( Cryotron - Wikipedia ) gives a shorter, better explanation than the paragraph above. Note the rejoinder at the bottom of the article: "because the TC transition is slow".
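In case anyone wants rough numbers on the basic action, here's a back-of-the-envelope sketch in Python: the control winding's field drives the tantalum gate wire past its critical field at 4.2 K. The Tc and Bc(0) values are textbook-ish approximations and the winding density is my own guess, so treat it as illustrative only.

```python
# Back-of-the-envelope: control current needed to quench a tantalum gate
# wire in a 4.2 K helium bath, using the parabolic critical-field law
#   Bc(T) ~ Bc(0) * (1 - (T/Tc)^2)
# and the field of a long solenoid, B = mu0 * n * I.
# Tc and Bc(0) are textbook-ish values; the winding density is a guess.

import math

MU0 = 4 * math.pi * 1e-7   # T*m/A, vacuum permeability

TC_TA   = 4.48             # K, tantalum critical temperature
BC0_TA  = 0.083            # T, tantalum critical field extrapolated to 0 K
T_BATH  = 4.2              # K, liquid helium at atmospheric pressure
N_TURNS = 4.0e4            # turns per metre of the control winding (assumed)

# Critical field of the gate wire at the bath temperature
bc_bath = BC0_TA * (1.0 - (T_BATH / TC_TA) ** 2)

# Control current whose solenoid field just reaches that critical field
i_control = bc_bath / (MU0 * N_TURNS)

print(f"Bc(4.2 K)       ~ {bc_bath * 1e3:.1f} mT")
print(f"control current ~ {i_control * 1e3:.0f} mA to drive the gate normal")
```

That's only the DC picture of how little control current it takes; the speed problem the Wikipedia article alludes to is a separate matter of how quickly the gate can be driven normal and then recover.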

Thinking out loud, that slowness is probably what killed it as an idea. As long as you're looking for (now "ancient") tech that was retired without ever being used for much, you could take a look at the whole µm-scale vacuum-tube logic gambit, which was actively pursued until the 1980s.

I don't remember what it was called. But the idea was to get rid of the hot cathode ("heater") and replace it with pyramidal, REALLY sharp, nitrogen-doped nanodiamond "cold emitters"; the scale could be shrunk to at or below 1 µm, as long as the vacuum was REALLY good.
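For anyone wondering why the tips had to be so sharp: a "cold emitter" is just a field emitter, and Fowler-Nordheim emission depends exponentially on the local electric field, which a sharp tip enhances enormously. A quick Python sketch below; the work function and field-enhancement factors are illustrative guesses, not real nitrogen-doped-nanodiamond numbers.

```python
# Why the tips have to be sharp: Fowler-Nordheim field emission,
#   J ~ (A * F^2 / phi) * exp(-B * phi^1.5 / F)
# with A ~ 1.54e-6 A*eV/V^2 and B ~ 6.83e9 eV^-1.5 * V/m.
# The local field F = beta * E_applied, and a sharp tip gives a large
# enhancement factor beta. phi and the numbers below are illustrative,
# not measured nitrogen-doped-nanodiamond values.

import math

A_FN = 1.54e-6   # A * eV / V^2
B_FN = 6.83e9    # eV^-1.5 * V / m

def fowler_nordheim_j(local_field_v_per_m: float, work_function_ev: float) -> float:
    """Emission current density in A/m^2 for a given local field."""
    f, phi = local_field_v_per_m, work_function_ev
    return (A_FN * f ** 2 / phi) * math.exp(-B_FN * phi ** 1.5 / f)

phi = 4.5            # eV, generic metal-ish work function (assumed)
e_applied = 5e7      # V/m, e.g. ~50 V across a ~1 um cathode-gate gap

for beta in (1, 10, 100):   # flat surface vs. progressively sharper tips
    j = fowler_nordheim_j(beta * e_applied, phi)
    print(f"beta = {beta:3d}:  J ~ {j:.3e} A/m^2")
```

The jump from essentially zero emission to a useful current density over a factor of ~100 in field enhancement is the whole game: only very sharp tips get you there at sane gate voltages.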

I remember seeing a photomicrograph of a BCD (binary coded decimal) decoder; it was about 1 mm on a side, and wasn't particularly highly integrated.

Again, at the time (the 1980s), there was quite an attraction to finding an alternative to the increasingly radiation-sensitive silicon-chip trend. Surviving, and actually "working" through, a massive nuclear war was considered the ideal for critical computing equipment. That was the primary reason CORE memories were still being manufactured all through the 1970s and even quietly into the 1980s: they are immune to EMP (electromagnetic pulse). A computer built on silicon-on-sapphire logic (just about the only incarnation that wasn't particularly radiation-and-EMP sensitive) might reboot, but with CORE its operating state was also preserved; whatever calculations it was doing could mostly be resumed without a hitch.

I believe all this fell by the wayside when it was realized that runtime EMP survivability isn't much of an issue for computers buried under hundreds of meters of granite. The military's most crucial and sensitive computers, buried at all sorts of completely invisible but secret spots all over the nation, are basically immune, collectively, thanks to the shielding of the kilometer of dirt, rock, stone and water above them.

Beyond this, I'm duty bound to say no more.
GoatGuy
 