Valve-based computers

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Just think how far we've come in so little time.
 

Attachments

  • IBM vacuum tube computer.jpg
Disabled Account
Joined 2017
This thread makes me wonder whether some vintage computers could be used for various tasks during winter, where they would be inefficient at the task but efficient at generating heat, and would therefore heat up the room.

This alternative way of thinking about heating a room would make a vacuum-tube-based computer a viable winter project/hobby.
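For what it's worth, the heating argument is just conservation of energy: essentially every watt a computer draws ends up as heat in the room. A rough sketch with made-up numbers (not any real machine):

```python
# Rough arithmetic: electrical power in = heat out. All figures are assumed for illustration.
power_draw_kw = 1.5            # assumed draw of a small valve computer
hours_per_day = 8
electricity_price = 0.30       # assumed price per kWh

heat_kwh_per_day = power_draw_kw * hours_per_day   # same heat as a 1.5 kW space heater for 8 h
cost_per_day = heat_kwh_per_day * electricity_price

print(f"Heat delivered: {heat_kwh_per_day:.1f} kWh/day, costing {cost_per_day:.2f} per day")
```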


If you can't afford to build your own vintage vacuum tube computer, then maybe using a vacuum tube television hooked up to a microcomputer would be a good alternative.


And core memory, little bitty toroids of ferrite material with wires running through them.
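A toy model of how a core plane behaves, including the detail that reads are destructive, so the controller has to write the bit back afterwards. The class name and dimensions are my own, just to illustrate the read-then-rewrite cycle:

```python
# Toy model of a ferrite core plane: each core stores one bit by its magnetisation
# direction, and reading it flips the core to a known state (destructive read),
# so the value must be rewritten. Dimensions and names are illustrative only.
class CorePlane:
    def __init__(self, rows=64, cols=64):
        self.cores = [[0] * cols for _ in range(rows)]

    def read(self, x, y):
        bit = self.cores[x][y]
        self.cores[x][y] = 0          # the sense pulse leaves the core cleared
        self.write(x, y, bit)         # controller restores the bit (rewrite cycle)
        return bit

    def write(self, x, y, bit):
        # Selected by half-current on one X and one Y wire; only the core at the
        # crossing sees enough field to switch.
        self.cores[x][y] = bit

plane = CorePlane()
plane.write(3, 5, 1)
print(plane.read(3, 5))   # -> 1, and the core is rewritten behind the scenes
```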

One micro-meteorite and they would've been goners.

One moth stuck in the mainframe on the ground and they would've been goners.

I was actually thinking in bed the other night about how NASA could simply have used the sun to stir/heat the oxygen tanks, by opening a side of the orbital spacecraft, exposing the tanks to the sun and watching their temperature. No point thinking about it now; they stuck a highly flammable heater in the tanks and decided to heat (and therefore "stir") the oxygen that way.

KABOOM.
 
Fanatic
Joined 2009
Paid Member
...Makes me think about something I read a few years back: that some processors, if made from graphene on silicon, can have a cooling effect:
"The Illinois team used an atomic force microscope tip as a temperature probe to make the first nanometer-scale temperature measurements of a working graphene transistor. The measurements revealed surprising temperature phenomena at the points where the graphene transistor touches the metal connections. They found that thermoelectric cooling effects can be stronger at graphene contacts than resistive heating, actually lowering the temperature of the transistor."

Link:
Self-cooling observed in graphene electronics
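The "cooling beats heating" result has a simple back-of-envelope version: Peltier cooling at a contact scales with current (roughly S·T·I) while Joule heating scales with I² (I²R), so at low enough current the contact can run net cold. The numbers below are placeholders, not the Illinois measurements:

```python
# Back-of-envelope comparison of Peltier cooling vs Joule heating at a contact.
# All values are assumed for illustration, not taken from the paper.
S = 50e-6      # Seebeck coefficient, V/K (assumed)
T = 300.0      # junction temperature, K
R = 200.0      # contact resistance, ohms (assumed)

for current in (1e-6, 1e-5, 1e-4, 1e-3):
    peltier_cooling = S * T * current          # W removed from the junction
    joule_heating = current**2 * R             # W dissipated in the contact
    net = peltier_cooling - joule_heating
    tag = "net cooling" if net > 0 else "net heating"
    print(f"I = {current:.0e} A: {tag} ({net:.3e} W)")
```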

I read some "notcompletelyfaroutintermsofrealistic" sci-fi book around the same time, where the author based much of his material on fairly recent research. But he had ships with AI that had carbon-based CPUs close to the hull, to dissipate heat into space. "What a waste," I thought.

Edit:
This stuff is also interesting!
Tiny defects in semiconductors created 'speed bumps' for electrons—researchers cleared the path
"UCLA scientists and engineers have developed a new process for assembling semiconductor devices. The advance could lead to much more energy-efficient transistors for electronics and computer chips, diodes for solar cells and light-emitting diodes, and other semiconductor-based devices."

You know they're on to something when they can break it down to the very basics:
"It is like trying to fit one layer of Lego brand blocks onto those of a competitor brand," Huang said. "You can force the two different blocks together, but the fit will not be perfect. With semiconductors, those imperfect chemical bonds lead to gaps where the two layers join, and those gaps could extend as defects beyond the interface and into the materials."

In sum:
"The study builds off of nearly a decade of work by Duan and Huang on using van der Waals forces to integrate materials. A study they led, published in Nature in March 2018, described their use of van der Waals forces to create a new class of 2-D materials called monolayer atomic crystal molecular superlattices. In an earlier study, which was published in Nature in 2010, they described their use of van der Waals forces to build high-speed transistors using graphene."
 
First computer I used was a Bendix G-15D. Vacuum tubes, a drum memory, and a modified IBM Selectric as the keyboard and printer. It used a high-speed optical reader to load the operating system from paper tape in a cartridge.

Second was a Digital Equipment Corporation PDP-8/S. It used a teletype, plus a switch bank to toggle in the first program, which told it to read the teletype's paper tape to load the actual program.
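For anyone who never had to do it: "a switch bank to toggle in the first program" means depositing a handful of words into memory by hand through the front-panel switches, just enough code to read the rest from paper tape. A toy version of the deposit procedure (not the actual PDP-8 loader, just the idea):

```python
# Toy front-panel bootstrap: set the address, then deposit each word of a tiny
# loader by hand. The octal words below are placeholders, not the real RIM loader.
memory = [0] * 4096

def deposit(start_address, words):
    """Mimic LOAD ADDRESS followed by repeated DEPOSIT operations."""
    for offset, word in enumerate(words):
        memory[start_address + offset] = word

toy_loader = [0o6032, 0o6031, 0o5357]   # placeholder instruction words
deposit(0o7756, toy_loader)             # classic loaders lived near the top of memory
print([oct(w) for w in memory[0o7756:0o7756 + len(toy_loader)]])
```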

I gotta tell you, ROM was a big step forward. First computer I built was a kit, the second one on the market; I thought it was a better deal than the first one. Turns out it used the Motorola CPU chip and really never did much.

Finally bought a new PDP-8/I inside a VT52, with dual 8-inch floppy disc drives. The printer was a daisy-wheel unit. Used it for many years. Cost in the mid-'70s was $13,768.00.

I don't think I have spent that much since on all my computers combined.
 
First computer I built was a kit. The second one on the market. Turns out it used the Motorola CPU chip

The Motorola CPU of the day would have been the MC6800. The two prominent computers featuring that chip were the SWTPC 6800 system and the MITS Altair 680B. I had a SWTPC system. Originally it ran an MC6800 at a blazing 921.6 kHz with 2 KB of memory, and cost me over $1000 in 1975 money.

By the time I had fully expanded it to cover a whole 3-foot by 8-foot workbench, it dimmed the lights on power-up, and still "really never did much."

Being a Motorola employee at the time, I had access to advance information and chip samples, and we had a "computer club" too. Our fully expanded systems ran an MC6809 at 2 to 4 MHz, had 64 to 96 KB of static memory (bank switched), and had one or more MC6847 graphics chips. Mine even ran two processor cards simultaneously. All of this sucked lots of power, and pretty much emulated the Radio Shack Color Computer that would appear a year or two later running the same chip set. Some users even reconfigured their SWTPC systems to run CoCo software.
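Since "64 to 96 KB of static memory (bank switched)" may need unpacking: the 6809 could only see 64 KB at once, so extra RAM was swapped into a window of the address space by writing to a latch. A minimal sketch of the idea, with the window layout and bank-select latch assumed rather than taken from the actual SWTPC hardware:

```python
# Minimal bank-switching model: a 16-bit address space whose upper half can show
# any of several RAM banks. Layout and the "bank select latch" are illustrative.
WINDOW_START = 0x8000                             # upper 32 KB is the switchable window (assumed)
banks = [bytearray(0x8000) for _ in range(2)]     # two 32 KB banks: 32 + 64 = 96 KB total
fixed_ram = bytearray(0x8000)                     # lower 32 KB always visible
current_bank = 0

def write_bank_latch(value):
    """Model of poking the bank-select latch to swap which bank the window shows."""
    global current_bank
    current_bank = value % len(banks)

def read(addr):
    if addr < WINDOW_START:
        return fixed_ram[addr]
    return banks[current_bank][addr - WINDOW_START]

def write(addr, value):
    if addr < WINDOW_START:
        fixed_ram[addr] = value
    else:
        banks[current_bank][addr - WINDOW_START] = value

write_bank_latch(1)
write(0x9000, 0xAA)
write_bank_latch(0)
print(hex(read(0x9000)))   # -> 0x0: bank 0 still holds its own data at that address
```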

The IBM PC would push this monster off the workbench in a year or two, and it would get donated to a computer museum a dozen or so years later.
 

Attachments

  • swtpc_6800.jpg
The Motorola CPU of the day would have been the MC6800.

I had an Ohio Scientific computer on a board. Microsoft BASIC in 8K of ROM (they even wasted a few bytes on the copyright notice for Bill Gates). The first thing I did was a floating-point FFT in BASIC; IIRC 256 points took all the memory and several minutes to run. In 1970, $13,000 would have bought two run-down houses in East Cambridge, each worth $2M today.
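Out of curiosity, the same 256-point job is now an eyeblink. A quick sketch of the comparison using NumPy's FFT (the test signal is my own, and the timing will obviously vary by machine):

```python
# A 256-point floating-point FFT, the job that once took several minutes in BASIC
# on an 8-bit machine, timed on whatever this happens to run on.
import time
import numpy as np

samples = np.sin(2 * np.pi * 5 * np.arange(256) / 256)   # a simple 5-cycle test tone

start = time.perf_counter()
spectrum = np.fft.fft(samples)
elapsed = time.perf_counter() - start

peak_bin = int(np.argmax(np.abs(spectrum[:128])))
print(f"256-point FFT in {elapsed * 1e6:.1f} microseconds, peak at bin {peak_bin}")
```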
 
Member
Joined 2016
Paid Member
Yes, I had one of those too... The 6502 was a good chip!
But I loved the 6809... Such a well-designed device. A real shame Mot screwed up and lost the IBM business; the 68000-series devices were great.
Having moved on to DSP work, we had some of the first silicon of the 56001, another truly great device from Mot. I met some of the designers there. I still have the Mot home-made certificate for finding a major bug in the pipeline system on the first silicon; they made it as a sort of joke...
 
In the sixties I worked for Leo Computers, who were a subsidiary of J. Lyons Ltd. At the time Lyons were a very successful teashop business, and they were the first in the world to use computers for business purposes rather than for scientific or defense work. They built Leo I (Lyons Electronic Office) to do this.

This worked so well that they decided to try to market a development of this machine, the Leo II, together with their (at the time) unique expertise.

Leo II was a valve serial machine with 2k 44-bit words of memory. The memory was stored in mercury-filled steel tubes acting as delay lines, with 4 words per tube. This was installed in 4 large lines of racks with extensive cooling. There were card input, line printer output and tape storage devices. The controller for the printer used 2 x EL34 power tubes to drive each of its 128 columns, i.e. 256 valves. Other devices had similarly large quantities of valves.
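Worth spelling out what a serial machine with mercury delay lines implies: 2k words at 4 words per tube is 512 tubes, and a word is only available when it circulates past the read point, so you wait up to a full recirculation period. A toy model of that behaviour (the tube count follows from the post; the timing figure is assumed, not the real Leo II spec):

```python
# Toy model of delay-line storage: words circulate through a tube and can only be
# read when they come around. 2048 words / 4 words per tube = 512 tubes.
WORDS = 2048
WORDS_PER_TUBE = 4
TUBES = WORDS // WORDS_PER_TUBE                    # 512 tubes
RECIRCULATION_US = 4 * 32                          # assumed: 4 words x 32 us per word

def access_delay(word_index, current_position):
    """Microseconds until the wanted word next passes the read point of its tube."""
    slot = word_index % WORDS_PER_TUBE
    wait_slots = (slot - current_position) % WORDS_PER_TUBE
    return wait_slots * (RECIRCULATION_US // WORDS_PER_TUBE)

print(f"{TUBES} tubes, {RECIRCULATION_US} us recirculation period, "
      f"worst-case wait {access_delay(3, 0)} us")
```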

Leo had two of these machines at its headquarters, which it used for very lucrative bureau work, but by the early sixties the mercury in the delay lines was starting to leak. This couldn't be fixed easily, so a plan was hatched to try to replace it with a ferrite core memory from the new transistorised Leo III. I was recruited primarily to do this, as I had previous experience of ferrite core memories with Mullard.

It wasn't a difficult job, just a serial/parallel converter with some level shifting, but what was a nightmare was the unreliability of the Leo II. Any voltage surge, such as a switch-on, would always blow some of the germanium diodes used in the logic gates. This would take hours if not days to fix each time. Diagnostics were very basic, like getting the machine to play tunes with program loops over its loudspeaker while varying HT voltage levels in a matrix till you got the right sound.
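The "just a serial/parallel converter" bit is essentially a shift register: clock the 44 bits of a word into (or out of) the word-parallel core store one at a time, so the bit-serial machine never notices the memory changed underneath it. A sketch of that step, with the 44-bit word length taken from the post and everything else (bit order, function names) assumed:

```python
# Sketch of the serial <-> parallel conversion between a bit-serial machine and a
# word-parallel core store. The 44-bit word length is from the post; the rest is
# an illustration, not the actual Leo II/III interface.
WORD_BITS = 44

def serial_to_parallel(bit_stream):
    """Shift 44 serial bits (LSB first, assumed) into one parallel word for the core store."""
    word = 0
    for position, bit in enumerate(bit_stream[:WORD_BITS]):
        word |= (bit & 1) << position
    return word

def parallel_to_serial(word):
    """Shift one 44-bit word from the core store back out as a serial bit stream."""
    return [(word >> position) & 1 for position in range(WORD_BITS)]

bits_in = [1, 0, 1, 1] + [0] * 40
word = serial_to_parallel(bits_in)
assert parallel_to_serial(word) == bits_in
print(f"stored word: {word:#013o}")   # 44 bits read nicely in octal
```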

Sadly, by then Leo could not compete with the likes of IBM (who I went to work for) and were taken over by a larger company.

I'm pleased that I had the experience of working on these pioneering machines. Last year I made a visit to The National Museum of Computing at Bletchley Park, mainly to see the working Colossus, but I was equally pleased to see that some volunteers are building a fully working copy of EDSAC, though they have to use a laptop instead of the mercury delay lines it also used, because of modern health and safety rules.
 