[GTALUG] CRT memories [was Re: IBM - cache skirmish story.]

Russell rreiter91 at gmail.com
Tue Apr 24 16:00:11 EDT 2018



On April 24, 2018 1:33:11 PM EDT, "D. Hugh Redelmeier via talk" <talk at gtalug.org> wrote:
>| From: James Knott via talk <talk at gtalug.org>
>
>|   However, that system was based on a Data
>| General Nova 800 and used on dumb terminals (made by VST) that used
>| delay line memory.

If someone saw you loading a mercury delay line memory module onto a truck
these days, the Five Eyes guys would be on you faster than Trump on a tweet
about a national security threat.

https://upload.wikimedia.org/wikipedia/commons/f/fd/Mercury_memory.jpg

>
>Video display terminal ("VDT") development was very much gated by developments
>of memory technology.  CRTs need constant refresh so there needs to be
>some kind of backing store for the image.
>
>Many different solutions were developed.
>
>Tektronix developed a CRT technology that retained an image once it
>was written.  The trouble was that the only kind of erasing was total
>image erasing.  Think of an etch-a-sketch.  I used one of these with a
>PDP-8 in the late 1960s.  It had a lot of advantages over the Teletype
>Model 33 ASR.  Think of the output as going through more(1): after a
>page of output, you had to type a control character to request the
>next page.
>
>Remember, the PDP-8 was a computer costing $10000 or more and only
>having 4k words of main memory (12 bits/word) (1967).  A frame buffer
>for a black and white 640x480 screen would require 25k words!  The
>tail would be wagging the dog.  In those days RAM was implemented as
>core memory.  Each bit was a little torus of ferrite, with three or so
>wires running through it.  Assembled by hand.  It cost roughly a buck a
>byte.

I bump into a guy in my neighbourhood once in a while. He told me about weaving some of this stuff together when he was at UofT. I only ever saw the core memory display at the OSC.
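
Hugh's numbers are worth working through. A quick check in Python (just
redoing the back-of-the-envelope arithmetic, not anything the PDP-8 ran):

    # monochrome 640x480, one bit per pixel, on a 12-bit-word machine
    pixels = 640 * 480        # 307200 bits of image
    words = pixels // 12      # 25600 twelve-bit words, vs. 4096 installed
    nbytes = pixels // 8      # 38400 bytes
    cost = nbytes * 1         # roughly $38k of core at about a buck a byte
    print(words, nbytes, cost)

So the frame buffer alone would have needed about six times the machine's
core and cost several times the $10,000 computer -- tail wagging the dog
indeed.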

>
>A little earlier, IBM made the 2250 display.  It was a vector display:
>the screen was painted via vectors.  For graphs, this was a very dense
>representation.  It cost more than a house.  Both University of
>Waterloo and University of Toronto had one, highly subsidized by IBM.
>I think that it was developed for NASA.
>
>The next step was to store characters in a buffer: much more compact
>than a pixel buffer. Refreshing would be by raster scan, but a
>character-generator ROM would generate pixels on the fly for each
>character.  The buffer would be about 25x80 = 2000 bytes.  Even this
>was expensive so different kinds of implementations were used:
>
>- magnetostrictive delay lines (e.g. in the IBM 2260 or the VST)
>
>- shift registers (logically similar to delay lines, but using
>  semiconductors) (e.g. DataPoint terminals)
>
>- finally: RAM
>
>Until RAM was used, terminals often did not allow editing the middle
>of a screen.  up, down, etc. were not implemented.  These were the bad
>old days.
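
The character-generator scheme is simple enough to sketch. Assuming a 25x80
character buffer and an 8x8 font ROM (the 8x8 cell is my assumption, not any
particular terminal's), each raster line is produced on the fly:

    ROWS, COLS, CELL_H, CELL_W = 25, 80, 8, 8

    def scanline(charbuf, font_rom, y):
        """One raster line of pixels, generated from the 2000-byte
        character buffer by indexing the font ROM on the fly."""
        row, line = divmod(y, CELL_H)   # which text row, which slice of it
        bits = []
        for col in range(COLS):
            glyph = font_rom[charbuf[row][col]][line]  # one byte of the glyph
            bits += [(glyph >> (CELL_W - 1 - b)) & 1 for b in range(CELL_W)]
        return bits

Only the 2000-byte character buffer ever has to be stored; the pixels exist
only on their way to the beam.
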
>
>One early CRT that I worked with was the product of an MASc thesis at
>UofT.  It used a slow-decay orange phosphor.  The refresh was the duty
>of the program in the attached computer (IBM 1710).  The output was
>encoded the same way as plotter output was encoded.  As long as the
>program output the same stuff every quarter (?) second or so, it was
>visible.  (The attached computer was about a hundred thousand times
>slower than current machines.)
>
>In the mid-1970s, the Dynamic Graphics Project at U of T commissioned a
>couple of displays: a large monochrome vector and a more modest colour
>raster display with one byte per pixel and a 256 entry colour mapping
>table.  Each cost about $20k.
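
A 256-entry colour mapping table is just indexed colour: the byte stored per
pixel selects an entry in a small palette rather than encoding the colour
directly. A minimal sketch (the names and dimensions are made up, not the
DGP hardware's):

    WIDTH = 512                          # assumed raster width
    palette = [(0, 0, 0)] * 256          # 256 loadable RGB entries

    def pixel_colour(framebuf, x, y):
        return palette[framebuf[y * WIDTH + x]]   # one byte per pixel -> RGB

Reloading an entry in the table recolours every pixel that references it
without touching the frame buffer at all.
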
>
>Eventually RAM became cheap enough that a colour frame buffer was
>affordable for individuals.  For example, the Atari ST (1985)
>supported only 16 colours at a time for a resolution of 320x200 --
>somewhat usable.  I preferred mine in monochrome at 640x400 but my
>kids preferred colour.
>
>Now GPU cards come with 4G or more of RAM!

I chose my recent motherboards for the acoustic isolation gained by placing the audio channels on separate PCB layers, and for the fact that the planar is bolstered to accept massive video cards.

Ledger domain likes CUDA cores. The US miners are staking out our power at the border. Thought I'd look into it.

https://www.ctvnews.ca/mobile/business/power-sucking-bitcoin-mines-spark-backlash-1.3899128


>---
>Talk Mailing List
>talk at gtalug.org
>https://gtalug.org/mailman/listinfo/talk

-- 
Russell

