The original IBM PC had a clock speed of 4.77 MHz. The 8088 processor inside was actually specified to run at 5 MHz, so how did IBM end up at 4.77?
To tell that whole story, we need to travel back briefly to 1953. That year the National Television System Committee (NTSC) approved a new standard that allowed television broadcasts to add color while remaining fully compatible with existing black-and-white TVs. This was done by adding a "color subcarrier" that did not interfere with the original B&W signal. The color subcarrier is a signal with a frequency of 3.579545 MHz.
Next we move to the design of the Intel 8088 microprocessor. It was unusual in that it required an asymmetrical clock rather than the usual square wave. The common 8-bit processors of the day, such as the 8080 and Z80, needed a clock with a 50% duty cycle – that is, "high" exactly 50% of the time and "low" the other 50%. Together the high and low times make up one clock cycle, so a 4 MHz Z80 runs through 4,000,000 of these cycles every second. The 8088, on the other hand, needed a 1/3 duty cycle clock – that is, high 33% of the time and low 67%.
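The duty-cycle arithmetic above can be sketched in a few lines of Python. This is just an illustration of the numbers in the text (the function name and the nanosecond units are my own choices, not anything from the original hardware documentation):

```python
def clock_timings(freq_hz, duty_cycle):
    """Return (period, high time, low time) in nanoseconds for a clock
    of the given frequency and duty cycle."""
    period_ns = 1e9 / freq_hz
    high_ns = period_ns * duty_cycle
    return period_ns, high_ns, period_ns - high_ns

# Z80 at 4 MHz, 50% duty cycle: 250 ns period, 125 ns high / 125 ns low
print(clock_timings(4_000_000, 0.5))

# 8088 at 5 MHz, 1/3 duty cycle: 200 ns period, high for only a third of it
print(clock_timings(5_000_000, 1 / 3))
```

Note how the 8088's clock spends twice as long low as high within each 200 ns cycle, which is why an ordinary square-wave oscillator couldn't drive it directly.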
To ensure the clock for the 8088 was just right, Intel provided a companion clock generator chip, the 8284. In order to generate the correct 1/3 duty cycle clock, the 8284 needed an input clock running at three times the desired speed. Since the 8088 was designed to run at 5 MHz, the 8284 was designed to take a 15 MHz input.
At some point an IBM hardware design engineer made a leap: the Color Graphics Adapter would need a 3.579545 MHz signal to create its color subcarrier; one way to get that signal was to divide 14.31818 MHz by four; and 14.31818 MHz is only about 5% less than the maximum input speed of the Intel 8284 clock generator, which would divide it by three to yield 4.77273 MHz for the 8088 processor. Thus, by sacrificing about 5% in performance, a single crystal could feed both chips, saving around $0.50 in parts – at least for those customers who chose the CGA video card instead of the Monochrome Display Adapter.
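The whole frequency chain can be checked with simple division. This is a sketch of the arithmetic described above; the variable names are mine:

```python
# One 14.31818 MHz crystal serves both dividers (figures from the article).
CRYSTAL_MHZ = 14.31818

color_subcarrier = CRYSTAL_MHZ / 4  # NTSC color subcarrier for the CGA
cpu_clock = CRYSTAL_MHZ / 3         # what the 8284 delivers to the 8088

print(f"{color_subcarrier:.6f} MHz")  # 3.579545 MHz, exactly the NTSC value
print(f"{cpu_clock:.5f} MHz")         # 4.77273 MHz instead of the rated 5 MHz

slowdown = 1 - cpu_clock / 5.0        # about 4.5%, the "about 5%" sacrifice
print(f"{slowdown:.1%} below rated speed")
```

Dividing by four lands precisely on the NTSC subcarrier frequency, which is the whole point: the CGA's requirement was exact, and the CPU simply took whatever the divide-by-three produced.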
What do you think of the Intel "Atom" with the Nvidia "ION" chipset? I almost feel that the PC for TV may burst on the scene this year, 2009 (this has been a long time coming – possibly a new form factor).
What do you think about Google's realtime voice translation technology? Lots of interesting stuff going on this year.
I was just surfing this evening! I teach a computer class, and I mention you from time to time as a living legend.
Hope all is well with you.
I bet you feel cheated out of a lot of money.
Where's the part where you give credit to Digital Research, Gary Kildall and CP/M for their part in all this?
Hats off to you...
I wonder if it was the 50-cent savings or if having the parts run at the same frequency had other benefits. ...Not that I can think of anything off hand. But surely customers would have paid a buck or two for all their software to run 5% faster; the machines cost hundreds of dollars!
@anon: That is rude, and the truth of the matter is that 99.9% of the population is exploited anyway.