The original IBM PC had a clock speed of 4.77 MHz. The 8088 processor inside was actually specified to run at 5 MHz, so how did IBM end up at 4.77?
To tell that whole story we need to briefly travel back to 1953. That year the National Television System Committee (NTSC) approved a new standard that allowed television broadcasts to add color while remaining fully compatible with existing black-and-white TVs. This was done by adding a "color subcarrier" that did not interfere with the original B&W signal. The color subcarrier is a signal with a frequency of 3.579545 MHz.
Next we move to the design of the Intel 8088 microprocessor. It was unusual in that it required an asymmetrical clock instead of the usual square wave. The common 8-bit processors of the day, such as the 8080 and Z80, needed a clock with a 50% duty cycle – that is, "high" exactly 50% of the time, and "low" the other 50%. Together, the high and low times make up one clock cycle, so for the Z80 there would be 4,000,000 of these clock cycles in a second (4 MHz). The 8088, on the other hand, needed a 1/3 duty cycle clock – that is, high 33% of the time and low 67%.
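To make the difference concrete, here is a small sketch (the helper function and names are my own, not from any datasheet) that works out the high and low times of one clock cycle for each duty cycle:

```python
def clock_timing(freq_hz: float, duty: float) -> tuple[float, float]:
    """Return (high_ns, low_ns) for one cycle of a clock at the
    given frequency with the given duty cycle (fraction high)."""
    period_ns = 1e9 / freq_hz
    return period_ns * duty, period_ns * (1 - duty)

# Z80 at 4 MHz: a square wave, high and low 125 ns each.
z80_high, z80_low = clock_timing(4_000_000, 0.5)

# 8088 at 5 MHz: high one third of the 200 ns cycle,
# roughly 66.7 ns high and 133.3 ns low.
i8088_high, i8088_low = clock_timing(5_000_000, 1 / 3)
```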
To ensure the clock for the 8088 was just right, Intel provided a companion clock generator chip, the 8284. In order for the 8284 to generate the correct 1/3 duty cycle clock, it needed an input clock running at three times the desired processor speed. Since the 8088 was designed to run at 5 MHz, the 8284 was designed to take an input of 15 MHz.
At some point an IBM hardware design engineer made a leap: the Color Graphics Adapter would need a 3.579545 MHz signal to create a color subcarrier, and one way to get that signal was to divide 14.31818 MHz by four. Conveniently, 14.31818 MHz is also only about 5% less than the 15 MHz maximum input of the Intel 8284 clock generator, which would divide it by three to produce 4.77273 MHz for the 8088 processor. A single 14.31818 MHz crystal could therefore drive both chips. By sacrificing about 5% in performance, roughly $0.50 could be saved in parts – at least for those customers who chose the CGA video card instead of the Monochrome Display Adapter.
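The frequency arithmetic above can be checked directly. This sketch (variable names are mine) derives both signals from the single 14.31818 MHz crystal and the performance given up relative to the 8088's rated 5 MHz:

```python
CRYSTAL_MHZ = 14.31818  # the one crystal shared by CPU clock and video

# Divide by four for the CGA's NTSC color subcarrier.
color_subcarrier_mhz = CRYSTAL_MHZ / 4   # 3.579545 MHz

# The 8284 divides its input by three for the 8088's clock.
cpu_clock_mhz = CRYSTAL_MHZ / 3          # ~4.77273 MHz

# Shortfall versus the 8088's rated 5 MHz: about 4.5%.
shortfall = 1 - cpu_clock_mhz / 5
```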