This is not meant to be a comprehensive introduction to the history of TV, but rather a brief primer on how TVs work and why signal compression became necessary for colour broadcasts.
Black + White TVs:
Not counting flat LCD or plasma displays, TVs and monitors are made of two parts: a special vacuum tube, the front of which is coated with phosphors, and the supporting electronics. Phosphors glow when you shoot electrons at them, and the first black and white TVs used this to generate a moving picture: a single electron 'gun' sprayed a stream of electrons at the phosphor coating, which glowed white wherever it was struck.
By rapidly sweeping this spraying stream of electrons across the screen, an image could be painted, though it fades very quickly. Redrawing the same image rapidly and repeatedly produced a stable picture, and painting a slightly different picture each time made the image appear to move. By broadcasting a radio signal containing a video stream, you could make dozens, hundreds, thousands or millions of TVs display the same image at the same time. The government and the electronics industry saw that this was a Good Thing, and a large chunk of the radio spectrum was reserved and divided into 'channels'. This was fine, for a time, but then it got a little complicated.
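The need for constant redrawing can be sketched in a few lines. The decay rate below is an illustrative assumption, not a measured phosphor constant; the point is only that the glow dies off in milliseconds unless the beam sweeps past again:

```python
def brightness_after(ms, refresh_interval_ms=None, decay_per_ms=0.7):
    """Relative phosphor brightness `ms` milliseconds after being struck.

    With no refresh, the glow decays geometrically (decay_per_ms is an
    assumed, illustrative rate). With a refresh every refresh_interval_ms,
    the phosphor is re-excited to full brightness on each pass of the beam.
    """
    if refresh_interval_ms is not None:
        ms = ms % refresh_interval_ms  # time elapsed since the last redraw
    return decay_per_ms ** ms

# Left alone, the glow is nearly gone within one frame time...
faded = brightness_after(16)
# ...but redrawn about 60 times a second, it is constantly restored.
refreshed = brightness_after(100, refresh_interval_ms=16.0)
```

Run with a 16 ms refresh the brightness never decays for more than one frame interval, which is the whole trick behind a stable picture.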
Back when televisions were making the switch from Black + White to Colour, a hurdle needed to be cleared. Each channel was allocated a certain amount of radio wave space, or bandwidth, but naively sending a colour image required three times as much. Widening every channel would reduce the number of channels available, and with large industries already established and consumers used to having a choice of channels to watch, clearly they couldn't cut back now! The colour information would have to be squeezed into the bandwidth each channel already had. The result was a compromise, struck by the engineers, their companies and the government regulatory bodies: broadcast a high-resolution black and white picture, the same as the old standard, with a low-resolution colour image overlaid on top of it.
It still works this way today, and in many respects very little has changed. At first this seems like a horrible idea, low-resolution colour is a Bad Thing, but 'fess up: you never noticed, did you? The reasons for this are plentiful, not least of which are the deficiencies in human vision: your eyes perceive chrominance (colour) at a far lower resolution than luminance (brightness). We all see a high-res black and white image with a low-res colour image slapped on top; that's how our eyes work. More or less. It's complicated, but it's true. Have a look at this image from Ogre Battle:
The black and white image is full-resolution; the colour image has been reduced in horizontal resolution by one third and blurred horizontally to simulate the drop in colour resolution in an NTSC video stream. Looking at the bottom image, you might recognize the quality as something like what you normally see using AV cables; it's not so bad, is it? You might be surprised how far this can go before you really notice how bad it is.
The NTSC standard allots only a small chunk of its bandwidth to colour, and consumer video equipment typically uses even less than that maximum; what's worse, some colours get better resolution than others! TVs, VCRs and video encoders all handle different colour resolutions, and one problem the consumer faces when trying to make a buying decision is that the horizontal resolution listed in the specs is the black and white resolution, not the colour one. Most manufacturers do not publish colour resolution figures for their equipment.
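A back-of-the-envelope calculation shows how lopsided the split is. NTSC devotes roughly 4.2 MHz to luma but only about 1.3 MHz to the I colour component and roughly 0.5 MHz to Q, and a common rule of thumb is about 80 lines of horizontal resolution per MHz; exact figures vary by source, so treat these numbers as illustrative:

```python
LINES_PER_MHZ = 80  # rule-of-thumb conversion factor, an assumption here

def horizontal_lines(bandwidth_mhz):
    """Approximate horizontal lines of resolution for a given bandwidth."""
    return round(bandwidth_mhz * LINES_PER_MHZ)

luma_lines = horizontal_lines(4.2)  # black and white detail: ~336 lines
i_lines = horizontal_lines(1.3)     # one colour axis: ~104 lines
q_lines = horizontal_lines(0.5)     # the other colour axis: only ~40 lines
```

The I/Q asymmetry is why "some colours have better resolution than others": detail along one colour axis survives at roughly twice the resolution of the other.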