Questions about how analog RGB monitors handle color gamut

Started by panzeroceania, February 14, 2017, 10:30:10 AM


panzeroceania

So as we all know, NTSC had its own white point and color gamut standards and was carried over YIQ, Y/C, or composite video.

What I'm trying to figure out is what assumptions and standards applied to broadcast RGB CRT monitors, as well as to computer RGB monitors.

The tube and the board were rated for a certain bandwidth that allowed certain refresh rate and resolution configurations up to a limit (15 kHz, 31 kHz, etc.).

But what about color gamut? What are the most extreme colors these monitors can handle? If they did double duty as NTSC monitors, like many broadcast monitors did, surely they could cover all the colors in the NTSC gamut, but then again that's a very big gamut indeed. Who were the standards bodies, and what did they define as the standards or limits?

cgm

Professional NTSC monitors will use the SMPTE "C" colorimetry and phosphors. Believe it or not, this is a subset of the original NTSC specification. The first color TVs, such as the RCA CT-100, were noted for their vivid color reproduction compared to much more modern sets because their tubes had full-NTSC-gamut phosphors. Those phosphors were dropped from later consumer sets because they were more expensive to implement.

Most CRT computer monitors (run-of-the-mill stuff) try to support the sRGB color space, which is smaller than NTSC's color space. If they don't offer any sort of color adjustment, they are generally locked at 6500 K, just like NTSC monitors.

Professional monitors (both CRT and LCD) targeted at graphics professionals will advertise wider-gamut color spaces like Adobe RGB, or the percentage of the NTSC color space that is covered. They offer full adjustment of color space and temperature so they can be calibrated with a meter.
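If you want a rough sense of how these gamuts stack up against the original NTSC one, you can compare the areas of the primary triangles on the CIE 1931 xy chromaticity diagram. Here's a minimal Python sketch using the published primary coordinates for each standard; the area ratio it prints is the loose "percent of NTSC" figure that monitor specs quote, not a strict intersection-based coverage number.

# Compare gamut triangle areas in CIE 1931 xy chromaticity space.
# Primaries are the published chromaticity coordinates for each standard.

def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    # Shoelace formula for the area of the R/G/B triangle
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

gamuts = {
    "NTSC 1953":      [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)],
    "SMPTE C":        [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)],
    "sRGB / Rec.709": [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
    "Adobe RGB":      [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)],
}

ntsc_area = triangle_area(gamuts["NTSC 1953"])
for name, primaries in gamuts.items():
    ratio = triangle_area(primaries) / ntsc_area
    print(f"{name:15s} area vs NTSC 1953: {ratio:6.1%}")

Running that gives the familiar ballpark figures: sRGB comes out around 70% of NTSC 1953 by area, with SMPTE C a bit below that and Adobe RGB in the mid-90s.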

panzeroceania

Here's another question: it's fairly straightforward to measure the color gamut of a display or printer with some software and a colorimeter, but how do you measure the color gamut of an analog source device, like a Super Nintendo?

If you measure through a monitor with a colorimeter, the monitor could skew the data, and capture cards are notorious for changing, clipping, or making assumptions about the color coming from the source device. So how can you measure it?

radorn

I guess you'd need to run some software on the device you want to measure or calibrate. That software would be designed to produce a test picture with known RGB values. You'd then feed the video signal to some well-calibrated external measurement device. That way you can tell what actual signal levels the DAC is putting out in response to the RGB values the software sends it.

I'm sure video engineers have been using such things forever.
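To make that concrete, here's a hypothetical Python sketch of how you might check the RGB DAC levels once you have such a test picture. It assumes a 5-bit-per-channel DAC and the nominal 0-700 mV analog RGB swing, and the scope readings are made-up example numbers, but the comparison logic would be the same whatever the real hardware turns out to do.

# Hypothetical sketch: homebrew on the console draws a known grey ramp
# (the code values below), and you read the resulting analog level on each
# RGB line with an oscilloscope or waveform monitor. Comparing measured
# millivolts against the ideal 0-700 mV scale shows whether the DAC is
# linear, scaled oddly, or clipping. The scope readings are invented examples.

MAX_CODE = 31          # assumes a 5-bit-per-channel RGB DAC
FULL_SCALE_MV = 700.0  # nominal analog RGB swing above blanking

test_codes  = [0, 4, 8, 12, 16, 20, 24, 28, 31]           # values the homebrew displays
measured_mv = [0, 88, 180, 271, 362, 452, 543, 633, 700]  # example scope readings

for code, mv in zip(test_codes, measured_mv):
    ideal_mv = FULL_SCALE_MV * code / MAX_CODE
    print(f"code {code:2d}: measured {mv:5.1f} mV, "
          f"ideal {ideal_mv:5.1f} mV, error {mv - ideal_mv:+5.1f} mV")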