Bunch-o-Questions

Started by zedrein, July 12, 2009, 12:05:28 PM

Previous topic - Next topic

zedrein

Not wanting to make a series of threads for each of these questions, I've decided to neatly categorize them all into one. So then, here they are in numerical order:

1.) What sort of displays did the programmers and artists use on those consoles from days of old? (NES, Master System, SNES, Genesis and PS1) I would like to know if they were using the type of displays that we covet so much today.

2.) Are displays that accept 15 kHz RGB still being manufactured in the United States? If so, are they typically incredibly expensive? Can they be purchased by consumers not affiliated with professional broadcast studios?

3.) Are popular standard connectors like RCA and mini-DIN better for conveying video and audio signals than the proprietary connectors that console manufacturers typically use? (ex: Nintendo, Sony and Microsoft's "multi AV out" ports) I know that in terms of analog signal quality a key factor is how much surface contact is being made for the electrical transmission, and an RCA connector, for instance, seems to be a better standard for conveying those signals. Would consoles that have multi-out ports benefit from having direct-out RCA (audio/video) or mini-DIN (S-video) connections instead of multi-out connections?

That should about do it. Thanks for your time!

NFG

1.  These kinds of questions often lead to silly purist pursuits, but the answer is: CRTs.

3. I would say that the quality, length and path of your wire is more important than the connector.  Also, the quality of the solder joints INSIDE the connector is important too.  The shape of the connector is essentially irrelevant: The Dreamcast and PS1 connectors are just fine, but SNK and Sega screwed up the video and/or audio output of the NeoGeo and Genesis respectively, so the shape of the connector is again less important than the signal being created.

zedrein

Quote from: Lawrence on July 12, 2009, 03:30:22 PM
1.  These kinds of questions often lead to silly purist pursuits, but the answer is: CRTs.

I suppose I should have elaborated on that question. I assumed that during that time LCD or plasma displays weren't commonly used in game production studios. But what I really wondered is whether said studios used VGA or "progressive" CRT screens. I obviously have very little knowledge concerning the process that games go through during development before they are "ported" onto a medium that a console can read, but I always assumed that they would be developed on displays of the same resolution and type that most consumers would be playing them on.

NFG

Depends on the era, and the studio.  Sometimes they had VGA monitors, sometimes they created games on systems that did not resemble the hardware they PLAYED on at all.  Imagine writing a game on a monochrome monitor for a full-colour console, or creating 3D images that you couldn't render except on the real hardware.  What they used to create the game is a big old hairy mystery.

zedrein

#4
↑ Wow, that is truly fascinating. I would love to see how some of these games that I love so much looked during development. One last question for the time being: I've read the fabulous post over on nfggames.com about CRT televisions and why they are superior to modern, fixed-pixel displays for viewing classic video games. One thing I couldn't quite wrap my mind around was that the author says, to the effect, that horizontal resolution really can't be measured on that sort of display. I found this interesting because there exists a wide variety of television calibration media that uses definite borders and absolute resolutions to correctly calibrate the horizontal and vertical alignment of the image. If the horizontal resolution really has no "correct" point to calibrate it to, then how can we be sure that is the right way to align the image for that display?

kendrick

Simple answer. At the time, broadcast TV came in one resolution only. Most televisions were calibrated to display about 90 percent of that image (with the rest left beyond the border in what's called the 'overscan' area). However, that doesn't negate the fact that most CRTs can accept many different resolutions. It's just that for each region, standard broadcast TV used only the one. That's how it's possible for there to be a standard.
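To put a number on that 90-percent figure, here's a quick sketch (my own illustration, not from the thread; real sets varied quite a bit in how much they overscanned):

```python
# Rough sketch: estimate the visible "safe area" left after overscan.
# The 10% default is a ballpark; actual CRTs varied set to set.

def safe_area(width, height, overscan_pct=10):
    """Return (w, h) of the region likely to survive overscan."""
    keep = 1.0 - overscan_pct / 100.0
    return round(width * keep), round(height * keep)

# Example: a nominal 720x480 frame with 10% total overscan
print(safe_area(720, 480))  # -> (648, 432)
```

This is why old games often kept score counters and HUD elements well inside the frame edges.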

zedrein

That is quite compelling, young man. But what exactly is said resolution? 640x480? Perhaps even 512x480? Don't tell me it's as high as 720x480!!?

NFG

Vertical resolution is easy.  RS-170 video is 525 horizontal lines per frame, interlaced: the odd lines on one field, the even lines on the next.
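The standard timing numbers all hang together, and they also explain the "15 kHz" figure from the second question above. A quick back-of-envelope check (standard published NTSC rates, not anything exotic):

```python
# Back-of-envelope NTSC (RS-170a) timing check.
# Published figures: ~29.97 frames/s, 525 lines per frame,
# two interlaced fields of 262.5 lines each.

FRAME_RATE = 30000 / 1001          # ~29.97 Hz colour NTSC frame rate
LINES_PER_FRAME = 525

line_rate = FRAME_RATE * LINES_PER_FRAME   # horizontal scan rate
field_rate = FRAME_RATE * 2                # two fields per frame

print(f"line rate:  {line_rate:.2f} Hz")   # ~15734.27 Hz -- the "15 kHz" of 15 kHz RGB
print(f"field rate: {field_rate:.2f} Hz")  # ~59.94 Hz
```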

The horizontal resolution cannot be known.  Here's why:

Originally the video standard was simply black and white.  A single beam swept across the screen creating light and dark segments.  Now, there's no mask, no stripes of phosphor, so the resolution was, in essence, infinitely variable.  This standard, now known as RS-170, has its resolution defined thusly:

Quote: Horizontal resolution depends on the camera and other video system components. Since it is an analog signal, the exact number is not critical; it just limits the detail that can be resolved. The horizontal resolution of an analog video image is limited by the signal quality, as determined by all hardware: the video camera, storage medium (if used), intervening cables and circuitry, and display technology. Black and white cameras and CRT display tubes can resolve detail approaching or exceeding 1000 video lines. Typical resolution specs are on the order of 400-700 elements per line. This empirically defined quantity is the number of pairs of black and white parallel lines that could be counted across the display monitor at the limit of detection by a human observer.

So, basically, based on a number of unknown things, the resolution could change. 

Now, NTSC video (RS-170a) uses a 'colour burst' before each line, basically a high-speed squiggle signal that applies colour across the entire line.  This colour resolution is lower than the black and white resolution, as a matter of necessity really: less time is given to the colour burst than the rest of the line.  So the question is now: Which resolution are you asking about, the black and white hi-res signal, or the low-res colour signal?  VHS cassettes have a horizontal colour resolution of like 30 pixels, and that would vary based on the colours involved!  LDs could do about 120 colour pixels, and S-video could do about 140.  Modern gear makes these numbers a little outdated, but still: what number do you want?
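One way to see why the horizontal number floats: it falls out of signal bandwidth, not a pixel grid. A common rule of thumb (my own illustration, not from the thread; the exact active-line time and bandwidth figures vary by source) converts bandwidth into "TV lines per picture height":

```python
# Rule-of-thumb conversion from video bandwidth to horizontal resolution,
# in "TV lines per picture height" (TVL). Illustrative only; the active
# line time and bandwidth figures are approximate.

ACTIVE_LINE_US = 52.6   # approx. active (visible) portion of an NTSC line, in microseconds
ASPECT = 4 / 3          # TVL is normalised to picture height, so divide out the aspect ratio

def tvl(bandwidth_mhz):
    # Each cycle of bandwidth resolves one black/white line pair, i.e. 2 elements.
    elements_per_line = 2 * bandwidth_mhz * ACTIVE_LINE_US
    return round(elements_per_line / ASPECT)

print(tvl(4.2))   # full NTSC luma bandwidth -> ~331 TVL
print(tvl(0.5))   # ballpark chroma bandwidth -> ~39 TVL
```

Swap in a different bandwidth (VHS luma, S-video chroma, whatever the hardware actually delivers) and you get a different number, which is exactly the point: the "resolution" depends on the whole signal chain.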

Colour TVs, with a finite number of coloured phosphor strips, have an upper limit of resolution, but even then the peak input res probably doesn't match the TV's capabilities, so what is the resolution: the input res or the max capable res on the screen?  And which screen?  Larger screens had more phosphor elements than smaller ones, but could still display the same image.  Really small screens tended to lose more to overscan, and how do you count THAT?

There's no standard horizontal resolution.  There can't be.  So your question, I'm sorry, cannot be answered.

viletim

zedrein,

I had a look through my library of obsolete technical books and found some information on CRT display resolution. Here are the relevant pages from Electronic Displays by Sol Sherr (1979).

http://users.tpg.com.au/vile88/temp/ED_excerpt-CRT_resolution.rar


zedrein

#9
EDIT: Previous question was too assy.