HA! I've dreamed up a question that may stump even you, Lawrence! You claimed in a prior thread that when a CRT with over 1,000 lines of vertical resolution (or more phosphor rows) is fed, let's say, a 240-line image, the illuminated lines don't change in width; the only thing that changes is that there are more non-illuminated phosphors between the illuminated ones, which makes the darkened "scanlines" we speak of more defined. But what happens when you send a true 480 lines of interlaced resolution to a high-resolution display? I can't see there being visible non-illuminated lines between every field... that would just look bizarre! Also, what sort of math goes into scaling a 224-line image onto a CRT that has 600 lines of resolution? Obviously 600 is not evenly divisible by 224, so there can't be an equal number of non-illuminated lines between every illuminated line (in other words, some illuminated lines have more non-illuminated lines between them than others), making for an awkward-looking picture.
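To make the uneven-spacing worry concrete, here's a little sketch of the discrete mapping being described: assign each of the 224 source lines to the nearest of 600 display lines and look at the gaps between consecutive illuminated lines. (This is just an illustration of the mental model in question, not a claim about how a CRT actually addresses its phosphors.)

```python
# Map each of 224 source lines to the nearest of 600 display lines
# and measure the spacing between consecutive illuminated lines.
SRC, DST = 224, 600

positions = [round(i * DST / SRC) for i in range(SRC)]
gaps = {positions[i + 1] - positions[i] for i in range(SRC - 1)}
print(sorted(gaps))  # the spacing comes out uneven: gaps of 2 and 3
```

The gap alternates irregularly between 2 and 3 display lines, which is exactly the "between some illuminated lines there are more non-illuminated lines than others" situation.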
Basically what I am getting at is this: I don't understand how a CRT can scale certain resolutions without making the image look uneven or "patchy", unless it changes the actual dimensions of the incoming lines (i.e. widens them to fill the screen, making the picture more symmetrical).
I can see how a standard 480-line display can scale a 224-line image perfectly neatly: each field has 240 lines, so the leftover 16 lines simply go un-illuminated, split between the top and bottom of the screen, making for a slightly letterboxed picture. But when sending that same 224-line image to a screen with 600 lines, I don't see how that screen can scale in the same neat and even way.
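The "neat" case above is just subtraction, assuming 240 lines per field: the unused lines split evenly between the top and bottom borders.

```python
# Letterbox arithmetic for a 224-line image in a 240-line field:
# the leftover lines divide evenly between top and bottom.
FIELD, IMAGE = 240, 224

blank = FIELD - IMAGE        # 16 unused lines in total
top = bottom = blank // 2    # 8 at the top, 8 at the bottom
print(blank, top, bottom)
```

Because 16 divides evenly in two, every illuminated line keeps the same spacing and only the borders absorb the difference, which is why this case looks clean while the 600-line case does not.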