LCDs ARE interlaced!

Started by blackevilweredragon, February 22, 2007, 01:37:39 PM


blackevilweredragon

This has been a long project for me, and these pictures are NOT the only proof.  They're just the only proof I can explain right now.

I did a year-long study to see if LCDs CAN interlace, since everyone says they can't, because they don't even refresh.

However, I didn't feel this was true, because there seemed to be no way the limited ribbon cable connecting an LCD could carry the WHOLE picture at a given time (usually 60 times a second).

So, after dissecting and experimenting, I found the BEST way to catch this in the act was to use a high-speed camera with high ISO settings (film too!)

And what I found is that LCDs CAN be interlaced; in fact, some popular devices HAVE interlaced LCDs, though not the BIG LCDs anymore (they use a very different technique)

The following use interlaced LCDs:
iPod Video
PSP
Gameboy Advance (and SP model)
Gameboy DS (both screens)
Most cell phones
Most cameras' built-in viewfinder LCDs

And here's the small, but very apparent proof..

Here, in this shot from my iPod Video, it's not as visible, as the backlight washes the interlacing effect out, but you can sorta see it on the kid's shirt..  One field IS slightly brighter than the other.  This is because at that very instant, only that particular field (set of lines) was being "lit" (for lack of a better word, as it's not a CRT).
http://blackevilweredragon.spymac.com/ipodscan.jpg

But here, in the GBA picture, since it's not a backlit display and relies on a light source in FRONT of it, the flash from the high-speed camera was more than enough to make the interlacing very apparent..
http://blackevilweredragon.spymac.com/gbascan.jpg

And, on refresh rates and LCDs:  A lot of video experts agree that LCDs do not have a refresh rate..  But with this evidence, that claim becomes moot; LCDs very much do have refresh rates..

NFG

#1
That's not interlacing, that's just how they draw the screen.  My LCD PC monitor uses this kind of flicker to increase the number of available colours (many/most LCDs do not actually show 16 million colours; they just flicker between two other colours to approximate a third).

What you describe is interlacing in the same way a flickering 60Hz lightbulb is interlaced.  It's NOT interlacing.

Interlacing on TVs is used to approximate a higher resolution in two passes, whereas an LCD simply uses more than one pass to refresh the pixels.

LCDs don't have STANDARDIZED refresh rates, but they still refresh.  Most PC monitors are locked to 60Hz - go ahead and try to change that in your setup, you typically can't.  They have to have a refresh rate; to suggest otherwise is insanity!  They're not capable of re-drawing the screen a billion times per second, and they clearly draw the screen more than once per second, so...  Even if it's not called a 'refresh' in the same way a CRT refreshes the screen, it still has to draw the screen a given number of times per second.
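(Editor's note: a minimal sketch of the colour-flicker trick described above, usually called frame rate control or temporal dithering. The 6-bit panel depth and four-frame cycle are illustrative assumptions, not details from the thread.)

# Approximate an 8-bit colour level on a 6-bit panel by alternating between
# the two nearest 6-bit levels over several frames; the eye averages them out.
def frc_frames(value_8bit, n_frames=4):
    """Return the 6-bit level shown on each of n_frames for one sub-pixel."""
    level, frac = divmod(value_8bit * 63, 255)        # nearest lower 6-bit level + remainder
    extra = round(frac / 255 * n_frames)              # how many frames get the higher level
    return [min(level + 1, 63) if i < extra else level for i in range(n_frames)]

print(frc_frames(100))   # [25, 25, 25, 24] -> averages 24.75/63, close to 100/255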

blackevilweredragon

QuoteThat's not interlacing, that's just how they draw the screen.  My LCD PC monitor uses this kind of flicker to increase the number of available colours (many/most LCDs do not actually show 16 million colours; they just flicker between two other colours to approximate a third).

What you describe is interlacing in the same way a flickering 60Hz lightbulb is interlaced.  It's NOT interlacing.

Interlacing on TVs is used to approximate a higher resolution in two passes, whereas an LCD simply uses more than one pass to refresh the pixels.
That's just the thing..  I know what you are describing, but that's not what's happening here...

what you are describing would leave an effect like this:  http://blackevilweredragon.spymac.com/biglcdscan.jpg

On my GBA, I pulled one wire off the ribbon (I will not identify which wire, so people don't try and destroy their GBAs), and the effect stopped; only the even field showed..  I had removed the wire that tells the LCD controller where to put the fields at a given time...

blackevilweredragon

#3
Oh, and in the case of the iPod video, its way of doing more colors is a random dithering pattern; THAT I can get a picture of too, in dark scenes..

EDIT:  http://blackevilweredragon.spymac.com/ipodcolor.jpg

Notice in the arm, you can see some small random dithering taking place; that's how the iPod achieves more colors...

But that's completely separate from the interlacing effect I'm seeing..  Don't worry, I didn't spend a year studying this without taking those things into consideration ;)

This page even backs up that the GBA LCD is in fact interlaced..

http://www.beyond3d.com/forum/archive/index.php/t-2385.html
QuoteActually, the AGB screen outputs 240x160 *interlaced*. However, I can't think of any reason why this device wouldn't output the full 240x160 pixels every 60th of a second, since you code the AGB like the screen was "progressive" anyway.

viletim!

No, LCD displays are not normally interlaced. The only reason to generate interlaced video these days is to be compatible with the TV system. Interlacing looks horrible! No one in their right mind would incorporate it into a modern (ie, post valve era) display technology.

Maybe what you're seeing is an artefact of an inversion (anti-polarisation) technique. Have a read of this page.

I'd ignore any advice from 'video experts' who claim LCDs don't refresh. They might be confusing LCD technology with flip book technology :).
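(Editor's note: a rough sketch of the line-inversion idea viletim points to. Alternate rows are driven at opposite polarity and the pattern flips every frame, so a very short exposure that catches one frame sees alternate lines slightly brighter, which photographs a lot like interlacing. The 3% brightness difference is an invented illustration value, not a measurement.)

def row_polarity(row, frame):
    """+1 or -1 drive polarity for a row on a given frame (H-line inversion)."""
    return 1 if (row + frame) % 2 == 0 else -1

def short_exposure(target, row, frame, pol_gain=0.03):
    """Brightness a very short exposure might record for one pixel of a flat grey field."""
    return target * (1 + pol_gain * row_polarity(row, frame))

for frame in (0, 1):
    print(frame, [round(short_exposure(0.5, row, frame), 3) for row in range(4)])
# frame 0: [0.515, 0.485, 0.515, 0.485]; frame 1: the bright/dim rows swap.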

blackevilweredragon

QuoteNo, LCD displays are not normally interlaced. The only reason to generate interlaced video these days is to be compatible with the TV system. Interlacing looks horrible! No one in their right mind would incorporate it into a modern (ie, post valve era) display technology.

Maybe what you're seeing is an artefact of an inversion (anti-polarisation) technique. Have a read of this page.

I'd ignore any advice from 'video experts' who claim LCDs don't refresh. They might be confusing LCD technology with flip book technology :).
Well, that's the thing..  On any old Gameboy Advance, switch it off..  Notice one field will still be displayed and fading away, while the other field will be completely gone..

And here's my iPod video, super close-up, playing a 60fps video with this girl's arms flailing around very fast:  http://blackevilweredragon.spymac.com/ipodinterlace.jpg

that looks a lot like interlacing..  the video it was playing was 320x240 non-interlaced...

Guest

I noticed the same thing a long time ago on my GBA (non-backlit) when playing outside, using the sun to light the screen. I also noticed the same thing on my cell phones and digital camera. It looks just like interlacing. As for the iPod video, it looks like a TV show, which would have originally been recorded interlaced, then deinterlaced and scaled down to 240p for the iPod. Those are the same artifacts you see when watching a "high motion" DVD deinterlaced with "weave".

Joe Redifer

As for your picture ( http://blackevilweredragon.spymac.com/ipodinterlace.jpg ) that looks a lot like a progressive image to me that was not properly deinterlaced.  I see this all the time when interlaced video is brought into a progressive environment.  I even see this on TV where the local news originates in 480i and the channel itself is broadcast in 720p.  Notice on your picture how both the odd and even "fields" are the same brightness.

Examples of this can be seen anytime an interlaced frame is displayed on a progressive screen.  Like so:  http://pixelcraze.film-tech.net/crap/omnimax.jpg
The screen just doesn't have the ability to make every other line appear as a different moment in time.  Instead it displays them all at once, and we get "combing" artifacts.  No fun.  Interlacing is evil.
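(Editor's note: a toy illustration of the combing Joe describes. "Weave" stacks two fields captured 1/60s apart into one frame, so anything that moved between fields grows jagged alternating-line edges. The moving bar below is an invented example, not an image from the thread.)

WIDTH, HEIGHT = 16, 8

def field(t, parity):
    """Every other line of a frame in which a 4-pixel-wide bar sits at x = 4*t."""
    x0 = 4 * t
    return [''.join('#' if x0 <= x < x0 + 4 else '.' for x in range(WIDTH))
            for _ in range(parity, HEIGHT, 2)]

def weave(even_rows, odd_rows):
    """Interleave an even field and an odd field into one full-height frame."""
    frame = []
    for even_line, odd_line in zip(even_rows, odd_rows):
        frame += [even_line, odd_line]
    return frame

# Even field captured at t=0, odd field at t=1 (the bar has moved 4 pixels):
for line in weave(field(0, 0), field(1, 1)):
    print(line)
# Alternate lines show the bar in two positions -> the jagged "comb" edges.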

blackevilweredragon

QuoteAs for your picture ( http://blackevilweredragon.spymac.com/ipodinterlace.jpg ) that looks a lot like a progressive image to me that was not properly deinterlaced.  I see this all the time when interlaced video is brought into a progressive environment.  I even see this on TV where the local news originates in 480i and the channel itself is broadcast in 720p.  Notice on your picture how both the odd and even "fields" are the same brightness.

Examples of this can be seen anytime an interlaced frame is displayed on a progressive screen.  Like so:  http://pixelcraze.film-tech.net/crap/omnimax.jpg
The screen just doesn't have the ability to make every other line appear as a different moment in time.  Instead it displays them all at once, and we get "combing" artifacts.  No fun.  Interlacing is evil.
If the iPod LCD did show 640x480, this would be true, but it doesn't..

Take an interlaced 640x480 image and scale it down to the iPod's 320x240 in whatever image application..  The interlacing is gone; it's been automatically deinterlaced by the simple fact that you scaled down to the field's vertical size..

I told you, I put EVERYTHING into consideration...  The video WAS non-interlaced anyway; it came from a non-interlaced source to begin with..

And to the other post: yes, exactly, bring a GBA out into the sunlight, MORE proof it's interlaced..  That's how I stumbled onto this theory...
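(Editor's note: a rough check of the scaling point above. Halving the vertical resolution of an interlaced frame either keeps just one field (nearest-neighbour) or blends the two fields together (averaging), so the comb pattern from the previous example disappears either way. The numbers are invented toy values.)

even_field_line = [10, 10, 10, 10]   # line from the field captured first
odd_field_line  = [10, 90, 90, 10]   # same line one field-time later (object has moved)

woven = [even_field_line, odd_field_line]             # adjacent lines in the full frame

dropped = woven[0]                                     # nearest-neighbour: keep one field
blended = [(a + b) / 2 for a, b in zip(*woven)]        # averaging: mix both fields

print(dropped)   # [10, 10, 10, 10] -> clean, one moment in time
print(blended)   # [10.0, 50.0, 50.0, 10.0] -> ghosted, but no comb lines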

NFG

As mentioned in that link from viletim, it's NOT interlaced, it just refreshes the screen every second line at a time.  What you are seeing is not proof of interlacing, it's proof that the LCD doesn't refresh every line on every pass.


blackevilweredragon

QuoteAs mentioned in that link from viletim, it's NOT interlaced, it just refreshes the screen every second line at a time.  What you are seeing is not proof of interlacing, it's proof that the LCD doesn't refresh every line on every pass.
But why is it only doing one field at a time, then doing the other?  When I move my mouse cursor on my laptop and take a picture at high ISO speeds (I'm talking a thousand for an ISO), you see split fields on the moving mouse cursor..

NFG

Because that's how it works.  Perhaps it's more energy efficient, perhaps it reduces the amount of screen-drawing hardware, I don't know.  Interlacing is a way of increasing the apparent resolution, the LCD isn't making higher-res from this action, it's economizing something somewhere.

blackevilweredragon

QuoteBecause that's how it works.  Perhaps it's more energy efficient, perhaps it reduces the amount of screen-drawing hardware, I don't know.  Interlacing is a way of increasing the apparent resolution, the LCD isn't making higher-res from this action, it's economizing something somewhere.
Bandwidth, like I stated...

And this is the other reason older TVs were interlaced (I got this information from an old TV tech)..  Back in the day, they needed a way to get high resolution over the airwaves without too much bandwidth..  They chose interlacing..  (originally broadcasting was 240p; they then decided that was crap and needed more resolution)

It's not just a way of getting higher resolution out of a CRT; it's also an early form of compression..  (early QuickTime codecs interlaced at higher resolutions to cut the file size down by half)

LCDs use the same trick because those small ribbons couldn't possibly do the whole screen at once...

NFG

#13
You may be right about bandwidth, but it's:
a. not INTERLACING
b. not increasing the resolution

I don't think bandwidth is the problem here.  Let's do some math:

38,400 pixels on a GBA screen, that's 115,200 sub-pixels (red, green, blue).  Five bits per sub-pixel to achieve the total 32,768 colours.  Round that 15 bits per pixel up to 16 bits.

So, 16 x 38,400 = 614,400 bits, or about 76.8kB per screen. Figure 60fps = 36,864,000 bits/s, or about 4.6MB/s transfer.

That's really not that fast, especially over a wide data bus.  I mean, a USB2 port can get more than 8x that, over two wires.  8-16 (or more!) wires on an LCD will get WAY more bandwidth.
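(Editor's note: the arithmetic above spelled out as a quick script; the figures used here, a 240x160 screen, 15-bit colour rounded to 16, and 60 frames per second, are the ones given in the post.)

width, height = 240, 160
pixels = width * height                     # 38,400 pixels, i.e. 115,200 sub-pixels
bits_per_pixel = 16                         # 15-bit colour rounded up to 16

bits_per_frame = pixels * bits_per_pixel            # 614,400 bits, about 76.8 kB per screen
bits_per_second = bits_per_frame * 60               # 36,864,000 bits/s at 60 fps
print(round(bits_per_second / 8 / 1_000_000, 1), "MB/s")   # 4.6 MB/s - modest for a parallel bus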


blackevilweredragon

QuoteYou may be right about bandwidth, but it's:
a. not INTERLACING
b. not increasing the resolution

I don't think bandwidth is the problem here.  Let's do some math:

38,400 pixels on a GBA screen, that's 115,200 sub-pixels (red, green, blue).  Five bits per sub-pixel to achieve the total 32,768 colours.  Round that 15 bits per pixel up to 16 bits.

So, 16 x 38,400 = 614,400 bits, or about 76.8kB per screen. Figure 60fps = 36,864,000 bits/s, or about 4.6MB/s transfer.

That's really not that fast, especially over a wide data bus.  I mean, a USB2 port can get more than 8x that, over two wires.  8-16 (or more!) wires on an LCD will get WAY more bandwidth.
Ask Nintendo, they would even tell you it's an interlaced LCD..  Go bring the GBA out in direct sunlight and you tell me that's not interlacing..  Once you see that effect, you'll know it IS interlacing...  There's just no denying it once you see it for yourself..

kendrick

Is it functionally interlacing if we're not talking about a raster beam? On an LCD you can technically draw any pixel at any time, and you're not tied to a hardware clock the way you would be with a cathode tube. Moreover, the point of interlacing on a tube is because you don't get to select the sequence of pixel updates (as you would on, say, a vector screen.)

Sure, you can draw every other line if you want to (and if your source video happens to come from a standard broadcast source) but calling it interlacing is kind of like calling tofu a kind of cheese. Yes, it's a fermented protein product served in blocks and slices, but it's still not functionally cheese because we don't use it that way.

-KKC

blackevilweredragon

#16
QuoteIs it functionally interlacing if we're not talking about a raster beam? On an LCD you can technically draw any pixel at any time, and you're not tied to a hardware clock the way you would be with a cathode tube. Moreover, the point of interlacing on a tube is because you don't get to select the sequence of pixel updates (as you would on, say, a vector screen.)

Sure, you can draw every other line if you want to (and if your source video happens to come from a standard broadcast source) but calling it interlacing is kind of like calling tofu a kind of cheese. Yes, it's a fermented protein product served in blocks and slices, but it's still not functionally cheese because we don't use it that way.

-KKC
it's interlacing in the sense that the ribbon that connects the LCD can't possibly update the WHOLE screen at once..  The GBA thus sends one field, then the other, like a CRT would...

If you play a GBA outside on a bright sunny day, the LCD will actually FLICKER..  My laptop's LCD flickers when I play a Sega emulator on it with scan-lines..  It just can't update the whole screen at once..

Newer/bigger LCDs can, but the smaller embedded ones can't..  This is why DVI has multiple pins, to prevent that kind of "compression", as I call it...

In my GBA screenshot, it's obvious..  That's not a screen-door effect at all.. Notice that one field is dimmed out, because the one that's brighter was the one that was updating at that second..

I didn't spend one year studying this effect for nothing...

Using high ISO cameras, I will take a GBA picture, where one field would appear completely gone!

EDIT:  Here are the high ISO settings.

ISO 400:  http://blackevilweredragon.spymac.com/gba1.jpg

ISO 1600:  http://blackevilweredragon.spymac.com/gba2.jpg

I seriously think this is a form of interlacing..  I can't push this theory much further, because any more proof I offer will just get shot down..

Here's a video..  You really can't tell because my video camera sucks, but you can see scanlines moving in this video..  If you saw it in real life, the white border on the pop-up IS flickering like TV interlacing..

http://blackevilweredragon.spymac.com/gbaint.avi

NFG

OK, this is getting stupid.  I keep telling you why it's not interlacing and you keep saying "but it is, here's another photo!"

That page viletim linked to EXPLAINS WHAT IT IS.  GO READ IT, FFS.

Especially the part where it explains, in excruciating detail, that it's done this way to prevent 30Hz flicker.

It's not INTERLACING, it's FLICKER REDUCTION.

And so help me, if you ignore me again and repeat yourself again I'm gonna lock the thread, ban you and burn down your house.

blackevilweredragon

I wasn't seeing the whole page...  I kept loading it while at my technical college, and it kept getting cut off where it shows the clock-style display and the matrixed-style display..

I apologize, I wasn't seeing the full page..  Though the effect LOOKS like interlacing on the iPod (the "staggered" look..  and I know my video wasn't interlaced)

come on lawrence, you should know me by now, i don't try to start crap, I'm no GZeus (who, by the way, on other forums keeps talking trash about this forum, and about you personally---on the Sega16 forum)

NFG

Quotecome on lawrence, you should know me by now, i don't try to start crap,
No, but you're very stubborn.  =P

As for Gzeus, let him do what he likes.  As long as it's not here, I don't care.  =)

blackevilweredragon

Quote
Quotecome on lawrence, you should know me by now, i don't try to start crap,
No, but you're very stubborn.  =P

As for Gzeus, let him do what he likes.  As long as it's not here, I don't care.  =)
Sometimes stubbornness isn't a bad thing ;)

Endymion

bewd, you're putting a lot of effort into trying to categorize what you are seeing. But don't you think the engineers who made this thing have put at least as much thought into it? If they tell you it's not interlacing by its basic operation what reason do you have to disbelieve them? Because you see some weird effects outside on a bright sunny day that look similar to what you've seen with interlacing while using a CRT? C'mon man.

blackevilweredragon

#22
Quotebewd, you're putting a lot of effort into trying to categorize what you are seeing. But don't you think the engineers who made this thing have put at least as much thought into it? If they tell you it's not interlacing by its basic operation what reason do you have to disbelieve them? Because you see some weird effects outside on a bright sunny day that look similar to what you've seen with interlacing while using a CRT? C'mon man.
Thing is, Nintendo, when asked, says it's interlaced..  That's why I pushed this theory so hard, because Nintendo said it's true...

Apple says it's not, Sony says it is (for the PSP), Nokia for my cell phone says it's "semi interlaced", whatever the heck that means...

no one seems to know for sure...

But after carefully reading that LCD page, I am 100% sure now that what I am seeing is H-line inversion..  which looks an awful lot like interlacing...

kendrick

Remember, these are the same companies who invented the 'progressive' terminology for HD video that's a description of... absolutely nothing. Other BS video terms include 'blast processing' and 'mode 7' and 'true color.' Describing the Gameboy screen as interlaced loses them nothing, and they don't take a hit because we'll go buy it anyway.

The people who answer these kinds of customer queries aren't engineers. They're public relations guys who should be working for politicians, because they answer yes if they think it will earn them a competitive advantage. If you asked them if Wii could cut grass, they'd find the words to make it seem possible. And don't even get started on the two-sided nature of Sony's mouth.

-KKC

Joe Redifer

#24
Quoteit's interlacing in the sense that the ribbon that connects the LCD can't possibly update the WHOLE screen at once..

I had quite a bit typed up around 24 hours ago, but when I pressed "Submit" the site went away and did not restore itself before I went to bed.  But basically I wanted to say that I do not believe that the ribbon cable is a limiting factor in the least.  I have been able to send full composite video over a single strand of an IDE cable and it worked fine just as long as the length is short (no more than a few inches).  My TurboGrafx-16 RGB mod uses a single IDE strand for each color going to the transistors and then to the DIN which I installed.  They must run from the TG-CD expansion bus all the way to the front of the TG-16 (underneath it) where the amp is, and then over to the side where the DIN is.  It's quite the long trip, actually.  No problems.

blackevilweredragon

Quote
Quoteit's interlacing in the sense that the ribbon that connects the LCD can't possibly update the WHOLE screen at once..

I had quite a bit typed up around 24 hours ago, but when I pressed "Submit" the site went away and did not restore itself before I went to bed.  But basically I wanted to say that I do not believe that the ribbon cable is a limiting factor in the least.  I have been able to send full composite video over a single strand of an IDE cable and it worked fine just as long as the length is short (no more than a few inches).  My TurboGrafx-16 RGB mod uses a single IDE strand for each color going to the transistors and then to the DIN which I installed.  They must run from the TG-CD expansion bus all the way to the front of the TG-16 (underneath it) where the amp is, and then over to the side where the DIN is.  It's quite the long trip, actually.  No problems.
digital video vs analog video = not the same

blackevilweredragon

Quote'mode 7'
-KKC
Mode 7 is a real term actually.  ;)

kendrick

Yes yes, Mode 7 describes the Super Nintendo graphics state where a scrolling texture can be rotated in 3D. But the term has been misused to describe the same type of graphics on the Gameboy Advance and in completely unrelated Sega hardware. It also doesn't tell you anything about what's actually happening, only that it's an arbitrary label on an arbitrary list of graphics modes. As Nintendo doesn't set the standard for describing 3D graphics, they don't get to decide that mode 7 means anything to the rest of the world.

-KKC

blackevilweredragon

QuoteYes yes, Mode 7 describes the Super Nintendo graphics state where a scrolling texture can be rotated in 3D. But the term has been misused to describe the same type of graphics on the Gameboy Advance and in completely unrelated Sega hardware. It also doesn't tell you anything about what's actually happening, only that it's an arbitrary label on an arbitrary list of graphics modes. As Nintendo doesn't set the standard for describing 3D graphics, they don't get to decide that mode 7 means anything to the rest of the world.

-KKC
i know that it only applies to the SNES, but in the SNES, mode 7 is a term for the graphics hardware, mainly in programming it, to tell the SNES how to display a sprite, and how it's to manipulate it (more like what "set" of commands you want available, in my own terms)

Joe Redifer

#29
How do you know they are transmitting the video to the LCD digitally?  To me that would seem incredibly inefficient for a budget-priced system.

Also, while speaking of "Mode 7", that only refers to backgrounds.  The SNES cannot scale and/or rotate a sprite.  

blackevilweredragon

QuoteHow do you know they are transmitting the video to the LCD digitally?  To me that would seem incredibly inefficient for a budget-priced system.

Also, while speaking of "Mode 7", that only refers to backgrounds.  The SNES cannot scale and/or rotate a sprite.
because an LCD doesn't understand analog signals..  it NEEDS digital signals that IT knows...

NFG

QuoteHow do you know they are transmitting the video to the LCD digitally? To me that would seem incredibly inefficient for a budget-priced system.
Systems that use, for example, analogue RGB or composite to send video to the LCD are the exception, not the rule.  Usually these were existing systems, like the PC Engine or MegaDrive or Saturn or PS1, and the LCD converted this signal to digital before use.

It's more economical to keep it digital the whole way, like all the Nintendo portables.

blackevilweredragon

Quote
QuoteHow do you know they are transmitting the video to the LCD digitally? To me that would seem incredibly inefficient for a budget-priced system.
Systems that use, for example, analogue RGB or composite to send video to the LCD are the exception, not the rule.  Usually these were existing systems, like the PC Engine or MegaDrive or Saturn or PS1, and the LCD converted this signal to digital before use.

It's more economical to keep it digital the whole way, like all the Nintendo portables.
Exactly..

I was doing some more research into LCDs, and is it just me, or is the Game Gear analog-style?  It almost seems like it's NOT digital.  (perhaps because of how the TV tuner worked on it)

Joe Redifer

#33
OK then.  LCDs certainly aren't my specialty.

But don't you think that the subject title of this thread should read "LCDs CAN interlace!" instead of "ARE interlaced"?  My 1680x1050 Apple Cinema Display (one of the best monitors I've ever seen, LCD or not) happens to be LCD and certainly is not interlaced.

NFG

The subject of this discussion might more accurately be "LCDs ARE interlaced!  (not!)"

Jon8RFC

I don't mean to start up any arguments, but I'd like to ask a bit on this subject (found this thread while searching the forums for something else).  The link viletim posted is dead, but I'm curious to understand this a bit more.  Like I said, I don't mean to start up something negative--are some cell phone LCDs interlaced?  I ask because I don't notice the phenomenon on all cell phone LCDs.  What I notice is, when I rotate the phone flat along the X-Y axes, I see something very similar (or identical?) to what I see when I rotate my hand against an interlaced TV display, or rotate my head in a similar manner... however, I don't see this on all cell phone LCDs, and can only see it if the display, in relation to my eyes, is rotated in continuous motion.

I can't find good matches on Google for "inversion anti-polarization".  I'm familiar with polarized and non-polarized light, so expanding beyond that is what I'm curious about, since inversion anti-polarization sounds like a 180-degree cancellation rather than a 90-degree cancellation... I'm confused.

Also, how is refreshing every second line different from a field (or every other line, simultaneously)?  The terminology used there is throwing me off.

RGB32E

#36
Sounds like how some LCDs are driven so that they use less battery power...  ???  Also, the PVML2300 has an interlaced mode... ;D

http://pro.sony.com/bbsc/ssr/cat-monitors/cat-broadcastevaluationmonitors/product-PVML2300/
QuoteVariety of display modes (Interlace Display, Black Frame Insertion)

But typically HDTVs and monitors are not driven in an interlaced mode.  I suppose the distinction can easily be confused, and from the posts on this thread, I don't think some will ever get it...  :P

On top of that bewd likes to run with misinterpreted information.... ick!  ;D

NFG

A real interlaced CRT delays the sync of every second field so that the beam is a little farther down the screen before drawing it.  LCDs cannot magically shift their pixels around, they're fixed, so any reference to 'interlacing support' would refer only to interlace emulation or interlaced-input support.

The flickering you see on your LCD screen is not interlacing, even if it looks like it.
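(Editor's note: the numbers behind that half-line offset, for reference. These are standard NTSC figures, not taken from the post; the odd half line per field is what makes the second field's lines land between the first field's lines.)

lines_per_frame = 525
fields_per_frame = 2
lines_per_field = lines_per_frame / fields_per_frame    # 262.5 - note the odd half line
field_rate = 60                                          # fields per second (approx.)
frame_rate = field_rate / fields_per_frame               # ~30 complete frames per second
print(lines_per_field, frame_rate)                       # 262.5 30.0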

cgm

The flickering he is seeing could be the backlight. Remember that LCD backlights are usually AC-driven (via an inverter) fluorescent bulbs that flicker similarly to big ceiling lights.

l_oliveira

I noticed that if you reduce the voltage on the LCD driver (the GBA SP and NDS have pots to set that up under the battery cover), the LCD starts to "flicker".
Perhaps lower voltage causes it to slow down. Nothing to do with the topic, but that might cause an LCD to "look like" an interlaced display if it was displaying interlaced video.

LCDs without built-in analog video circuitry require parallel digital RGB which, depending on how many colors the panel can do, can amount to up to 32 wires just for the data, not counting power and synchronization signals.
Some LCDs also have an SPI/serial interface that allows the host system to connect and set up other parameters such as contrast and color temperature offset.
Others use a serial low-voltage differential interface to encode the RGB data and save on wires while keeping the same bandwidth as the old bunch of wires.


Now, the point is that the screen is drawn using a flow of serial data similar to raster video, and as mentioned previously the pixels are fixed, so there's no point in drawing odd/even lines because the panel is usually fast enough not to need such a method.
The panel is only drawn in a method similar to a "raster" because it would be impractical to draw all pixels at the same time, due to the way it communicates with the source of video. But surely the way it works internally could allow for that, if the chips that drive the LCD panel's rows and columns had memory for a few video frames.  :)
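(Editor's note: a minimal sketch of the line-at-a-time drive described above. No real panel's interface looks exactly like this; it only shows the whole-row-latch, top-to-bottom order that makes an LCD look raster-like even though the pixels hold their state between passes.)

def refresh(panel, new_frame):
    """One refresh pass: update the panel one whole row at a time."""
    for row_index, row_data in enumerate(new_frame):
        # Row driver selects row_index; column drivers present that row's data.
        panel[row_index] = list(row_data)    # the whole row latches at once
        yield row_index                      # order in which rows were written

rows, cols = 4, 6
panel = [[0] * cols for _ in range(rows)]
frame = [[r * 10 + c for c in range(cols)] for r in range(rows)]

print(list(refresh(panel, frame)))   # [0, 1, 2, 3] - strictly top to bottom
print(panel[2])                      # [20, 21, 22, 23, 24, 25] - row updated in one step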