VGA kit Linux for PS2

Started by Lagi, January 17, 2007, 04:42:25 AM

Previous topic - Next topic

Lagi

Hi everybody,

For many weeks, a question has been burning my lips. Do you know whether all games are displayed in progressive scan when you use the VGA cord bundled with the Linux kit for PS2? Because to my knowledge, there are not many games compatible with progressive scan, at least in France. All I know of are Soul Calibur 2, Jak and Daxter, Tekken Tag, Shadow of the Colossus, and the Japanese version of Gran Turismo 4, which displays 1080i.

But for the other games, is there automatic conversion to a VGA display when you hook the PS2 up to a CRT monitor with games like Devil May Cry, for instance, or the Final Fantasy series?

Moreover, something interesting I noticed is that with the official component cord, PlayStation games work, which is not the case on the original PlayStation itself. So if you want a component picture from PlayStation games, just buy a PS2.

I guess everybody here knew that already. I'll see myself out. lol

Thanks.

RARusk

The VGA cable is simply an adapter that allows the PS2 to be connected to a monitor. It does not make anything Progressive Scan. The monitor that you attach to the PS2 must be a Sync-On-Green monitor.

As you listed, there are a few games that have the ability to display in Progressive Scan. There is also a recently released product called Xploder HDTV that will allow some other games that do not have the ability to display Progressive Scan on their own to be played in Progressive Scan. Xploder also has just released (or re-released) their own VGA cable for use with their product.
Console hacking is like sex. For best results you got to know where to poke.....

Dr.Wily

#2
Quote: It does not make anything Progressive Scan

VGA is progressive. If a game supports progressive scan, the VGA cord is suitable for displaying the picture on a VGA monitor. But only if the game supports progressive scan; if the game runs only in interlaced mode, VGA is useless. Moreover, many CRT monitors support an SOG signal. For LCDs it's another story...
@+

       Dr.Wily

Simm's Club - French LAN Gaming (PC & Consoles) : http://www.asso-sc.com


RGB32E

#3
Also, to get VGA output, you must change the system settings on your PS2 from component to RGB.

With the PS3, it's a little different... since you can set up the system to output VGA for EVERYTHING (PS1, PS2, and PS3 titles)!!!!  The same cable can be used between the PS2 and PS3 for VGA.  I have a breakout cable ready, if you'd like to purchase one.  I also have the original Sony VGA cable that came with my PS2 Linux kit  :) .

HENCE, if you want to play ALL of your PS1/2/3 titles via VGA, buy a PS3 (if you haven't already), and buy my cable!!!  :P  

qz33

Could you please tell me more about the PS3's VGA capabilities?  Does it render all games (ps1,2,and 3) playable on a VGA monitor?  How does it look for the different generations of games?  

I thought some interlaced games are made with special features so they have to be displayed interlaced to show properly.  Just please give any information you have.

Endymion

Interesting, I just tried to take my PS3 to VGA and it didn't want to do it. How exactly did you get it going RGB32E?

antron

Quote: If you install a Matrix Infinity modchip (or a clone of said chip), you can play any and every game in 480p on a standard monitor/HDTV using a VGA cable (if you enable the VGA mode in the chip's bios menu).
I am reading that not every game will come up in 480p.  And that some have major glitches, or don't work at all.

RGB32E

#7
Quote: Interesting, I just tried to take my PS3 to VGA and it didn't want to do it. How exactly did you get it going RGB32E?
Ok, a friend brought over his PS3, and I connected it to one of my setups... here's what I remember.

1. Go into the video settings mode.
2. Choose the Multi-AV output/SCART
3. Select 480p
4. Confirm/test the selection.

Result: VGA output for all game media types! PS1, 2, 3.

I can't guarantee 100% accuracy (it's the basic idea though), as it was a month or so ago (but it does work if you follow the right steps).  Also, the process is much easier if you feed composite/S-Video to another monitor/TV, since you won't necessarily be able to see anything prior to the correct settings.  I used one of Sony's Multi-AV adapters between the system and the RGB cable.  Hope this helps.... :)

You turkeys and your mod chips! :P

And once you set the PS3 to VGA (SoG) mode, it remembers the setting, and the dashboard displays at that res.

The sucky thing though is that you can't select above 480p.  Perhaps it's related to copy protection for DVD and Blu-ray.  (NO RGB for these above 480p on their console!)  But I haven't confirmed DVD/BD playback in RGB though... It might not be possible at all! :(

Guest

I am looking for a way to play PS2 imports in VGA.  Also, with a hard drive I can eliminate disc loading.
So, back to the PS2: is the Matrix chip better than the Blaster disc?

kendrick

Guest, if your question isn't directly related to the VGA features of the Linux kit, please start a new thread in the Import Mods section of the forum.

-KKC

RGB32E

#10
Quote: Could you please tell me more about the PS3's VGA capabilities?  Does it render all games (ps1,2,and 3) playable on a VGA monitor?  How does it look for the different generations of games?

I thought some interlaced games are made with special features so they have to be displayed interlaced to show properly.  Just please give any information you have.
For the PS1, scaling takes place to stretch the (256~320)x(224~240) frame buffer to 640x480 (non-interlaced).  The last version of the PS1 emu (part of the PS3) I saw has scaling artifacts.  This is similar to computer emus that "stretch" to a given screen resolution.  Hopefully a pixel ratio saving mode will become available through system updates in the future (if not recently).
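One reason that stretch leaves artifacts: 240 -> 480 lines is a clean integer ratio, but 224 -> 480 is not, so a simple nearest-neighbor scaler has to repeat some source lines twice and others three times. Here's a toy sketch of that effect (an illustration only, not the PS3's actual scaling algorithm; the function names are made up):

```python
# Nearest-neighbor scaling sketch: NOT the PS3's real scaler, just an
# illustration of why stretching 224 source lines to 480 output lines
# looks uneven while 240 -> 480 is a clean 2x duplication.

def scale_map(src: int, dst: int) -> list[int]:
    """For each output line, the source line nearest-neighbor picks."""
    return [i * src // dst for i in range(dst)]

def duplication_counts(src: int, dst: int) -> set[int]:
    """How many times each source line is repeated in the output."""
    mapping = scale_map(src, dst)
    return {mapping.count(line) for line in range(src)}

# 240 -> 480: every line repeated exactly twice -> uniform, artifact-free
print(duplication_counts(240, 480))  # {2}

# 224 -> 480: some lines doubled, some tripled -> visibly uneven scaling
print(duplication_counts(224, 480))  # {2, 3}
```

A filtered (bilinear) scaler avoids the uneven line heights but softens the image instead, which matches the "stretch" artifacts described above.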

For the PS2, all 400~480-line games run as though they were 480p/VGA natively.  480p-capable games prompt the user to enable progressive scan (480p) automatically.  I haven't tried any low-res games...

The PS3 titles look great via VGA.  It is what it is... just like connecting a DC via VGA, or a PC via VGA.... so yeah... :P

Endymion

#11
Quote: For the PS1, scaling takes place to stretch the (256~320)x(224~240) frame buffer to 640x480 (non-interlaced).

For the PS2, all 400~480 line games run as though they were 480p/VGA natively.  480p capable games prompt the user to enable progressive scan (480p) automatically.
This was the subject of a lot of heated debate for a few months at Ars Technica, and believe me, I was one of the participants.

The PS3 does not make PS1 or PS2 games progressive. What it does instead, unless you specifically UNselect 480p in your video settings, is deinterlace all of the older, interlaced titles. Without exception, this causes PS1 and PS2 games to appear a great deal softer than they do on a PS2. This is also why you still have to enable the standard PS2 settings for individual games that support progressive scan, either via a button combination at startup or through an in-game menu.

If you have ever used D-Scaler with your PC, you know what deinterlacing is, and if you have used it with some of your games you should have an idea of why it is undesirable. The image is fuzzy when it's bad, and just "too soft" when it's good. Why the hell Sony did it this way is beyond me. XPloder has proven that it's a simple matter to just draw the entire frame properly, and XPloder even works when loaded on the PS3--it looks a hell of a lot superior to the PS3's deinterlacing. Yet, for some reason, even when using progressive scan for older titles on the PS3, the image is still softer than on the PS2. The only time there is any honest upside to running a PS2 game on the PS3 is when that title does not support progressive scan natively, and when XPloder cannot enable it by force. One such title for instance, Path Of Neo, actually looks a little nicer, but make no mistake, it is a way, way softer image, something that I just can't get into after spending four grand on a good HDTV.

If there is any intelligence going on at Sony, they will eliminate the deinterlacing with the emulator. It is really aggravating to me.

RGB32E

#12
Endymion, how are you connecting your PS3 (component, and not VGA (SoG))?  Also, Sony released an update that improved the PS1 and 2 emulation...  So it sounds like you haven't updated your system?

Also, some HDTVs have a hard time with 480p.  Even the relatively new Pioneer Elite 50" plasma (~$8000) produces a soft picture from a 480p source (e.g. component).

Furthermore, if the system weren't outputting VGA (a non-interlaced signal) when playing any PS1 or PS2 game, the 27" computer monitor (which doesn't accept interlaced RGB) I had the PS3 connected to wouldn't be able to display an image (a 480i signal)!  So, are you trying to state something that just isn't true?

Endymion

#13
Quote: Endymion, how are you connecting your PS3 (component and not VGA (SoG)?

I use the component cable, but HDMI is the same. Additionally, I have VGA cables, but I've never had to use them on the PS2 because my set can take Sync-on-Green VGA (or 15KHz Sync-on-Green RGB) over the same lines that component comes in on. I cannot get this to work with the PS3 via the component cables. I haven't tried my VGA cables--but I really shouldn't have to. Why the VGA mode doesn't work with my PS3 with component cables is a mystery to me, and I'll look at it again later, but again, I really shouldn't have to use the VGA cable. Not when the PS2 worked with that mode just over the component lines. Additionally, my system is running the latest firmware update from just a week or two ago.

Quote: Also, some HDTVs have a hard time with 480p. Even the relatively new Pioneer Elite 50" plasma (~$8000) produces a soft picture from a 480p source (e.g. component).

My HDTV does not have a hard time with 480p. More specifically, and easily provable, is how the picture is different between the PS2 and PS3. If you're ever in the south Florida area, you're invited to stop by. I guarantee you, you will see what I'm talking about. If my Panasonic plasma set produced a soft image from 480p, it would do this across all my consoles--and it does not.

Quote: Furthermore, if the system wasn't outputing VGA (non-interlaced signal) when playing any PS 1 or 2 game, the 27" computer monitor (that doesn't accept interlaced RGB) I had the PS3 connected to wouldn't be able to display an image (a 480i signal)!

Here's your problem: you don't understand what I told you, so go back and read it again. I did not say that the PS3 was not outputting a progressive or non-interlaced signal. That sentence is a double negative. In other words, it is outputting a progressive, non-interlaced signal. What it is not doing is rendering this signal before broadcasting it in the same way that the PS2 does. A game console creates an interlaced signal by rendering a frame and eliminating the odd lines, then rendering a frame and eliminating the even lines, and repeating this process ad infinitum. A game console creates a progressive signal in exactly the same way, except it renders the frame and then draws the entire frame line by line to the screen. The PS3, when running interlaced PS2 games, does something entirely different: it interlaces them just as the PS2 does, then deinterlaces them, rather than simply skipping the line removal that creates the interlacing in the first place.
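The three pipelines being argued about can be sketched in a few lines of toy code (illustration only, not actual console code; `render_frame` and friends are made-up names). The key point is that a weave deinterlacer merges two fields rendered at two different moments, whereas true progressive output draws one frame from one moment:

```python
# Toy sketch of interlaced output, progressive output, and weave
# deinterlacing. All names here are invented for illustration.

def render_frame(t: int, height: int = 8) -> list[str]:
    """Pretend renderer: line i of frame t is labeled 'f{t}l{i}'."""
    return [f"f{t}l{i}" for i in range(height)]

def interlaced_field(t: int) -> list[str]:
    """Console interlacing: render a full frame, keep only every other
    line (even lines on even ticks, odd lines on odd ticks)."""
    return render_frame(t)[t % 2::2]

def progressive_frame(t: int) -> list[str]:
    """True progressive: render the frame and send every line of it."""
    return render_frame(t)

def weave_deinterlace(t: int) -> list[str]:
    """Weave deinterlacing (as described above): merge two fields from
    two DIFFERENT moments in time back into one full frame."""
    even, odd = interlaced_field(t), interlaced_field(t + 1)
    frame = [""] * 8
    frame[t % 2::2] = even
    frame[(t + 1) % 2::2] = odd
    return frame

# Progressive: all 8 lines come from the same instant (frame 0).
# Weave: lines alternate between frame 0 and frame 1, so anything
# that moved between the two fields smears or combs.
print(progressive_frame(0))
print(weave_deinterlace(0))
```

Real deinterlacers blend or interpolate rather than weave naively, which trades the combing for the softness complained about in this thread.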

You can prove this to yourself with a PS2, a PS3, and any one game that supports interlacing natively. All you need to do is start up the PS2 running the game in progressive mode, then compare it with the same game running on the PS3 with 480p checked on in your PS3 settings but without enabling the game's native progressive scan. If what you purport were true, that the PS3 automatically turns all PS2 (and PS1) interlaced games into progressive scan, then there would be no difference between the image that you are seeing from your PS2 with progressive on via the game's enabler, and the PS3 without the game's enabler setting on. But in fact there is a difference, and you can even see it with only a PS3 just by enabling the progressive mode in-game.

Quote: So, are you trying to state something that just isn't true?

Absolutely positively not. This is something we arrived at after discussions, with documentation, with developers over at Ars Technica's forum. The PS3 specifically and explicitly deinterlaces all PS1 and PS2 games unless you uncheck the 480p option in your PS3's video settings. Deinterlacing unfortunately is not the same as progressive scan in the way it has been used in all of your PS2 games. It involves sending a progressive signal to your television, but only after merging two completely interlaced fields together to create a single, much blurrier frame than if the entire frame were drawn in a single pass.

Here is Wikipedia's article on deinterlacing. It is "an improvement" over interlacing, sure. But it is far from the ideal that we've had in the past, and a very boneheaded substitute for just drawing everything to the screen progressively. You can clearly see how the image is degraded in the process of deinterlacing.

blackevilweredragon

#14
South Florida?  I live in Oviedo; don't know how close that is though...  (I don't know the geography of Florida)..

If you came by this way, bring the PS3. I'd be interested to see how it deinterlaces internally on its own GPU..  (I have a PS3, but only S-Video for it)

(BTW, most PS1 games are non-interlaced to begin with, but some, like Tekken, I believe, are interlaced)

Endymion

#15
Yes, most PS2 games are interlaced, and most of them work in progressive scan perfectly if you force it to. There are only a handful of titles that look better than their interlaced modes this way, and none of them look as nice as they do in true progressive mode. That's what makes the PS3 deinterlacing so frustrating. I'm really hoping they fix this in the emulation.

blackevilweredragon

Yeah, I know the PS2 games are interlaced; I was referring to the PS1 ;)

I'm surprised the PS3 just doesn't force it into non-interlaced, because even on the PS2, isn't an image "rendered" as a non-interlaced image, then sent to the TV encoder, which sends it out as interlaced?  That's my understanding of the PS2..  (I never got into PS2 programming)

RGB32E

Well, regardless of anyone's claimed technical understanding of what the PS3 does under the hood, the picture output via VGA (system set to 480p only) looks great.  From the PS1 and 2 games I played, the resulting image was extremely sharp (and quite pleasing).  It's unfortunate that you do not share the same results I've had with the PS3 (bad resulting image quality on your setup).  :(

Endymion, could you try connecting your PS3 via VGA (SoG @ 480p) and let me know if things change for you?

qz33

Just to confirm guys:

Is the av multi-out on the PS3 the same as the PS2 and PS1?
Can I use say a component PS2 cable on the PS3 and vice versa?

RGB32E

#19
Quote: Just to confirm guys:

Is the av multi-out on the PS3 the same as the PS2 and PS1?
Can I use say a component PS2 cable on the PS3 and vice versa?
Yes, that is correct.  Note: the PS1 does not output component video, only RGB on the respective pins.

So, if you have a computer monitor that uses BNC connectors and accepts SoG, you could buy 3 RCAF-BNCM adapters and connect them to the R, G, and B RCA leads... :)
(example of such adapters http://cgi.ebay.com/4-x-18K-gold-plated-RC...QQcmdZViewItem)

LiuKahr

Quote: The sucky thing though is that you can't select above 480p.  Perhaps its related to copy protection - for DVD and blu-ray.  (NO RGB for these above 480p on their console!)  But I haven't confirmed the DVD/BR play back in RGB though... It might not be possible at all! :(
@RGB32E: Sounds like a silly question, but when you said that you could not get a higher resolution than 480p with VGA on the PS3, did you mean even for PS3 games?
That would definitely suck indeed.

If true, I don't think this is related to any content protection, since YUV goes up to 1080p on the PS3. Maybe it's related to screen protection (EDID-like autodetection of a screen's maximum resolution).

RGB32E

Quote: @RGB32E: Sounds like a silly question, but when you said that you could not get higher resolution than 480p with VGA on PS3, you meant even for PS3 games ?
That would definitely sucks indeed.

If true, I don't think this is related to any content protection thince YUV goes up to 1080p on PS3. Maybe related to screen protection (EDID-like stuff to autodetect max resolution of a screen).

Assuming the Multi-AV Out/SCART menu hasn't changed... progressive RGB (VGA/SoG) output from the PS3 is limited to VGA/480p - even for PS3 titles.  

This is not an EDID issue, as no DDC data lines etc. are involved.  The PS3 doesn't have a standard VGA connector (if you've taken a look), just R, G, and B video outputs on the Multi-AV connector.  Hence, no monitor data is reported to the PS3.

Component video (YUV) and HDMI both have copy protection schemes (Macrovision and HDCP respectively).  The RGB output from the PS3 does not have copy protection...  Even the PS2 and Xbox (NTSC at least) switch from RGB to component when DVD playback occurs...

Besides, look on the bright side  B) : the lower the resolution games run at on the PS3 (or 360 for that matter), the higher the frame rate you'll get (from almost all games)!  Finally, VGA output from the PS3 on any large CRT computer/presentation monitor (27"+) looks awesome (even if it is only 640x480).

Endymion

Quote: Besides, look on the bright side  B) , the lower resolution games run on the PS3 (or 360 for that matter), the higher frame rate you'll get (from most all games)!
Well, no. The 360 development target is 720p. Every 360 game renders at 720p. Even Xbox 1 games are emulated at 720p. So if you use 480i or 480p, the game renders at 720p and the scaler chip scales it down to 480 lines; it does the same thing in reverse if you select 1080i or 1080p, so the horsepower required is the same in all instances.

Sony really screwed up not adding a simple hardware scaler chip to the PS3.

RGB32E

Quote: Well, no. The 360 development target is 720p. Every 360 game renders in 720p. Even XBox 1 games emulate in 720p. So if you use 480i or 480p, the game is rendering 720p and using the scaler chip to scale it down to 480 lines, it does the same thing in reverse if you select 1080i or 1080p so the horsepower is the same in all instances.

Sony really screwed up not adding a simple hardware scaler chip to the PS3.
Well, yes.  While it may be a 360 TCR to support 720p and 480i, it doesn't indicate that all games render to a 1280x768 back buffer.  It is the developer's choice to support different back buffer sizes for the respective output resolution.  Hence, your assertion is a misnomer.

Have you ever hooked up a 360 with the VGA cable?  Have you ever tried setting the system to various resolutions for the sake of experimentation?  Your post seems to indicate that you haven't.  Furthermore, games such as Saint's Row support 640x480 fullscreen (not letterboxing and scaling a 1280x720/768 back buffer to 640x480).  This game happens to run a hell of a lot faster (higher frame rate) when running with a VGA cable at 640x480 (as opposed to 720p, etc.).  Also, EA's NHL game can run at 1024x768 fullscreen and gives no evidence of post-render scaling.... (another example).

The higher the resolution rendered, the lower the framerate... this applies to almost all platforms.


blackevilweredragon

#24
I'm going to have to agree with Endymion.  I remember hearing that the 360 is stuck in 720p, and that's all it does..  It then uses a regular scaler to go to other resolutions..

If you want me to verify this, I can check whether it's true by examining the picture from the unit itself.  Do this with the SAME CRT monitor (I use the scanlines as a reference for what I do) and a very good camera, in focus: take a picture of it running at 1280x720, then take another picture of the same game at 640x480..

If it's scaling, there will be some anti-aliasing on polygon edges; if it's NOT scaling, there will be jaggies...

Why?  Because they don't use anti-aliasing on game consoles, because it can penalize performance.  And scaling can yield artificial anti-aliasing when going to lower resolutions...  I will use this as a check to see what is true..  (I use the scanlines to see how sharp the image is, and to see where every line is...)

(EDIT:  My understanding is that FSAA is only used in Xbox 1 compatibility mode.)
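The "downscaling yields artificial anti-aliasing" test above can be sketched numerically: a filtered downscale averages neighboring pixels, so a hard black/white edge gains in-between gray values, which reads as smoothing on screen. A toy box-filter sketch (illustration only, not any console's actual scaling filter):

```python
# Box-filter 2x horizontal downscale sketch (illustration only, not
# the 360's actual scaler): averaging neighbors turns a hard edge
# into a ramp, i.e. "artificial anti-aliasing".

def downscale_2x(row: list[int]) -> list[int]:
    """Average each pair of horizontal pixels (simple box filter)."""
    return [(row[i] + row[i + 1]) // 2 for i in range(0, len(row), 2)]

# A hard vertical edge: black (0) then white (255), edge mid-pair.
row = [0, 0, 0, 255, 255, 255]
print(downscale_2x(row))  # [0, 127, 255] -> a gray step appears
```

A nearest-neighbor downscale, by contrast, would just drop pixels and keep the edge hard, which is why the photo comparison proposed above can distinguish the two.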

RGB32E

Quote: I'm going to have to agree with Endymion.  I remember hearing that the 360 is stuck in 720p, and that's all it does..  It then uses a regular scaler to go to other resolutions..

If you want me to verify this, I can check if this is true by seeing the picture from the unit itself.  Do this, with the SAME CRT monitor (i use the scanlines as reference for what I do), and a very good camera, and in focus, take a picture of it running in 1280x720, then take another picture of the same game in 640x480..

If it's scaling, there will be some anti-aliasing on polygon edges, if it's NOT scaling, there will be jaggies...

Why?  Because they don't use anti-aliasing for game consoles, because it can penalize the performance.  And scaling, can yield artificial anti-aliasing when going to lower resolutions...  I will use this as a check to see what is true..  (i use the scanlines to see how sharp the image is, and to see where every line is...)
Agree with Endymion's speculation?  You didn't read my posts correctly... and based on your last series of posts, you lack credibility (you made many completely incorrect statements).  And if you have a hard time explaining things (excuses), don't obfuscate the topic... Also, this has NOTHING to do with anti-aliasing, and numerous games have used anti-aliasing...!

blackevilweredragon

#26
Excuse me?  Based on your replies, neither do you, but Endymion has more credibility here; I trust him when it comes to video stuff (like other people)..

And you apparently misread mine, so don't insult me because YOU read me wrong..  Downscaling causes artificial anti-aliasing...  And from what I have seen in store demos, the 360 in native 720p has no FSAA (as I said, this is from what I SAW), and if it IS downscaling, there would be artificial anti-aliasing..  If it wasn't, and was running on the same engine, there would still be no FSAA..  If you fail to understand that again, I'll make a picture demo for you...

EDIT:  Here..  This artificial anti-aliasing appears because the downscaler applies filtering to the image so it doesn't look nasty..


Endymion

#27
Oh boy this is going to be good.

Quote: Well, yes.  While it may be a 360 TCR to support 720p and 480i, it doesn't indicate that all games render to a 1280x768 back buffer.

You are right--it's just Microsoft's development standard. That's all. Oh and it's 1280x720.

Quote: It is the developer's choice to support different back buffer sizes for the respective output resolution.  Hence, your assertion is a misnomer.

No, it isn't. Seriously.

You are correct that this is something the developer has to activate. But you are incorrect in asserting by way of assumption that there are wild variants to this, and you are even more incorrect in stating that a game displaying a lower resolution from the 360 is rendering at a lower resolution, ergo giving higher framerates. There is one and only one game that I know of that runs at a lower render than 1280x720: PGR3. This one was made at a slightly lower resolution, which Ana scales up to 720. You'll forgive me if I don't have the exact resolution of that one game, but I'll call a card from Lawrence's deck and tell you that it's very easy to google. Anyway, that is the only exception to the rule that I'm aware of, and Microsoft is very stringent about their standards. Even Xbox Live Arcade games absolutely must render at 720p.

Quote: Have you ever hooked up a 360 with the VGA cable?

Yes.

Quote: Have you ever tried setting the system to various resolutions for the sake of experimentation?

Yes.

Quote: Your post seems to indicate that you haven't.

Your post seems to indicate that you don't understand what scaling is.

Quote: Furthermore, games such as Saint's Row support 640x480 fullscreen (not letterboxing and scaling a 1280x720/768 back buffer to 640x480).  This game happens to run a hell of a lot faster (higher frame rate) when running with a VGA cable at 640x480 (as opposed to 720p, etc.).  Also, EA's NHL game can run at 1024x768 fullscreen and gives no evidence of post-render scaling.... (another example).

You are experiencing placebo. Seriously.

Quote: The higher the resolution rendered, the lower the framerate... this applies to most all platforms.

Except that when different resolutions are used, the scaler chip is used not just to scale but to clip and size to the resolution you set. Do not try to pretend that internal renders have changed just because you have a VGA cable that communicates with your monitor for a native resolution, or a CRT that can use any resolution you like. Whether you like it or not, this chip is where those resolutions come from. You do not seem to understand the purpose of scaling, certainly not where it applies to the 360. Quake 4 has the same exact framerate problems at all resolutions precisely because the game is rendering at 720p even if you display it at 640x480. Hell, Prince Of Persia, an Xbox 1 game, slows down when emulated because it is being rendered at 720p.

Let me ask you this, what kind of hell do you think you would be in for if you designed a console for the masses, included networking gameplay, got that all hammered down, then found that some players were pwning everyone because they used an old TV set? Wouldn't your high-end HDTV/VGA-capable piece of equipment come under fire? There is not anybody complaining about unfair pwnage in Rainbow 6 Vegas on their high-end HDTV because their competitors are gaming on a 20 year old CRT, but you might as well expect that to happen once PS3 multiplayer becomes popular.

Here's an edit to keep this thread on target (if that's possible) with respect to what's come before regarding the PS3 and deinterlacing of PS2 titles: several people at NeoGAF noticed this the same day I did, right when the update was released. And they're right. The image is fuzzy, soft, blurry, however you want to describe it; it's considerably less sharp, and also not as bright, as the PS2 with the same game. Lots of comparison pictures in the thread above.

blackevilweredragon

Not to mention, the 360 isn't the only console to use scaling..

On the old Xbox, the Matrix game..  It may run at 1080i, but it's actually rendering 960x540 (you can see this because every vertical pixel is doubled)..

In this case, the 360 ALWAYS uses scaling...  for the exact reason Endymion gave: to keep the FPS constant...

RGB32E

What I stated before is that FPS vs. resolution is NOT consistent! :P  I've encountered this in numerous games.  So, your point has been invalidated.

Endymion

#30
Right, and you enable your framerate counter how? I think you're going to have to let us know, because I have seen slowdown at the same points in all my games at any resolution or display I play at, which pretty quickly nullifies your point, and leaves Microsoft's development standards unscathed.

RGB32E

#31
No framerate counter needed.  It's fairly obvious when a game runs at a lower framerate...  If a given game chugs (choppy framerate) more at a higher resolution, then it is obvious that there is an FPS vs. screen resolution differential.  Enough!  :o

Are you implying that you have read the 360 XDK documentation? lol...

blackevilweredragon

You don't have to have the development docs to know you MUST render in 720p..  simple Googling even says so...  FROM game developers too!

Tell me this: in a game, at higher resolutions, do HUDs get smaller?

RGB32E

Yeah, and there are plenty of sites out there that state the SNES can only display up to 256 onscreen colors... another fallacy....

And nothing either of you (B and E) has stated changes the fact that there are games on the PS3 and 360 that perform better at lower resolutions...

Endymion

#34
That's not necessarily true. It's not always obvious when a framerate is lower, due to the nature of 3D animation. When certain objects animate in certain ways, coupled with movement of the camera, one motion can mask another or make its draw redundant. This is not exactly an isolated or rare event, either. If you have no framerate counter you literally have no way to tell what speed the thing is animating at, because a faster framerate can have redundant frames that make it appear to move slowly, even though the frame is being rendered.

This is one reason that the only accurate way to benchmark framerate is to run a timedemo, where you run a scripted event that plays through along with, shocker, a frame counter. Do you get it now?
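The timedemo idea in concrete terms: a frame counter just divides frames rendered by elapsed time over the scripted run (a trivial sketch; `average_fps` is a made-up name, and consoles don't expose per-frame timestamps, which is the whole point):

```python
# Minimal FPS-from-timestamps sketch: this is what a timedemo's frame
# counter computes, and what you can't reliably eyeball without one.

def average_fps(timestamps: list[float]) -> float:
    """Frames rendered divided by elapsed seconds."""
    elapsed = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / elapsed

# 61 frames spaced 1/60 s apart -> 60 fps
frames = [i / 60 for i in range(61)]
print(round(average_fps(frames), 1))  # 60.0
```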

Now, I'm not saying that I have a frame counter for my 360 any more than you do. But I am saying that the 360 has a hardware scaler, and that Microsoft requires all developers to render their games at 720p. Both of these are true, factual, and easily verified. And when I see the opening movie* on Quake 4 slow down at the same exact points for the same exact duration on my standard/composite CRT, my CRT VGA monitor, my widescreen LCD monitor and my 50" plasma screen regardless of the resolution I set, I find Microsoft's third party development requirements a lot more believable than your posts.

Here's an article you really need to read. Please do so.

*For the anal retentive, I wrote "movie" for the sake of brevity, realizing already that an actual clip of movie footage would not be rendered by the 360 polygonal engine--but it isn't a movie in that sense, it is in fact a scripted event rendered by the Quake 4 engine just like the rest of the game. I just picked it because it's a good example of very consistent slowdown across resolutions and because a rendered cinematic sequence is as close as we'll ever see to an actual timedemo on a console, minus the frame counter.

RGB32E

#35
Quote: That's not necessarily true. It's not always obvious when a framerate is lower due to the nature of 3D animation. When certain objects animate in certain ways coupled with movement of the camera, the movement that you see can nullify the other or make its draw redundant. This is not exactly an isolated or rare event either. If you have no framerate counter you literally have no way to tell what speed the thing is animating at, because a faster framerate can have redundant frames that make it appear to move slowly--even though the frame is being rendered.

This is one reason that the only accurate way to benchmark framerate is to run a timedemo, where you run a scripted event that plays through along with, shocker, a frame counter. Do you get it now?

Now, I'm not saying that I have a frame counter for my 360 any more than you do. But I am saying that the 360 has a hardware scaler, and that Microsoft requires all developers to render their games at 720p. Both of these are true, factual, and easily verified. And when I see the opening movie* on Quake 4 slow down at the same exact points for the same exact duration on my standard/composite CRT, my CRT VGA monitor, my widescreen LCD monitor and my 50" plasma screen regardless of the resolution I set, I find Microsoft's third party development requirements a lot more believable than your posts.

Here's an article you really need to read. Please do so.

*For the anal retentive, I wrote "movie" for the sake of brevity, realizing already that an actual clip of movie footage would not be rendered by the 360 polygonal engine--but it isn't a movie in that sense, it is in fact a scripted event rendered by the Quake 4 engine just like the rest of the game. I just picked it because it's a good example of very consistent slowdown across resolutions and because a rendered cinematic sequence is as close as we'll ever see to an actual timedemo on a console, minus the frame counter.
lol... Nice editorial piece.  You're confusing the whole 720p thing.  Yes, MS requires each 360 title to "support" 720p, but it does not prevent a given developer from supporting other resolutions (natively, without up/down-sampling the back buffer).  I know quite a bit more about programming, developing, and supporting game engines than you may realize...  :D

Furthermore, I've benchmarked plenty of PC games over the years.... no placebo! lol

blackevilweredragon

Endymion, think we should tell him about the "Ana" chip in the 360?

Endymion

#37
I already did; the link I gave to Ars Technica, if he bothered to read it, is all about the Ana chip, which is a scaler. It exists to scale the resolution of all 360 games. So I guess the next puzzler for our vacuous friend is: what do you think this chip does, since you are unconvinced that it scales resolutions? What good reason do you have for Prince Of Persia slowing down at 480p? I mean, I know that it's emulation, but the game still slows down at the same exact spots in 720p.

All RGB32E is really doing here is posturing. I'm unconvinced of his personal or professional experience if he can't acknowledge the existence and operation of a simple chip. He probably thinks that all scaling will result in artifacting, for all I know; he's hinted at it here already.

Lonefox

Hi guys, I need a little help. Basically, I have the VGA adapter that came with my Linux kit plugged into my PS3 and then into my TFT monitor. It seems to work at 720p, but everything's green. Is there any way to fix that?

Endymion

The system outputs sync-on-green, so you need to remove the sync from the green line and carry it on a separate one. Alternately, you can use a monitor that supports sync-on-green, but these can be hard to find, and of course that's no help if you already have a monitor you want to use.

Search here for threads on using the LM1881, it's often used to separate sync.