Screwing About With Megadrive RGB

Started by Martin, March 20, 2005, 01:42:58 AM

Previous topic - Next topic

Martin

I got bored today and started seeing what effect using different PSUs would have on the RGB output of my Megadrive. (Obviously, I made sure they were all 10 V or under, only varying the amp ratings.)

1.2 Amps (supplied with console) gave optimum performance.
Anything less than 1 amp caused a hideous buzzing sound and flickering on the image (which gets worse the more you lower the amp rating).
1.3 Amps caused the display to turn into a white line that scrolls slowly down the screen.

I'm no genius, but the main reason I made this topic was: why does an extra 0.1 amp have that big an effect on the image?
It's stupid.

Aidan

#1
Sounds like you've been using unregulated power supplies. If that's true, then the voltage they actually put out is related to the current being drawn. If the current is low, the voltage is usually rather higher than the sticker on the power supply says. For example, I have a 10V power supply that outputs 18V when nothing's connected. Connect up a device that takes the power supply to nearly its maximum current, and the voltage is a bit below 10V.
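If it helps, here's a quick back-of-envelope sketch (Python) of that behaviour, modelling an unregulated supply as an ideal source with a series internal resistance. The no-load and full-load voltages are from my example above; the rated current is a made-up figure just to make the numbers work, not a measurement.

```python
# Rough model of an unregulated PSU: ideal source + internal resistance.
# The rated current below is an assumed figure for illustration only.

V_NO_LOAD = 18.0   # volts measured with nothing connected
V_RATED = 10.0     # sticker voltage at full load
I_RATED = 1.0      # assumed full-load current in amps (hypothetical)

# Estimate the internal resistance from the two known operating points
r_internal = (V_NO_LOAD - V_RATED) / I_RATED  # ohms

def output_voltage(load_current):
    """Approximate terminal voltage (volts) at a given load current (amps)."""
    return V_NO_LOAD - load_current * r_internal

for i in (0.0, 0.3, 0.6, 1.0):
    print(f"{i:.1f} A -> {output_voltage(i):.1f} V")
```

The point being: the less current the console draws relative to the supply's rating, the further above the sticker voltage it actually sits.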

In the case of not enough current capacity, the power supply ends up struggling to meet the power requirements of the Megadrive. When that happens, you end up with 100Hz ripple on top of the DC (if you're in the UK, since full-wave rectifying 50Hz mains gives 100Hz), which is what's causing the noise and image effects.
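To see roughly how that ripple scales with load, here's a rough estimate for a plain full-wave rectifier with a capacitor-input filter. The smoothing capacitor value is a guess at what a cheap wall-wart of the era might use, not a known spec.

```python
# Back-of-envelope ripple estimate for a full-wave rectified,
# capacitor-smoothed supply: V_ripple ~= I_load / (f_ripple * C).
# Capacitance is an assumed value for illustration.

F_RIPPLE = 100.0        # Hz: full-wave rectified 50 Hz UK mains
C_SMOOTHING = 2200e-6   # farads (assumed 2200 uF smoothing cap)

def ripple_pp(load_current):
    """Approximate peak-to-peak ripple (volts) at a given load (amps)."""
    return load_current / (F_RIPPLE * C_SMOOTHING)

for i in (0.5, 1.0, 1.2):
    print(f"{i:.1f} A -> {ripple_pp(i):.2f} Vpp ripple")
```

The more current you pull relative to what the supply can deliver, the bigger that ripple gets, which lines up with the buzzing getting worse on the lower-rated supplies.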

The 1.3A supply sounds like it's actually putting more voltage out than you expect. This can have unexpected side effects too, as it can cause parts to malfunction and overheat.
[ Not an authoritative source of information. ]