
> Monochrome VGA output 400 x 240 pixels

I've seen VGA output on a bunch of retro projects.

I've also seen that there are VGA to HDMI converters available.

But can those handle any random non-standard resolution like this one, or are most (or all) of them limited to standard historical resolutions?



There are two aspects to consider: can a VGA->HDMI adapter support an image 400 pixels wide, and can it support 240 lines?

VGA has no explicit pixel clock on the cable; a line is just continuous waveforms for R, G, and B. This means a mismatched horizontal resolution isn't a problem, except that the edges may look ugly.

Whether a VGA->HDMI converter supports 240 lines is the bigger question, and the answer is less certain: some might not. Vintage VGA cards could be convinced to do all sorts of weird line rates, much stranger than 240, so a good VGA->HDMI adapter should handle it happily.
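
To put very rough numbers on that (every figure below is invented for illustration; the machine's real timings aren't given in the thread), here's what a converter actually sees on the wire:

    # Illustrative only -- assumed blanking and refresh, not this machine's real timings.
    # A converter never sees "400 pixels"; it only sees sync rates and analogue RGB.
    visible_lines  = 240
    blanking_lines = 22      # assumption
    refresh_hz     = 60      # assumption

    total_lines  = visible_lines + blanking_lines    # 262
    line_rate_hz = refresh_hz * total_lines          # what hsync looks like on the cable

    print(f"hsync ~ {line_rate_hz/1e3:.1f} kHz for a native 240-line, 60 Hz frame")
    # -> ~15.7 kHz: if the 240 lines are output natively (console-style "240p"), the
    #    line rate lands well below the ~31.5 kHz of a standard VGA mode, which is one
    #    plausible reason a cheap converter might refuse to lock to it.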


Adapters specifically for 8-bit and 16-bit consoles will probably do 240 lines, since 240p was what most of those used.

The OSSC can probably do it. I don't think any of the RetroTINK boxes support separate sync (which is what VGA provides), which is a shame because they're great bang for the buck.


The RT5X can handle various sync formats (sync-on-green, sync-on-luma, SCART sync), and the RT4K can handle "just about anything".


Can it handle separate hsync and vsync signals though? That's what VGA outputs.


> Vintage VGA cards could be convinced to do all sorts of weird line rates

Yeah, with XFree86 Modelines :-)
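
Something along these lines ought to do it; the porch and sync numbers here are invented for illustration, not taken from this machine:

    # Hypothetical 400x240 modeline -- every timing number is a guess.
    # 16.00 MHz dot clock / 500 total columns = 32.0 kHz line rate;
    # DoubleScan makes the 262-line frame scan as 524 lines, so 32.0 kHz / 524 = ~61 Hz.
    Modeline "400x240" 16.00  400 424 456 500  240 244 247 262 DoubleScan -HSync -VSync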


Or Mode X in some DOS games.

https://en.wikipedia.org/wiki/Mode_X


A DVI* signal is logically very similar to a VGA one, but with a different physical layer. The analogue colour signals are replaced with TMDS-encoded digital ones, but the pixel clock and sync signals work more or less the same way.

I would guess that a simple VGA-to-DVI converter just locks onto the VGA sync timing to reconstruct a pixel clock, samples the analogue colours, and outputs the digitally encoded values with the same timings.

From a quick look, the oscillator in this machine's schematic runs at 16 MHz, and I assume the pixel clock is derived from it. The DVI specification has a minimum pixel clock of 25 MHz, so you couldn't produce a valid DVI stream from this without buffering the pixels and retiming the output in some way. Well, I suppose that since the pixel clock isn't explicit on the VGA cable, you could pretend the clock is higher by doubling pixels horizontally.
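
As a back-of-the-envelope check on that (the only number below that isn't a guess is DVI's 25 MHz floor):

    # Rough sketch of the pixel-doubling idea; the clock and totals are assumptions.
    vga_clock   = 16_000_000   # guess: the 16 MHz oscillator used directly as the dot clock
    h_total     = 500          # assumed total columns per line (400 visible)
    dvi_minimum = 25_000_000   # minimum pixel clock a DVI sink has to accept

    assert vga_clock < dvi_minimum             # can't re-emit the samples 1:1 as DVI

    # Sending every sample twice doubles the nominal clock and the active width
    # (400 -> 800), while the line and frame rates on the wire stay exactly the same.
    doubled_clock = 2 * vga_clock              # 32 MHz, now above the 25 MHz floor
    line_rate     = doubled_clock / (2 * h_total)

    print(f"doubled clock {doubled_clock/1e6:.0f} MHz, line rate {line_rate/1e3:.1f} kHz")
    # -> doubled clock 32 MHz, line rate 32.0 kHz: same sync timing, now a legal DVI clock.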

Ultimately though, success probably varies depending on the converter and the display used. There are quite a lot of standard VESA modes and you can often get away with generating something close-ish to spec.

For more exotic video signals you can use devices like the RGBtoHDMI: https://github.com/hoglet67/RGBtoHDMI

It decodes the input signal into a framebuffer and uses the Raspberry Pi's video core to output the result.

* HDMI is, broadly speaking, a proprietary extension of DVI. You can feed DVI signals through an HDMI connector and they will display anyway.


> HDMI is, broadly speaking, a proprietary extension of DVI

It is, but it's becoming hard to find monitors with DVI connectors, so I asked about HDMI, which should be more common.

How to get your ancient analog output onto a display with only digital inputs is becoming a problem. I don't know shit about how good or bad the average solution is, so I asked.

[I don't have anything like the toy we're talking about in this thread, but I have a 486 with a Trident 512 KB VGA card and Syndicate on it in a closet]


You can get a DVI-to-HDMI adapter that costs less than a coffee.

Adding an HDMI connector is more likely to get you into expensive licensing discussions than DVI is.



