There are two aspects to consider: can a VGA->HDMI adapter support 400 pixels of width, and can it support 240 lines?
VGA carries no pixel clock. A line is just a set of continuous analogue waveforms for R, G, and B. This means a mismatched horizontal resolution is no real problem, except that the edges may look ugly.
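To illustrate why the horizontal resolution doesn't have to match: a converter just samples the analogue line at whatever clock it likes. The numbers below are assumptions for illustration (a ~31.5 kHz-ish line with standard VGA timings), not measurements of any real device.

```python
# Sketch: a VGA capture device samples the analogue line at its OWN clock,
# so the source's "true" pixel width never enters into it.
# All timing values here are illustrative assumptions.
h_active_us = 25.42        # assumed active-video portion of one scanline
sample_clock_mhz = 25.175  # converter's own ADC clock (the classic VGA clock)

samples_per_line = h_active_us * sample_clock_mhz
print(round(samples_per_line))  # ~640 samples per line, regardless of
                                # whether the source drew 400 or 640 pixels
```

A 400-pixel-wide source would simply land on those 640 sample points unevenly, which is exactly the "edges may be ugly" effect.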
Whether a VGA->HDMI converter supports 240 lines is the bigger question, and the answer is less certain. Some might not. Vintage VGA cards could be convinced to do all sorts of weird line rates, much stranger than 240, so a good VGA->HDMI adapter should handle it happily.
Adapters specifically for 8-bit and 16-bit consoles will probably do 240 lines, since 240p was what most of those used.
The OSSC can probably do it. I don't think any of the RetroTINK boxes support separate sync (which is what VGA provides), which is a shame because they're great bang for the buck.
A DVI* signal is logically very similar to a VGA one, but with a different physical layer. The analogue colour signals are replaced with TMDS encoded digital ones, but the pixel clock and sync signal work more or less the same way.
I would guess that a simple VGA to DVI converter simply syncs to the VGA pixel clock, samples the analogue colours, and outputs the digitally encoded values with the same timings.
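A rough sketch of that guessed-at pipeline, in Python for clarity. This is a hypothetical structure, not how any particular converter actually works; real ones are hardware pipelines, and the waveforms and voltage range (0–0.7 V, standard for VGA) are the only parts taken from the real spec.

```python
# Hypothetical sketch of a naive VGA->DVI converter's per-line job:
# sample the analogue colour waveforms at a recovered clock and emit
# digital values that a TMDS encoder would then serialise.
def convert_line(analog_r, analog_g, analog_b, samples_per_line):
    line = []
    for n in range(samples_per_line):
        t = n / samples_per_line  # position within the active line
        # sample each analogue channel (VGA colour is 0.0-0.7 V)
        r, g, b = analog_r(t), analog_g(t), analog_b(t)
        # quantise 0.0-0.7 V into 8-bit digital levels
        line.append(tuple(min(255, round(v / 0.7 * 255)) for v in (r, g, b)))
    return line

# toy waveforms: full-scale red, half-scale green, no blue
pixels = convert_line(lambda t: 0.7, lambda t: 0.35, lambda t: 0.0, 4)
print(pixels[0])  # (255, 128, 0)
```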
From a quick look, the oscillator in this machine's schematic runs at 16 MHz. I assume the pixel clock is derived from this. The DVI specification has a minimum pixel clock of 25 MHz, so you couldn't produce a valid DVI stream from this without buffering the pixels and retiming the output in some way. Then again, since the pixel clock isn't explicit on the VGA cable, you could pretend to a higher clock by doubling pixels horizontally.
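The arithmetic behind that, spelled out (the 16 MHz figure is from the schematic as read above; treating it as the pixel clock is an assumption):

```python
# Why a 16 MHz pixel clock can't feed DVI directly, and why horizontal
# pixel-doubling fixes it. Values assumed from the discussion above.
osc_mhz = 16.0       # machine's oscillator; pixel clock assumed <= this
dvi_min_mhz = 25.0   # DVI spec's minimum pixel clock

print(osc_mhz >= dvi_min_mhz)        # False: below the DVI floor

# Emitting every pixel twice doubles the effective pixel clock:
effective_mhz = osc_mhz * 2
print(effective_mhz >= dvi_min_mhz)  # True: 32 MHz clears 25 MHz
```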
Ultimately though, success probably varies depending on the converter and the display used. There are quite a lot of standard VESA modes and you can often get away with generating something close-ish to spec.
> HDMI is, broadly speaking, a proprietary extension of DVI
It is, but it's becoming hard to find monitors with DVI connectors, so I asked about HDMI, which should be more common.
Getting your ancient analogue output onto a display with only digital inputs is becoming a problem. I don't know shit about how good or bad the average solution is, so I asked.
[I don't have anything like the toy we're talking about in this thread, but I have a 486 with a 512 KB Trident VGA card and Syndicate on it in a closet]
I've seen VGA output on a bunch of retro projects.
I've also seen that there are VGA to HDMI converters available.
But can those handle any random non-standard resolution like this one, or are most (or all) of them limited to standard historical resolutions?