Rust hello world app for Windows 95, cross-compiled from Linux, no MSVC (hails.org)
168 points by sohkamyung on May 13, 2023 | 105 comments


It's quite possible to develop in Rust for Windows without using Windows.

Try my open source "ui-mock".[1] This is a test of the cross-platform stack. Just get the repository with "git clone", and make sure you have Rust installed for target "x86_64-pc-windows-gnu". See the Cargo.toml file for build instructions.

This is a game-type user interface. It's just some menus and a 3D cube. It doesn't do much, but it exercises all the lower levels. This allows debugging cross-platform problems in a simple environment. The main crates used are winit (cross-platform window event handling), wgpu (cross-platform GPU handling), rfd (cross-platform file dialogs), keychain (cross-platform password storage), egui (Rust-native menus and dialogs), and rend3 (safe interface to wgpu). For graphics, it uses Vulkan, so it will run on Windows versions back to the final release of Windows 7. Not Windows 95, though; it's 64-bit. It will also run under Wine, so you don't even need a Windows system to test.
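Assuming rustup and the MinGW-w64 toolchain are installed (package names vary by distro; these are the Debian/Ubuntu ones), the setup described above looks roughly like this - a sketch, not official build instructions:

```shell
# Install the cross linker (Debian/Ubuntu package name)
sudo apt install gcc-mingw-w64-x86-64
# Teach rustup about the Windows GNU target
rustup target add x86_64-pc-windows-gnu
# Fetch and cross-build ui-mock
git clone https://github.com/John-Nagle/ui-mock.git
cd ui-mock
cargo build --target x86_64-pc-windows-gnu
```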

My metaverse client uses the same stack. It's compiled on Linux, and runs on both Linux and Windows. So I'm building a high-performance 3D graphics program for Windows without even owning a Windows system or using any Microsoft software.

[1] https://github.com/John-Nagle/ui-mock


With xwin you can download a copy of the MSVC sysroot and use it to build Windows applications on Linux. LLVM reimplemented the entire MSVC toolchain and can thus link with MSVC libraries.


I compile BGFX + SDL2 for use alongside Rust on WSL without using MSVC and can run the generated .exe directly from the command line. The package management is a bit of a pain but not terrible (two commands to install dependencies instead of just `cargo build`), but overall it has worked perfectly.


Those are more mainstream solutions than what I'm doing. I'm working to beat C++ OpenGL game performance. Most applications don't need to do that. And they don't need to idle at 100% of one CPU, redrawing every frame, as games do.

Good old gtk is available cross-platform for Rust. If you're not doing a graphics intense program, that can be a good choice.


I just can't help but point out how cool the Windows 95 user interface was. Partially inspired by the NeXT OS, its clarity endlessly resonates with me.


A full-blown OS with an office suite fit on something like a 100MB drive, with everything opening and responding to user input in a snappy manner. And it didn't show random ads or try to track your every action. You could choose whether and when to install any updates. All the while running on a 486 with a few megs of RAM. How did we get to the current state of software?

Edit: I remember a few places still running 3.11 well into the 00s because it just worked for the given tasks, got out of the way and didn't require a super-computer to run. The performance difference was especially jarring when Vista came out and barely worked on contemporary hardware.


> How did we get to the current state of software?

Playing an MP3, though, was a heavy task that consumed most of the CPU. It was hard to do anything while listening to music.

Or think about how pressing Ctrl-S once a minute became a reflex, because MS Office crashed so often.

Hell, does anyone remember how unstable Windows ME was? Or how desktop Linux was practically non-existent?

The current state of software is _amazing_ compared to what was available at that time.


Windows NT 4.0 existed at the same time with the same UI and without the instability. Effortless MP3 playback was primarily made possible by hardware advances, not software ones.


From my memory, the NT branch was unstable until at least XP SP2. Not sure if that was because of the OS itself, or because most software and drivers were written for Windows 98 and required a few years to catch up.


Windows NT was rock solid until Windows XP was released, but GPU performance was meh and Windows games generally didn't work well unless they were written to target NT. As I recall, there was one commercial graphical strategy game for NT at the time.

XP broke the subsystem isolation of NT in favor of enabling Windows non-NT types of drivers with higher levels of kernel/GPU efficiency due to less isolation. This also brought in the ability of the drivers to crash the kernel.

And that, my friends, was the end of rock solid Windows NT stability.

I was on the Windows team during the XP SP2 years. Stability was somewhat reachieved by an intense focus on defensive coding to detect rogue drivers and the establishment of WHQL, the Windows Hardware Quality Lab. WHQL was basically an investment in driver analysis tools and a moderately sized team inside MSFT whose sole job was to debug and fix other people's drivers.

It's not a good replacement for isolation though, and it requires sustained continuous effort by both MSFT and the Windows hardware partners, which imho hasn't continued.


Windows 2000 was fairly solid as long as the device drivers weren't too exotic.


> Though playing mp3 was a heavy task consuming most of the CPU. It was hard to do anything while listening to music

A Cyrix MX166 PR200 with 32 MB of RAM played MP3s just fine while compiling the (Linux) kernel, and stayed snappy at the same time.

Software has just become more bloated. Software engineers these days are not constrained by resources, and this leads to a lot of bad practice.


I remember in the 90s having to actually expand my MP3s back into WAVs so my poor CPU could play them in realtime.


In the 2000s I once talked my mom into getting me an old Mac Centris 610 from eBay. I attempted some C programming, but I had the most success reencoding MP3s from the Final Fantasy 8 OST and copying them over to the Mac on floppies lol


Two comments in this thread (this one and a sibling) mention needing to save often due to programs crashing, as if software reliability were solved by showing ads and tracking activity, or were a function of program size or computer speed - the things FeistySkink had mentioned.


Not sure about the ads, but telemetry definitely increases reliability. It lets you debug very specific issues with comprehensive information about the user's environment. Source: I've debugged a proprietary app multiple times using telemetry. Some of the issues discovered were not reproducible without recreating very specific environments.

Surely there are downsides to telemetry and tracking, but they can be useful for development.


MS would like to disagree. There are bugs that haven't been closed after years (mice registering double clicks, blacked-out parts of the GUI, programs starting on nonexistent screens).


Snappy is not how I remember the I/O on those machines.


486 was certainly not snappy, though Win95 did technically run even on a 386 with 4MB of RAM (i know because i tried it back when Win95 was new).

However on the Pentium MMX@200MHz with 32MB of RAM i had ~1996/97 it was very snappy. On the same machine i also ran my first Linux, Caldera OpenLinux 2.3 in 1999, with KDE1 where it also felt very fast.


I find it hard to believe disk was snappy compared to modern devices, since the original comment mentioned I/O. CPU/memory performance comes down to program behavior in addition to hardware; disk is more about size and hardware. And hardware improvements have dwarfed size increases (rather, facilitated them to some degree).


Depends on what sort of disk I/O you have in mind. Floppy disks were never snappy, but HDDs were fine for the most part until the OS started using them for virtual memory. Keep in mind that things were much smaller in those days - i remember thinking that my 2GB HDD would never fill up :-P.

Also in the last few years disk I/O optimization took a massive nosedive, especially under Windows. My 2013 laptop with its ~5400rpm HDD was so unusable with Windows 10 that even the login screen crossfade animation stuttered from disk I/O, and updates took literally hours. Meanwhile Windows 8, released only a few years prior, was able to boot my older 2008 laptop from power-on to desktop in ~22 seconds (i was so impressed at the time that i uploaded a video of the boot process [0]).

If you are used to the recent awful I/O performance it can be hard to imagine things not always being like that.

Besides you can also check a bunch of YouTube videos with old computers running Win95, here is a short one i found that goes through the bootup process and clicks around the OS for a bit.[1] The computer in the video is actually slower than the one i personally had too.

[0] https://www.youtube.com/watch?v=Ti3LQHXZ0Qg

[1] https://www.youtube.com/watch?v=yfh5ZcDhdZA


There was plenty of software and games that could fit on a floppy or two.


Haha I had the exact same thought. I still hear that “brrbrbrrrbrre” of the HDD arm frantically trying to deal with the overload of doing anything whatsoever.


I do miss the sounds of older, more mechanical machines though. You could be doing something else in the room while your Very Long Task was running, waiting for the silence that signaled it finishing.


The I/O at the time was terrible, also because, if memory serves me right, plenty of disks/controllers didn't support DMA transfers.

But if you had a few megabytes to spare for caching (not everybody did, for sure) the machines would be snappy, because the UI was.


I don't recall it being snappy and I also recall it being crashy. But it was a lot better than what came before it. Given the hibernation Apple was in at the time, it was the best alternative. Between the 95 launch and OS X 10.3 was the only time I used Windows as my daily driver.


> Given the hibernation Apple was in at the time,

System 7 and Mac OS 8 were pretty good from a UI perspective, but the lack of memory protection and preemptive multitasking was not great. For a long time "MacOS 9" was synonymous with classic Mac, but I always remember 8 being the last tolerable one; by the time 9 came around the foundations were more clearly inadequate. It was like the WinME of Mac OS, tiding users over until OS X.

> OS X 10.3

Yes. I remember some commentary on the first few versions of OS X was that they were resource hogs.


it was a lot worse than the sgi workstations i was using in 01994

also, it was a lot worse than the bsdi bsd/os ("bsd/386") workstations in the same room with them, and those were the same hardware win95 ran on. and it was a lot worse than my friend's linux box


I'm talking about general UI responsiveness. Perhaps this is just nostalgia.


Watching 90's GUI programs is always fun because you can see the graphics as they are drawn, which I don't seem to remember from back then.


If you're nostalgic for that, there's plenty of modern software with a watch-it-paint-itself UI. <coughSagecough>


I have never heard anyone call Sage "modern software" before : - )


Now we have spinners that twirl while we wait for graphics to transmit from another continent.


And GUIs which redraw the whole screen when the cursor blinks. /s


It is. Windows UIs didn't get Amiga levels of responsive until Windows 98SE or so.


And then lost them again during the gap between Vista's launch and SSDs becoming ubiquitous...


I had some of the first SLC SSDs. It didn't help at all. Windows 7 was a massive improvement though.


I have a SSD on my work machine. Win 10 is still slow.


There's some truth to it just for vendor-supported graphics driver reasons alone.


Fast on a Pentium 75, dreadfully slow on 486/25


A 486 with a few megs of RAM could not run Windows 95 in a snappy manner. It would swap like bugfuck. To do Windows 95 usably, you probably need 8 MiB minimum, 12 or 16 MiB ideally.

Windows 3.x you could get away with 4 MiB. And that and Office 1.0 could fit in 100 MiB. Even then, we complained about it.


linux at the time could theoretically run in 2 mebibytes but worked a lot better in 4


According to Wikipedia, Office 95 required 28 to 88 MB of storage.


> How did we get to the current state of software?

You mean how did we get to a point where :

- I don't have to mash Ctrl+s because software crashes constantly

- segfaults are not a daily occurrence

- any random executable can't crash the OS (hell modern windows can survive a GPU driver crash)

- I can use the internet relatively safely

- I don't need to restart my device after every install

etc. ?


Completely agree. Modern software is so much better than old software. Fuck the "ooh it runs on 100MB of RAM" I don't fucking care. All I care about is that it's so much more reliable than before, I can actually get my shit done now.


On the rare occasions that I have to boot Windows to run some government-compatible software on a 32-core, 64GB-RAM, I-don't-even-care-to-check-how-fast-SSD machine, I have to listen to it firing on all cylinders while Defender fights the Malicious Software something and Windows updates itself, all while I'm typing some text in a barely responsive page. Truly getting my shit done.

Edit: Forgot all the random software doing auto updates and notifying me about it as I go.


That's the price of living in a world where you're connected to the internet by default - a huge amount of performance goes towards security (sandboxing/scanning/rules). Windows 9x didn't even have a firewall FFS.

Windows has to choose defaults for a huge audience; if you daily drove it, it would probably just work. Way better than 95 would.


Eh, no. Linux and BSD had process isolation and ran packet filtering with little effort on contemporary hardware.

You should also be aware that early android/ios phones did all this in "modern times" on limited hardware by today's standards.


> if you daily drove it it would probably just work

More like, "if you daily drove it you'd get used to all the awfulness".


No, you can add defender exceptions/disable it and updates are staggered.


> > How did we get to the current state of software?

> You mean how did we get to a point where :

> - I don't have to mash Ctrl+s because software crashers constantly

Office 365 would like to disagree. Not constantly, but not reliable either.

> - segfaults are not a daily occurrence

Except that Teams just dies without even writing anything to the console.

> - any random executable can't crash the OS (hell modern windows can survive a GPU driver crash)

Yes, but a docking station is the last straw. Power management on USB is hard.

> - I can use the internet relatively safely

Relatively

> - I don't need to restart my device after every install

But after every update ...

> etc. ?

We are living on a planet that's revolving and evolving ...


Hot take: developers today are nowhere near as talented as those of 1995. Every single dev in 1995 was the equal of a principal SDE today. You had to be, because unless your code met that bar, it just didn't work.

There were no guardrails. There was no borrow checker. There was no ASAN. There was no valgrind. Christ, you barely had a debugger. You couldn't copy and paste answers from ChatGPT or Stack Overflow: you either RTFM and comprehended it, or your code didn't work at all.

Today, we spend an enormous amount of compute, memory, and latency compensating for the modal developer being unable to do safe memory management or concurrency.


Windows 95/98 had a lot of problems due to its DOS legacy. A better focus for discussion of what was possible at the time would be BeOS or QNX.


We got to the current state because you left out most of what the OS does now. That's like comparing a 787 to a Cessna, and stating "airplanes used to be light and easy on fuel - how did we get to the current state of the airplane?"

I have a work account that I can use to log into all my devices. All my docs and even the stuff on my desktop is magically synced and backed up, and I even have versions. I can open a visio on my laptop in the kitchen, draw some stuff while I wait for the coffee, then go sit on my desk and keep working on that file. When I go for a refill, it's still open in the kitchen and I keep working. Yes - one document, open at the same time on 2 machines, changes being synced live on the screen w/o closing.

When I change my preferences to not select the entire word at a time, it magically saves in every Outlook install on all my machines. Plug something in and the driver for it is magically downloaded and installed. "Have Disk"? Remember that? Monitors of different sizes and shapes and densities - many of them.

Gramma can use the computer now - you don't need a geek. Yes, the gramma interface is annoying for the tech user, but the tech user can easily put in a new window manager or tweak registry - the new dumbed down default opened computers to everyone - not just geeks. Tracking? Hmm, don't see any. Maybe because you can (a) turn it off, and (b) you can install a version w/o it. Yes, if you install the consumer version of Windows made for gramma, it will track everything. That's because it's for gramma, and tracking helps make the next version easier for gramma.

You also don't want gramma to be able to turn off updates, because she will. Updates on my machine? They were set to download and notify - never install, by default. But no, you don't get that on the gramma home version.

Do you ever get into a cessna and complain they don't have room for all the 40 members of your group? That's not a valid complaint. That's a bad choice of right tool for the job.

The OS is big because the 787 is big. And I say all that, as someone who hates MS and Windows with a deep, long, hard, smelly brown passion.


Does any of this require a supercomputer? How has the computing paradigm fundamentally changed in 30 years?


This is something that I've wondered about before. I've seen many people with that opinion: UIs in the past, although dated looking now, were very intuitive, in that every interactive thing looked interactive.

Contrast that with amorphous, flat, "clean" designs we have today.

So what could we do to have something that looks nice and modern, but is also intuitive, with clear affordances?


Windows 95 was developed over several years with a lot of research and user studies. The primary objective of all this work was so that a person without prior exposure to computers could figure out how to use it. The software was designed to help the user accomplish the tasks they have set out to do before sitting down at the computer.

With most modern mainstream interfaces, both the audience and the objectives have changed. The overwhelming majority of the audience is now computer-literate and can be counted on to figure out how to use the basic features regardless of their quality. And the design objectives for many interfaces have changed to manipulating the users to do what the developer wants them to do, with the user's tasks relegated to a side quest.

Another factor is that Windows 95 was developed for "install it and leave it" operation, with the next upgrade coming in a year or two. A lot of attention to detail was unavoidable because you could not push an update a week after release to patch a bug. At the same time, Microsoft was offering real live person phone support for its products, so every UI flaw cost them a lot in support costs.

Also, one of the design requirements was to fully support only-keyboard operation, and a lot of QA effort went into ensuring this was true. This meant that every UI element had to have a keyboard cue. This duplication of accessibility resulted in a UI which was much more thoroughly evaluated for its efficiency.

Finally, I think there was a different culture around design. I think a higher portion of people in the UI field were there for personal and passion reasons, and would have been embarrassed to have anything to do with the type of blundering buffoon of an interface that is common in today's computing.


Excellent observations and analysis. Thank you. Indeed, there is so much going on under the tip of the iceberg.


I've been in development for 24 years and now I have issues with some very strange UIs. Where do I click to post?

Old UIs are consistent. Buttons are buttons, labels are labels, and so on.

New UIs are inconsistent. In the facebook interface, the post button is not a button, it's a label. It may have changed since I last used it. It also occurs in native programs.

In older UIs, button size makes sense. Today every app recreates the buttons, with absurd sizes.

Contrast. Designers love low contrast in the new UIs. Here in HN it is sometimes very difficult to read the text.

As a developer, I participated in several meetings about new UIs. "Beauty" is the keyword. The beauty seems to be in the rounded corners and the use of labels instead of buttons. Yes, I created a bunch of shitty UIs.

But most importantly, dark patterns don't exist in old UIs. Dark patterns are the worst thing about the new UIs.


Even with old UIs there was definitely misuse of UI elements. Clickable labels that opened things, buttons made of images. Apps that were basically an imagemap that you clicked on. Think early media apps like RealPlayer.

Although I don't know if you'd really classify these as "old UIs" - they were definitely old, and had a UI. You probably mean "use native platform widgets where possible, and make any new widgets look as native as possible".

But then some apps bridged this gap nicely. Winamp was very "non native" but the default skin was consistent, clean, and fast.


> Even with old UIs there were definitely mis-use of UI elements

Yes, but they were exceptions. Interface Hall of Shame has some examples http://hallofshame.gp.co.at/

> Winamp was very "non native" but the default skin was consistent, clean, and fast.

Visuals aside, Winamp followed certain standards. Buttons were buttons, for example.


In the 90's, the concern was teaching new users how to compute. Personal computers for the home were still relatively new, and many people were using them for the very first time. Everyone was concerned with usability.

In the 20's, the concern is onboarding users that already have devices and pay money to use software. We don't think as much about familiarity, teaching, etc.


That's it, perfect.

Perhaps the most important thing is to keep the user on the platform, preventing their departure as much as possible. Thus, a "unique" interface helps with this.


That would make sense if those unique UIs didn't themselves change fairly drastically every few years. I loved the older iOS devices because of the physical home button - I could give it to my grandma and say, "if you ever get lost, press this and you'll be back to home screen" - and it worked wonderfully... until they removed the physical button. On the latest iPad, you have to swipe from below the edge of the screen instead, and do it fast enough (if you do it slow you get the task switcher instead of home). Needless to say, this requires relearning.


At one of my jobs I worked on change for the sake of change... there was no real meaning to the change, the boss just wanted change. I don't have a good theory about it, but it's now very common. Apple, Facebook, do it all the time.

Is it to create a "fresh product" feeling? I don't know.


> in that every interactive thing looked interactive

I agree in general though i'm not sure Win95 was that consistent. Buttons did look 3D but menus - also interactive - did not. Similarly with the context menu icon at the top left, it was interactive but didn't look like it. Meanwhile Win98 had toolbar buttons become flat.

There was an attempt with Win3.0 (i think) to define a form-follows-function approach where "if something can be interacted with it should be 3D", but even that didn't go far (checkboxes were not 3D, nor were edit boxes). By late Win3.1 the ctl3d (i think) DLL was introduced to make everything look 3D regardless of function.

In comparison to today, sure things were better, but it isn't like there was any point in time where things were consistent in terms of control/widget look following its function.


Not every control was 3D, but for example in the menus you had mouseover highlighting that would suggest that you can click the item. And if for some control it wasn’t mouseover highlighting, then it was a tooltip or a changing mouse pointer.


That was in Windows 98, Windows 95 did not have any mouse hover highlighting. And from a form-follows-function perspective it is still inferior to having a distinct visual style for interactive elements without needing any interaction from the user like mouse hovering.


I’m sure there are many ways to do a restyling, similar to the Apple Aqua UI [0] for example, that could create a fresh look, while maintaining the clear affordances.

[0] https://en.wikipedia.org/wiki/Aqua_(user_interface)

The issue, I suspect, is that the young UI graphic designers working at Microsoft today mostly didn't grow up using PCs and therefore don't really know what they're missing.


Currently running Chicago95 on Linux! https://github.com/grassmunk/Chicago95


As someone who used Windows 95 when it came out, it struck me at the time as an off-brand Macintosh, making it feel kind of cheap (a feeling that never went away). I had also been using X Windows by way of SGI's workstations around that time, and IRIX felt clean, if a little old-school, by contrast.


95 felt amazing if you were used to 3.1. I had some exposure to Mac then and I liked it a lot, but somehow I always saw it as a different thing, rather than Windows being a knockoff. (Obviously there was a lot knocked off. It was really funny to me that they introduced a "recycle bin" to copy the Mac "trash".)


Windows 2000 was the pinnacle of Windows UX, change my mind.


Why would I change your mind when you speak the truth?

I do wonder what the Apple equivalent to Windows 2000 would be, maybe Mac OS 8?


The Platinum theme used by MacOS 8-9 was probably the closest-looking to that Windows 95 look and feel. And since Windows 2000 was a combination of that old look and feel on top of a modern kernel (NT), one might argue that the Mac OS X Developer Preview was the nearest Apple equivalent.

The first two Developer Preview versions of Mac OS X used the Platinum theme, not the Aqua theme that ultimately shipped with 10.0.

You can see screenshots here: https://guidebookgallery.org/screenshots/macosxdp1


The Windows interfaces peaked with Windows 2000 which is similar to Windows 95. It has been downhill since.


Not to mention those beautiful tiling wallpapers. I'm partial to the simplicity of the blue rivets, though it may just be nostalgia...

https://windowswallpaper.miraheze.org/wiki/Windows_95


Windows 95 and 98 are one of the best things that I have ever come across when it comes to Software.


What about the regular crashes that always happened at the absolute worst time possible? For me, Windows 2000 was a godsend from a stability perspective. I ran it until it was truly no longer viable.


I swear, the first thing that came to my mind too. So clear. So precise. Why did we seriously take a turn for the worse here? Is there any research on this?


> No MSVC involvement whatsoever

> all I had to do was extract a bunch of headers and link libraries from the Visual C++ 6 SDK

Hmm . . .


I wonder whether it's actually legal to extract & independently redistribute these libraries. https://github.com/haileys/vc6-sys/tree/main/sdk/lib


Those files just reflect the publicly available Windows SDK, so it is pretty much OK to extract them - they reflect public knowledge. Actually, you can even create your own import libraries with special tools like IMPLIB.
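IMPLIB is Borland's tool; on a Linux cross-toolchain, MinGW's dlltool fills the same role. A hedged sketch, assuming a kernel32.def export list already exists (e.g. written by hand or dumped from the DLL):

```shell
# Generate an import library from a .def export list instead of
# redistributing Microsoft's .lib (-k strips @N stdcall decorations)
i686-w64-mingw32-dlltool -d kernel32.def -l libkernel32.a -k
```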


The license says you cannot redistribute the debug version of the libraries.

That’s okay though because the build tools are freely available from Microsoft. Don’t bundle them - just tell your users where to get them or auto-download them with wget or something like that.


That you can’t distribute the specific compilation of publicly available knowledge is different from recompiling (even all) that same knowledge into your own work.

It’s why you can publish and have copyright over a book of quotes from dead people, but no copyright over the individual quotes themselves (which are in the public domain).


in the usa (and i think in berne convention countries in general) some quotes are long enough to be copyrightable, and the copyright can last decades after the author's death


SDK headers and linker symbols for core system libraries would enjoy very strong fair use protections, though. You can thank Oracle and Google for that.


some fair use protections, but not very strong ones


The problem with Windows 9x is the 'ANSI' APIs everywhere, and no support for Unicode.

Oddly enough, we are actually going backwards from using the UTF-16 APIs back to using 'ANSI' APIs, and simultaneously setting the process code page to UTF-8. Any time you use ANSI APIs, there is a hidden conversion step (allocation, conversion, copying) where your text is converted into UTF-16.
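The hidden conversion is visible from the Rust side too: Rust strings are UTF-8, so every call into a UTF-16 "W" API re-encodes the text first. A minimal sketch (the `to_wide` helper name is my own, but this is the standard idiom):

```rust
// Sketch of the extra allocation + conversion step described above:
// Win32 "W" APIs take NUL-terminated UTF-16, Rust strings are UTF-8.
fn to_wide(s: &str) -> Vec<u16> {
    // Re-encode and append the NUL terminator the Win32 ABI expects
    s.encode_utf16().chain(std::iter::once(0)).collect()
}

fn main() {
    let w = to_wide("héllo");
    assert_eq!(*w.last().unwrap(), 0);
    println!("{} UTF-16 code units (incl. NUL)", w.len()); // prints 6
}
```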


> The problem with Windows 9x is the 'ANSI' APIs everywhere, and no support for Unicode.

Didn't you just need to install https://en.wikipedia.org/wiki/Microsoft_Layer_for_Unicode aka UNICOWS.DLL to fix that?


if you're interested in cross-compiling win32 programs from linux without msvc you might be interested in http://canonical.org/~kragen/sw/w32dry/

i wrote this 15 years ago but it still works on this recent ubuntu system if you supply a couple of variables to make

    make run CC=i686-w64-mingw32-gcc WINE=wine
no rust tho

in debian/ubuntu the package to install to get i686-w64-mingw32-gcc and the attendant header files and libraries is gcc-mingw-w64-i686

amusingly the executables i built 15 years ago still run on current gnu/linux; not sure if that would be true for things linked with libx11 and libxext, certainly not libgtk


Wow, fantastic!

> amusingly the executables i built 15 years ago still run on current gnu/linux; not sure if that would be true for things linked with libx11 and libxext, certainly not libgtk

win32 is how you write software you want to run forever thanks to the large user base.

I'm hoping for a wine cosmopolitan binary that would render to sixels: that would run on about everything, with qemu ensuring it keeps working if/when arm64 or mips64 take over amd64


glad you enjoy it!

sixel is not very widely supported, and the baud rate of a vt340+ is not really adequate for a gui


I can play videos with mpv and a sixel output, so I think the framerate could be enough with some minor tweaks (like how RDP does)


what are you displaying the sixel graphics on?

because the maximum baud rate of a vt340 is i think 38400 baud, which is 3840 six-bit characters including the start and stop bits, so with four bitplanes (16 colors per pixel) you only get 5760 pixels per second; a full 800×480 frame thus takes a bit over a minute to paint, minus whatever areas the rle compression can remove
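The arithmetic above checks out; as a quick sanity check (numbers taken from the comment, 10 bits per character assuming 8-N-1 framing):

```rust
fn main() {
    let baud = 38_400;
    let chars_per_sec = baud / 10;              // start + 8 data + stop bits
    let pixels_per_sec = chars_per_sec * 6 / 4; // 6 pixels/char, 4 bitplanes
    let frame = 800 * 480;
    // prints "5760 px/s, ~66 s per 800x480 frame"
    println!("{} px/s, ~{} s per 800x480 frame", pixels_per_sec, frame / pixels_per_sec);
}
```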

i think sending a byte stream of raw pixels with some framing information is a reasonable thing to do for many applications, even at 32bpp, but you can do a lot better than sixel

x11 for example


I think he is doing it in `xterm`, meaning it is pointless since you could just use X11 output directly - it's just interesting.


I wrote part of a Markdown editor in Rust, with the intention of bringing it to Windows 95. I have a lot of newer work I need to bring in, but the old version can be found here: https://github.com/julian-goldsmith/rustpad

It should build fine as-is for plain i586-mingw, but W95 requires a custom toolchain with a modified core crate.


Rust officially supports kernel >= 3.2, which is admittedly old enough to cover any real world usage scenario. That being said I wonder if someone tried to get a Rust program (or even better the compiler and toolchain) to run on an old 90s Slackware or something? That'd be an article I'd love to read.


I'm assuming cross-compiled from a far more modern computer. Or could a Pentium I actually complete a Rust build?


IIRC you can run rustc on ancient hardware if you compile it yourself. You can't, however, compile the compiler on 32-bit hardware due to the 4GB RAM limit.


I would guess that the first problem you'd encounter is getting LLVM to work on that Pentium I.


The title raises the question: how do you compile Rust with MSVC?


MSVC is the default compiler backend for Rust on Windows. You can also switch to GCC tho: https://rust-lang.github.io/rustup/installation/windows.html


Unless you're using one of the GCC Rust compiler reimplementations, rustc only ever uses LLVM. MSVC can't compile Rust, and the Rust compiler can't use MSVC to compile things. What Rust and LLVM support is various ABIs on Windows, meaning your Rust code can link Windows libraries (and the other way around).
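For context, the two Windows targets rustc ships differ only in which linker and import libraries they use - codegen is LLVM either way. A sketch, assuming rustup is installed:

```shell
rustup target add x86_64-pc-windows-msvc   # links MSVC-style import libs (e.g. via lld-link + xwin on Linux)
rustup target add x86_64-pc-windows-gnu    # links via MinGW-w64
cargo build --target x86_64-pc-windows-gnu
```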


Depending on how far you want to go: https://news.ycombinator.com/item?id=31112273



