An HTML file virtually unchanged in fifteen years still renders cleanly in a modern browser. Backwards compatibility across open standards is a wonderful thing.
Old MS Office file formats were never engineered to be future-proof in any way. They're a huge headache for Microsoft, they do indeed behave oddly in new software, and Microsoft cannot wait to get rid of them; honestly, nobody in their right mind can blame them.
I'd like to expand on this. The Microsoft Word team is the only group of people that ever had any say over what goes in a Word document. In the old days, a Word document was nothing more than the serialized version of what Word had in memory. Photoshop files are the same way. As soon as you're not the only one with a say over the file format, you turn it into a standard. And you step up to make it a good standard because otherwise nobody's going to use it.
To keep going: a standard that is not open will never open itself up to multiple implementations (see Microsoft's old Word XML format). The problem, of course, is that if I cannot implement your standard, you remain its only consumer. So really:
Standards with producers/consumers from different independent teams from different organizations with different goals are awesome.
"Open standards /rock/." is a bit overblown, don't you think? There are still very real problems with open (read: sort-of "open") standards. In the end we all just want something that works. There are indeed a great many proprietary solutions which satisfy this criterion, and continue to do so for many years.
We're much better off with formats that were documented with interoperability in mind in the first place. Historically, it's been quite difficult to build products that work with MS Office formats, as MS not only did not help meaningfully but actively sought to hinder others from doing so; on top of that, the format is buggy and odd. Read what Wikipedia has to say on the history of the .doc format:
>Some specifications for Microsoft Office 97 binary file formats were published in 1997 under a restrictive license, but these specifications were removed from online download in 1999. Specifications of later versions of Microsoft Office binary file formats were not publicly available. The DOC format specification was available from Microsoft on request since 2006 under restrictive RAND-Z terms until February 2008. Sun Microsystems and OpenOffice.org reverse engineered the file format. Microsoft released a .DOC format specification under the Microsoft Open Specification Promise. However, this specification does not describe all of the features used by DOC format and reverse engineered work remains necessary.
I don't disagree. I'm just sensitive to the "open standards /rock/" thing. It glosses over a lot of nuance. What I'm trying to suggest is that something being an "open standard" does not necessarily make it better. Historically, there have been a number of open standards which have been failures for technical reasons, political reasons (within the community behind the standard), no good reason, etc.
Yeah, you're pointing at the winners. Think back 15 years: how many competitors were there just in the realm of word processing? Every format but the eventual winner (which couldn't be determined at the time) is now completely useless, whereas an abundance of old dud open image formats can still be opened (which is nice, because some are really easy to generate programmatically).
I don't think that's true. I can still open WordPerfect docs from 15 years ago, with less faff than it takes for, say, VRML (which was an open standard).
He's simply saying that backwards compatibility and open standards are good things in that they don't break old sites/docs/etc...
You could have just as easily said the same thing and ended with "So I don't really see your point about backwards compatibility in this case".
Backwards compatibility is often the forced result of open standards because when multiple vendors adopt those standards it is in their best interest to not make breaking changes.
This in turn is (obviously) a good thing, in that things designed for those standards will remain functional indefinitely.
A .doc file created 5 minutes ago in Word can't be assumed to open correctly in any other version of Word without testing.
I used to work at a research institute, and the people who used Word always spent more time than the LaTeX users getting their papers to look as nice, particularly if you crossed platforms or had change tracking enabled (hello, undeletable footnote in the middle of the page…). The barrier to entry was higher, but if you needed reliably consistent formatting there wasn't really a better option, with the exception of HTML5 + CSS + your favorite MathML shim if you were submitting to a really web-centric journal.
> emacs outshines all other editing software in approximately the same way that the noonday sun does the stars. It is not just bigger and brighter; it simply makes everything else vanish [1]
I think if you replace "emacs" with "TeX" and "editing software" with "document formatting system", the quote holds just as well.
PDF was not an open standard until 2008. Just saying. It could have been abandoned by Adobe while it was still proprietary, and you would have been left with unreadable files in 2013. That wouldn't be the first time such a thing happened, either.
HTML is far more solid in that respect: as long as you have a browser, on any platform, you should be able to open the document and get very similar access to its contents. There's nothing much comparable here. Your "doc" example also doesn't make much sense on platforms where official Microsoft software is not available (Unix, anyone?)
It was not an open standard, but it was nonetheless rigorously documented. If you wanted to modify a PDF (as I did in 2006) it was fairly simple to create an editor in the language of your choice, using only those documents as a guide.
You're lucky you only have to read PDF or DOC files (or as somebody said: "the winners").
These last two months, I've manually parsed old databases[0], some dating back to 2001. Locked formats with unknown specs, schemas embedded in applications (good luck digging those out with IDA), data storage that makes no sense, tearing your hair out...
[0] Drop me a mail if you want to reverse-engineer an HyperfileSQL DB from around Y2K, I can help on that if you pay me with Red Bull.
At least the contents make it somewhat understandable, since it's mostly listings of data.
What I'm talking about is table layout used for purely aesthetic reasons. I've seen so many of those sites making bad design decisions at every corner (text embedded entirely as graphics, etc.), and they are usually the ones voicing concerns about SEO...
Tables are honestly just more convenient than CSS hacks sometimes. In this case, by hacks I mean when it becomes clear that you're spending more time bending CSS to do what you wanted to do with a table than it would've taken to maintain the table.
I've never understood the argument that "semantic" tags are easier to render for screenreaders. Like a screenreader knows that <em> is for emphasis, but couldn't possibly render <i> the same way it renders <em>. Or knows that a <div> must be for layout but couldn't possibly treat <table> the same way (modern UI toolkits build tables out of divs, so it's not like div-users semantically distinguish between layout and actual tabular data).
It's not the tags themselves that are "semantic" but how they are used. The semantic web is a content-first approach, as opposed to using tags to design the structure of your page. If you're adding divs to create a specific layout, you're doing it wrong.
That said, I think that CSS is inadequate for the task of controlling the layout of elements so there are very few truly "semantic" layouts.
It's the same problem as deciding whether an API is RESTful enough or not. Eventually, pragmatism wins over dogma. Still, I think (like REST) the semantic web is a good ideal to shoot for.
>knows that a <div> must be for layout but couldn't possibly treat <table> the same way //
How then would you treat actual tables, like <table noreally="true"><tr><th> ... ?
Screenreaders treat tables differently because they carry semantic weight - table headings, rows and such relate to one another in a way that mere areas of a web-page do not.
It's not the table tag that's bad; people shouldn't use divs for marking up tabular data (or tables for laying out non-tabular content).
I worked in IT around '96 and was lucky enough to commandeer a 21" 1600x1200 CRT at the time. 2D was OK, but the Pentium 90/133 I had could only power Quake at 1280x1024 at about 5-10 fps, so I had to crank it down several notches.
I was half expecting some of the words to blink at me, but, of course, support for the blink tag has been removed from modern browsers. Hooray for opinionated code.
What that doesn't mention is how much better the game looked when you ran it on something like a 3dfx Voodoo card. I remember when I upgraded from an S3 card to a Voodoo 3, the game suddenly had things like windshield glare and partially transparent water, and much higher texture resolutions.
I had a Pentium III with an S3 Savage; the thing ran anything. I remember being pleasantly surprised when I could play hardware-accelerated Need For Speed 4, for example.
Haha, so true. I remember I had an S3 Virge DX back then, and in the few games that supported the card it was actually slower to enable 3D 'acceleration' than to use software rendering. It didn't add much, if anything, to the quality of the visuals either ;-)
Now compare this with modern mobile devices, which run slowly and can do only a fraction of what PCs managed 15 years ago at a tenth of the clock rate.
1. Compare apples to apples. What could something the size of a phone do in 1998?
2. The resolution of the iPhone 5 is 640 x 1136. The game ran at 800x600 with a fraction of the colors.
3. That game sold for what... 35 bucks? Are you comparing a AAA-budget game with something two developers make in a month and a half and sell for 2 bucks? Yes, an HTML5 game is going to be slower, but compare one of the top games for mobile systems, and the mobile game today is more impressive. Perhaps 'Need For Speed: Most Wanted'?
4. Clock rate isn't everything. It never has been. Further, the equivalent Pentium 133 laptop lasted 1.5 hours max with a battery the size of a half dozen iPhones. Power matters.
5. Could a mobile-sized device, sans phone (FCC issues) be created that is tuned for speed instead of battery, with a lower grade resolution (compromise for battery and speed), that was faster? Sure. Would anyone buy it? Probably not. Engineering is about compromises.
And yes, I'd like to be able to use my computer as a general purpose device, but nobody has stepped up to the plate yet. I mean, we could start with a Cyanogen mod and build up from there, but it's a bigger effort than it looks.
But I'll tell you what, let's put it on the backlog. ;)
Drank the retina Kool-Aid? I couldn't believe it myself when I started doing the math. The Nexus 5 has 1.5x the detail for each screen point compared to a Retina iPhone. The iPhone's @2x retina assets are outclassed (somewhat) by the @3x assets of a 1080p phone...
Still doesn't change the fact that 15 years ago "we" (as in programmers, the industry, whatever) were much, much more efficient with limited resources than we are now. Some people consider that a good thing; some of us consider it a bad thing.
"were much, much more efficient with limited resources"
You are cherry-picking which resources you care about. What about power consumption? What about heat? What about physical size? What about noise? I remember the fans in those old PCs; mine sounded like a jet taking off.
1-2 GB of RAM is insane luxury compared to, say, the Amiga, which typically had 1000 times less: 512 KB or 1 MB of main RAM and the same amount of 'chip' RAM (for video/audio processing).
I'm not a mobile device expert, but programmers don't have to think that much about 512 KB on an iPhone, do they? On an Amiga it was often all you had for the entire system, including the OS.
Yeah, but the Amiga drove 640x256 at 16 colors (4 bits per pixel), so around 80 KB of framebuffer. Contrast that with the newest iPhone, which drives 640x1136 at 24-bit color, so around 2.08 MB: about 26x the framebuffer memory (and roughly 4.4x the pixels). Keep in mind you also have a networking stack and lots of semi-realtime sensor data.
Not to mention that the demands on the system in terms of features and performance are so much higher.
26x the framebuffer memory, but you have over 200 times the RAM available, and that's assuming your app only gets 100 MB (and that AmigaOS takes up zero resources). The stock Amiga had an 8 MHz, 16-bit processor, well under a hundredth of what's available now on an average device. I just can't see modern devices being construed as nearly as resource-constrained as personal computers from the 80s.
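The framebuffer arithmetic in this exchange is easy to sanity-check; here's a quick sketch using the figures from the comments above (16 colors = 4 bits per pixel, 24-bit color = 3 bytes per pixel):

```javascript
// Back-of-envelope framebuffer sizes: bytes = width * height * bitsPerPixel / 8.
const bytes = (w, h, bpp) => (w * h * bpp) / 8;

const amiga  = bytes(640, 256, 4);    // 81,920 bytes, i.e. 80 KB
const iphone = bytes(640, 1136, 24);  // 2,181,120 bytes, about 2.08 MB

console.log(iphone / amiga);             // 26.625: ~26x the framebuffer memory
console.log((640 * 1136) / (640 * 256)); // 4.4375: ~4.4x the raw pixel count
```

So the famous "26x" figure is a ratio of framebuffer bytes, not of pixels; the pixel count only grew by about 4.4x.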
Well, I consider my time a limited resource. If trading developer time for an extra 8 MB of RAM in a little HTML web game I'm making is the deal, I think that's okay.
Are you saying that mobile devices aren't capable of playing games of the same graphical and technical complexity as a Windows 95 game? I assure you that isn't true.
No, what I'm saying is that you will never see an iPad displaying the same complex scene that a 500W PC with a dual-SLI card has no problem taking care of. There is, and will always be, a large gap between what fixed and mobile systems can do, because of their inherent design goals.
Huh? My Galaxy S3 is noticeably more powerful than my Micron Pentium II system from 1998, in terms of specs as well as actual performance. The fact that it's running Android and not Win98 doesn't hurt.
Win9x is not a portable OS. Because of its DOS and Windows heritage, the underpinnings are very much tied to x86 and would be difficult to port to ARM. (Although not as difficult as OS/2, which uses the dreaded ring 1.)
Windows NT, on the other hand, has always been portable; indeed, Microsoft has already ported NT to ARM.
Indeed, I realize this. My comment was mainly facetious, I admit. Porting DOS/Win98 would be totally silly anyway unless you needed a blue screen generator.
Porting ReactOS would start to make sense if anyone wanted that: say, there was a desire to run Windows programs on modern mobile hardware without licensing fees (a render farm? most use cases I can imagine would be better served by Linux anyway). Otherwise, emulation such as DOSBox makes the most sense for anyone who wants to run legacy Win32 software.
A 28.8Kbps Internet Connection used to be considered fast!
Optional:
28.8 or higher baud modem for head-to-head play
For LAN play a local area network supporting either TCP/IP or IPX protocols is required
Internet access required for internet play; connect time charges may apply
Microphone for instantaneous voice communication during head-to-head play
Graphics accelerator card compatible with the Microsoft Direct3D® API (for Windows 95 only)
Compatibles:
Supports force feedback hardware compatible with the Microsoft DirectInput® API (for Windows 95 only)
Supports MMX technology
Supports Pentium II processor platform with AGP
In a way, that modem might still be considered good: the head-to-head play meant that one player had their modem dial the other player's modem, so the link that was established was a direct 28.8k symmetric connection with no IP routers in between. As long as you weren't going long-distance, you would have sufficient bandwidth and latency that was good, not great, but very consistent because the PSTN cares about that in a way that ISPs don't.
I remember doing that, but it was awful. We didn't have mobile phones back then as a side channel, so the outcome was basically designing a manual protocol in case of failure: assigning who was responsible for calling whom when the connection dropped, and what to do when someone else in the house picked up a phone.
Um, yep. I remember being stuck on 14.4 back in the day.
Waiting for webpages to load... well, it was like waiting for webpages to load on an EDGE connection: 2-3 minutes per page, basically. Two hours for a browser upgrade download, you know, the kind Chrome does in the background now... and it might not successfully download the first time. Ahh, those were the days. Napster could take 20 minutes to download an MP3 and still corrupt it.
Motocross Madness and Midtown Madness were both such fundamental pieces of my childhood. I wish I could convey the amount of nostalgia I have. Motocross Madness was the first time I actively tried to use a computer to 'do' something. I had to program my own tracks, and it was insanely hard at the time, but god were those the glory days.
I'm with you; that game was amazing. The track designer was such fun. I really wish they would have continued the series. The track designer was really easy to learn, I feel, compared to the current modding tools that come with games. And it had enough stuff to keep me busy for a long while. I must have played that game for a few years.
I basically know the geography of Chicago, having only been there in real life a handful of times, because of my time playing Midtown Madness and the various MS Flight Simulators.
I actually got a bit sad during a trip a few weeks ago when I saw that Meigs Field had been demolished.
Wow. Motocross Madness brings back some incredible memories. I had a force-feedback controller with it too. Loved that moving the controller actually moved the bike. Good times.
The first reason I ever set up a VPN connection was to play MCM with randos across the internet. The first version only supported multiplayer over LAN, so you had to configure a PPTP client and find a "server" to join that had others who were similarly interested in playing. This happened for other games, too (THPS2 for the PC was a big one for me), but I remember being blown away by how many people were essentially hacking around the LAN restriction.
Try refreshing the page and you'll get a different sound-track. Made me curious about the implementation.
Interestingly, they are using a custom random function, a little linear congruential generator:
    today = new Date();
    jran = today.getTime();
    var number = 5;
    var random_number = 1;
    var bgsnd = "/games/random/bgsnd.wav";
    var images = "/games/random/intro.JPG";
    var sizes = " width=602 height=228";

    function randomizeNumber()
    {
        ia = 9301;
        ic = 49297;
        im = 233280;
        jran = (jran * ia + ic) % im;
        random_number = Math.ceil((jran / (im * 1.0)) * number);
        if (random_number == 1) { bgsnd = "audio/intro/yeeha.wav"; }
        if (random_number == 2) { bgsnd = "audio/intro/crash_10.wav"; }
        if (random_number == 3) { bgsnd = "audio/intro/heybuddy.wav"; }
        if (random_number == 4) { bgsnd = "audio/intro/lovethatmud.wav"; }
        if (random_number == 5) { bgsnd = "audio/intro/webintro.wav"; }
    }
And then they run it when the page opens
    randomizeNumber();
    document.open();
    if (version == "n3" || version == "n4") {
        document.write("<embed src=" + bgsnd + " autostart=true hidden=true></embed>");
    }
    if (version == "e3" || version == "e4") {
        document.write("<bgsound src=" + bgsnd + ">");
    }
    document.close();
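As an aside: the constants 9301, 49297 and 233280 form a classic linear congruential generator that circulated widely in early JavaScript snippets. A minimal modern sketch of the same idea (my own code, not from the page; the name makeLcg is mine):

```javascript
// Linear congruential generator: state = (state * a + c) % m.
// With a = 9301, c = 49297, m = 233280 the sequence repeats after at
// most 233280 values, which is plenty for picking one of five sounds.
function makeLcg(seed) {
  let state = seed % 233280;
  return function () {
    state = (state * 9301 + 49297) % 233280;
    return state / 233280; // quasi-uniform value in [0, 1)
  };
}

const rand = makeLcg(Date.now());   // the page also seeds from the clock
const pick = Math.ceil(rand() * 5); // 1..5, like the page's sound picker
```

Note that Math.ceil would yield 0 in the rare case the generator returns exactly 0, which may be why the page initializes bgsnd to a default file before calling randomizeNumber().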
Only one of the group's members, Rio DiAngelo/Richard Ford, did not kill himself: weeks before the suicides, in December 1996, DiAngelo agreed with Applewhite to leave the group so he could ensure future dissemination of Heaven's Gate videos and literature. He videotaped the mansion in Rancho Santa Fe; however, the tape was not shown to police until 2002, five years after the event.
Nostalgia overload. I absolutely loved this game back in the day. Surprised to see you can still even download the demo of the game. I'm guessing this was one of those sites Microsoft forgot about. Look at those system requirements, back when PC power was still measured in megahertz and RAM was still counted in megabytes. Crazy.
Bill Clinton Wants to Put "Big Brother" in Your Computer

Bill Clinton believes in bureaucratic micro-management of the information economy. Within his first 100 days as President, Bill Clinton proposed the Clipper Chip -- a secret government-controlled encryption algorithm -- and a companion key escrow system where two government agencies would hold a copy of the keys for every Clipper user. Since then Bill Clinton has released updated versions of encryption proposals which insist that the government hold a key to individuals' private data communications.
When I read campaign stances on this site last year I was really surprised how socially moderate their campaign was. I'm glad the site is still around.
Bob Dole Will Protect the Constitutional Liberty of Internet Users
Bob Dole is concerned about children accessing unsuitable material when using the Internet. But strict censorship of the Internet is not the answer. Bob Dole believes that parents should take responsibility for the material that their children view, and he wants to encourage technology which allows those decisions to be made within each home.
Throughout his Senate career, Bob Dole has fought to protect the Constitutional liberty of Americans:
Bob Dole is a supporter of the Pro-CODE bill that limits the federal government's control of encryption and user keys. It permits the export of software that includes encryption if the software is easily available in this country.
Bob Dole strongly supports the observations made in the recent National Research Council report that widespread use of encryption to promote information security outweighs the difficulties encrypted communications place on law enforcement. Economic espionage from foreign countries and companies is a serious threat, and Bob Dole believes Americans should have the right to guard themselves using encryption.
Bob Dole supported the Senate hearings on Internet copyright laws. The hearings provided suggestions from information creators, Internet and on-line service providers, librarians and Internet users on developing compromises that balance the rights and needs of all participants.
Bob Dole fought for provisions in the Telecommunications Act of 1996 that encourage parents to take responsibility for the Internet material which their children view.
Bob Dole helped pass the Bayh-Dole act of 1980 which helped create the biotechnology industry by allowing inventions from federal research dollars to be commercially developed.
As President, Bob Dole will:
Promote policies that ensure that the United States remains the world leader in technological innovation.
Reject heavy-handed big-government regulations of cyberspace.
Promote policies that preserve and advance the openness and decentralization of computer-based communications.
Preserve and protect American citizens' right to privacy and the need for secure communications.
> When I read campaign stances on this site last year I was really surprised how socially moderate their campaign was.
If you were surprised by Dole/Kemp from 1996, you should look at when Bob Dole ran for Vice President -- Ford/Dole in 1976.
The 1976 Republican platform [1] featured: Environmental protection, willingness to "negotiate differences" with foreign countries, "vigorous" antitrust enforcement, federally-funded child nutrition programs, support for the Equal Rights Amendment, better access for the disabled, urban development, railroad electrification, recycling, increased funding for the arts and humanities.
On the other hand, "The Republican Party opposes compulsory national health insurance." So that part isn't new. Also, the usual Republican stuff -- lower taxes, bigger military, opposition to abortion, etc.
It feels DOSsy to force the file extension to bend to your three-character limit rather than let it all hang out. Then again, it highlights how accustomed we now are to hiding our filenames behind folders and web apps.
Yup, came here to post this link. I literally was just playing the trial version hosted on this website YESTERDAY. It plays perfectly under WINE in Mac OS 10.9 on my MacBook Air.
Yeah, I immediately thought of that too. I've been to that webpage in the past 1-2 years, it's a game I used to play with friends every once in a while.
AFAIK, the original homepage for "Black Hawk Down", which was a long-form serial newspaper story before it became a book and a movie, still looks as I remember it back in 1997:
I actually owe my programming career to MTM. When I was ~10-11 years old, I held many tournaments. At some point, my tournament grew to the point where I couldn't manage signups on the day of the event, and I tried to make a website using FrontPage + some form of extensions (Angelfire? Tripod?) to take early signups. I eventually learned JavaScript and a tiny bit of Perl.
Looking back at that point in time, I did a horrible job of managing those tournaments (sometimes my pre-teen responsibilities took priority and made me miss some of them, and I seem to remember storing passwords in clear text). But I look at myself today and am extremely happy to be a successful developer, which might never have happened without the MTM1/MTM2 games and websites, from which I'm certain I copied a ton of "codes" back then.
Probably because it's a few static HTML files that are stored with dozens (if not hundreds) of other single purpose sites, whose infrastructure is managed programmatically. (i.e. they have an intern run a script when they need to migrate it to a new cluster, and that's the extent of it)
A client of mine has a site with 4000+ individual pages, and no content management system.
In the corporate world, you just don't touch stuff if you don't need to. There's no time to go and find web pages that haven't changed in 10 years and figure out what to do about them.
It isn't so crazy. Imagine that 3900 of those 4000 files never change. Every few years there is a redesign on a section of the site, but some sections haven't changed in 7-8 years.
It's easy to manage a large site that only makes a few changes per week.
I remember buying the steering wheel for Midtown Madness, which was larger than many desktop monitors today and required a near-bolting to your desk, and feeling like I was on the cutting edge of gaming.
That reminded me why I generally always leave my sound muted unless I need it for something specific. Ugh, remember the old days when every page had a midi embedded in it?
Very strange that they retain the page when the download it points to no longer exists. The pages must turn up in link checks and be specifically flagged to remain unfixed?
The original Monster Truck Madness, along with the original Motocross Madness (with dedicated motion sensitive Microsoft Sidewinder controller), were probably the games I played the most.
If I knew then what I know now, I'd be rich. I sometimes wonder what it would be like if I found myself transported back to the late 90's with a copy of jQuery. I'd blow their tiny minds I would.
This was one of the first PC "open world" GTA style games where I spent more time trying to drive my monster truck up impossible obstacles and exploiting the game physics than I did ever racing.
That's sort of the definition of abandonware - the publisher has apparently given up trying to make money off the thing but also not explicitly released it for free either.
The past two weeks there have been at least one and usually two or more Microsoft related articles. This is even more annoying than the drivel posted by the NY Times.