Hacker News | knorker's comments

I've been promoted several times for adding business value, complex or simple in implementation.

Saving every team from doing X manually saves so-and-so many engineer hours per year, at a cost of Y per year in a single team instead. The higher X and the lower Y, the better.

Backups did not exist. Now they do and work every time.

I wrote the top customer-appreciated feature, associated with an X increase in whatever metric.

"It was really hard" shouldn't come into it, or not as much by far.


How does it factor in? How doesn't it?

If Iran had deployable nukes, would they get invaded?

Name a country that got bombed to credibly destroy the government, and had nukes. I'll wait.


It likely wouldn't be kinetic, but nukes didn't stop us from chipping away at the Soviet Union.

I could be wrong, but I don't buy the public story that this is about regime change. You don't topple a government with air superiority alone, and you don't do it in a matter of days. I also don't expect the US would be okay letting the Iranian people pick who comes next. We have a history of installing puppets and that similarly doesn't happen only via bombing runs.


> I also don't expect the US would be okay letting the Iranian people pick who comes next.

Khamenei: Dead. Ahmadinejad: Dead.

Maybe the US/Israel have a list of people they plan on making "ineligible" to run Iran? Then let the people choose from whoever's left?

We'll see how many people get ticked off the checklist, but if it's a long list then I would say regime change is a more plausible story.

Will it be as obvious as Karzai? Probably not.


> It likely wouldn't be kinetic, but nukes didn't stop us from chipping away at the Soviet Union.

So all it stops is kinetic attacks? Do you not think that's a big deal? I'm pretty sure Iran and Khamenei think that's a big deal.

What do you mean by "How does that factor in here right now?"?

It pretty obviously does. How does it NOT?

> I don't buy the public story that this is about regime change.

They had Khamenei killed.

But this is also a topic very different from the nukes one.


> If Iran had deployable nukes, would they get invaded?

Honestly, maybe? Like if we had high confidence we knew where they were, and Israel consented to the attack, I could absolutely see the U.S. trying to take it out in storage.

If Iran had a nuke that could hit the U.S., I'd say no. But that's a stretch from "deployable nukes."

> Name a country that got bombed to credibly destroy the government, and had nukes

Pedantically, Ukraine.


> if we had high confidence we knew where they were

That's a very big gamble. They only need to hide one on a cargo ship and the attacker is going to have a Very Bad Day.

Nobody's made that gamble yet. Yes, there have been kinetic exchanges between India and Pakistan, and Iran sending missiles at Israel, but no credible threat to the state.

> Pedantically, Ukraine.

Not sure when you mean. Did they get bombed while they still had physical control in the early 90s? They never had operational control, and now that they're being bombed they don't even have physical control of nukes.


I think you're overestimating the intelligence of most criminals. And their gun logistics discipline.

If it were possible to do, it'd help.

Also, removing the marking mechanism would be a process crime. Process crimes are very useful for catching criminals.

Are you against serial numbers on guns too? You can always file those down.


I guess it's gonna have to have it, now.

What? Nowhere do they stipulate you have to add that. They just say if you do account setup, then you need to provide such an interface.

I would say that I understand all the levels down to (but not including) what it means for an electron to repel another particle of negative charge.

But what is not possible is to understand all these levels at the same time. And that has many implications.

We humans have limits on working memory, and if I need to swap in L1 cache logic, then I can't think about TCP congestion windows, CWDM, multiple inheritance, and QoS at the same time. But I wonder what superpowers AI can bring, not necessarily because it's smarter, but because it can increase the working memory across abstraction layers.


I think they are referring to the 1997 hack/leak: https://www.wired.com/1997/01/hackers-hack-crack-steal-quake...


> The first batches of Quake executables, quake.exe and vquake.exe were programmed on HP 712-60 running NeXT and cross-compiled with DJGPP running on a DEC Alpha server 2100A.

Is that accurate? I thought DJGPP only ran on and targeted PC-compatible x86. id had Alphas for things like running qbsp, light, and vis (these took for--ever to run, so the SMP Alpha was really useful), but for building the actual DOS binaries, surely this was DJGPP on an x86 PC?

Was DJGPP able to run on Alpha for cross compilation? I'm skeptical, but I could be wrong.

Edit: Actually it looks like you could. But did they? https://www.delorie.com/djgpp/v2faq/faq22_9.html


I asked John Carmack and he told me they did.

There is also an interview with Dave Taylor explicitly mentioning compiling Quake on the Alpha in 20 s (source: https://www.gamers.org/dhs/usavisit/dallas.html#:~:text=comp...). I don't think he meant running qbsp or vis or light.


> he told me they did.

This is when they (or at least Carmack) were doing development on NeXT? So were those the DOS builds?


I thought the same thing. There wouldn't be a huge advantage to cross-compiling in this instance since the target platform can happily run the compiler?


Running your builds on a much larger, higher performance server — using a real, decent, stable multi-user OS with proper networking — is a huge advantage.


Yes, but the gains may be lost in the logistics of shipping the built binary back to the PC for actual execution.

An incremental build of C (not C++) code is pretty fast, and was pretty fast back then too.

The q1source.zip this article links to is only 198k lines spread across 384 files. The largest file is 3391 lines. Though the linked q1source.zip is QW and WinQuake, so not exactly the DJGPP build. (Quoting the README: "The original dos version of Quake should also be buildable from these sources, but we didn't bother trying".)

It's just not that big a codebase, even by 1990s standards. It was written by just a small team of amazing coders.

I mean, correct me if you have actual data to prove me wrong, but my memory of the time is that build times were really not a problem. C is just really fast to build. Even back in, was it 1997, when the source code was found lying around on an FTP server or something: https://www.wired.com/1997/01/hackers-hack-crack-steal-quake...


"Shipping" wouldn't be a problem, they could just run it from a network drive. Their PCs were networked, they needed to test deathmatches after all ;)

And the compilation speed difference wouldn't be small. The HP workstations they were using were "entry level" systems with (at max spec) a 100MHz CPU. Their Alpha server had four CPUs running at probably 275MHz. I know which system I would choose for compiles.


> "Shipping" wouldn't be a problem, they could just run it from a network drive.

This is exactly the shipping I'm talking about. The gains would be so minuscule (because, again, an incremental compile was never actually slow, even on the PC) and the network overhead adds up. Especially back then.

> just run it from a network drive.

It still needs to be transferred to run.

> I know which system I would choose for compiles.

All else equal, perhaps. But were you actually a developer in the 90s?


What's the problem? 1997? They were probably using a 10BaseT network, it's 10 Mbit... Using Novell NetWare would let you transfer data at 1 MB/s... quake.exe is < 0.5 MB, so the transfer would take around 1 second.


Not sure what you mean by "problem". I said minuscule cancels out minuscule.


Networking in that era was not a problem. I also don’t know why you’re so steadfast in claiming that builds on local PCs were anything but painfully slow.

It’s also not just a question of local builds for development — people wanted centralized build servers to produce canonical regular builds. Given the choice between a PC and large Sun, DEC, or SGI hardware, the only rational choice was the big iron.

To think that local builds were fast, and that networking was a problem, leads me to question either your memory, whether you were there, or if you simply had an extremely non-representative developer experience in the 90s.


Again, I have no idea what you mean by networking being a "problem".


You keep claiming it somehow incurred substantial overhead relative to the potential gains from building on a large server.

Networking was a solved problem by the mid 90s, and moving the game executable and assets across the wire would have taken ~45 seconds on 10BaseT, and ~4 seconds on 100BaseT. Between Samba, NFS, and Netware, supporting DOS clients was trivial.
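The ~45 s and ~4 s figures hold up to a back-of-the-envelope check. A quick sketch, assuming roughly 45 MB of executable plus assets and ~80% usable bandwidth (both numbers are my own assumptions, not measurements):

```python
# Hedged estimate of file-transfer time over 1990s Ethernet.
def transfer_seconds(size_mb, link_mbit, efficiency=0.8):
    """Seconds to move size_mb megabytes over a link_mbit Mbit/s link,
    assuming `efficiency` fraction of raw bandwidth is actually usable."""
    return size_mb * 8 / (link_mbit * efficiency)

print(transfer_seconds(45, 10))    # ~45 s on 10BaseT
print(transfer_seconds(45, 100))   # ~4.5 s on 100BaseT
print(transfer_seconds(0.5, 10))   # quake.exe alone: ~0.5 s
```

At either speed, the transfer is dwarfed by the compile-time difference between a 100 MHz workstation and a quad-CPU Alpha.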

Large, multi-CPU systems — with PCI, gigabytes of RAM, and fast SCSI disks (often in striped RAID-0 configurations) — were not marginally faster than a desktop PC. The difference was night and day.

Did you actively work with big iron servers and ethernet deployments in the 90s? I ask because your recollection just does not remotely match my experience of that decade. My first job was deploying a campus-wide 10Base-T network and dual ISDN uplink in ~1993; by 1995 I was working as a software engineer at companies shipping for Solaris/IRIX/HP-UX/OpenServer/UnixWare/Digital UNIX/Windows NT/et al (and by the late 90s, Linux and FreeBSD).


Ok that's not what I said. So we'll just leave it there.


That's exactly what you said, and it was incorrect:

> This is exactly the shipping I'm talking about. The gains would be so minuscule (because, again, an incremental compile was never actually slow, even on the PC) and the network overhead adds up. Especially back then.

The network overhead was negligible. The gains were enormous.


>> I said minuscule cancels out minuscule.

> You keep claiming it somehow incurred substantial overhead

This is going nowhere. You keep putting words in my mouth. Final message.


Jesus Christ. Networking was cheap. Local builds on a PC were expensive. You are pedantic, foolish, and wrong.

Were you even a developer in the 90s? Are you trying to annoy people?


> I mean correct me if you have actual data to prove me wrong, but my memory at the time is that build times were really not a problem.

I never had cause to build quake, but my Linux kernel builds took something like 3-4 hours on an i486. It was a bit better on the dual socket pentium I had at work, but it was still painfully slow.

I specifically remember setting up gcc cross toolchains to build Linux binaries on our big iron ultrasparc machines because the performance difference was so huge — more CPUs, much faster disks, and lots more RAM.

That gap disappeared pretty quickly as we headed into the 2000s, but in 1997 it was still very large.


I remember two huge speedups back in the day: `gcc -pipe` and `make -j`.

`gcc -pipe` worked best when you had gobs of RAM. Disk I/O was so slow, especially compared to DRAM, that the ability to bypass all those temp file steps was a god-send. So you'd always opt for the pipeline if you could fill memory.

`make -j` was the easiest parallel processing hack ever. As long as you had multiple CPUs or cores, `make -j` would fill them up and keep them all busy as much as possible. Now, you could place artificial limits such as `-j4` or `-j8` if you wanted to hold back some resources or keep interactivity. But the parallelism was another god-send when you had a big compile job.

It was often a standard but informal benchmark to see how fast your system could rebuild a Linux kernel, or a distro of XFree86.


> Linux kernel builds took something like 3-4 hours on an i486

From cold, or from modified config.h, sure. But also keep in mind that the Pentium came out in 1993.


I assumed that obviously this is very clever performance art to show how even the healthiest food can be turned into brainrotting sludge.

But then I look at the comments, and it really looks like some people want this.

Now I'm depressed.


Yeah, that was the intention.


The fact that you have to be more specific than "Scott" says a lot.


That’s more likely just you.

Anyone who knows Apple knows who “Scott” is referring to. Scott Forstall.


Heh, I assumed he was referring to "Scott the Woz", Scott Wozniak, a vintage-gaming YouTuber. I assumed the GP took a more literal tack on "only one 'Woz'", while you took a more symbolic "only one engineer of such quality". In the context of Apple, sure, "Scott" is Scott Forstall, but that's not necessarily the context.


I could be wrong then, if that was their reference. I was in the mindset of foundational Apple leaders, not other Wozes outside the Apple sphere.

EDIT: reading this again, now thinking you are right and they are just being snarky about the “one Woz in the world” existing.


Woz is not just "some guy at apple". He's a force in his own right to the point of being bigger than Apple in some ways.

"Woz" is googlable. His name doesn't need context. "Larry" could be Ellison or Page. "Scott" could be Forstall or Adams.

Who played Scott Forstall in the movie?

Anyway, other comments prove it's not just me, too.


That's crazy because I assumed they were obviously talking about Apple's first CEO.

For the search string "Scott Apple", Google agrees with me, and the Forstall guy is just a secondary mention.


For me he will always be “Scotty”. “Scott” at Apple will almost always imply Scott Forstall.


My first computer was an Apple IIGS and everything since then has been a Mac. "Scott" doesn't bring anyone specific to mind for me. Maybe that connection is automatic for newcomers who immediately think "iPhone" when they hear "Apple."


I had a very long career at Apple. I have also met and spent time with Woz on multiple occasions. I have some bias here.

Possibly my assumption was incorrectly based more on people who actually worked at Apple vs what the normal public thinks of when they hear “Scott” and “Woz” in the context of Apple.


It would make sense that people on the inside would be a lot more aware of him. Forstall was obviously a pretty big name in the community but not to the point of getting a shorthand name like that. And he was mostly forgotten pretty quickly after he left.


That auto flip back and forth between before and after is the most annoying thing I've seen since the blink tag was removed.


yeah I would like to read the code before it switches but nope

