passivegains's comments

There's something beautiful about the kid using performance as language. They've hit upon the greater truth that reading and speech are important because text and the spoken word are powerful mediums, but what truly matters is what they allow us to express to each other.

Though I have worked with children enough to sympathize with the not-beautiful part of this story too. (also that book sounds rad as hell.)


I was going to reply about how New Zealand is as far from almost everywhere else as the US, but I found out something way more interesting: Other than servers in Australia and New Zealand itself, the closest ones actually are in the US, just 3,000km north in American Samoa. Basically right next door. (I need to go back to work before my boss walks by and sees me screwing around on Google Maps, but I'm pretty sure the next closest are in French Polynesia.)


> and then populating document to cover all the aspects of selected law

I'm fairly certain lawyers exist specifically because this is infeasible.


Not only that but: which subsection of law is specific to this case? What additional documentary requirements come with citing that law? What’s the presentation strategy for the target opponent/lawyer/court?


if someone, for example, exchanges funds with a foreign nation to evade sanctions while they illegally occupy another, it really is their state's business.


Only because the state asserts its existence with violence. We certainly have little-to-no say in how the states in which we live behave, but we're all subject to their whims.

Personally, I have little patience for the pretense that the geopolitical theatre we're all subjected to reflects the people who live in the states represented in such theatre. Baudrillard had it right all along.


> Only because the state asserts its existence with violence.

I'm comfortable saying if a foreign nation rolls military vehicles across a border and starts shelling apartment buildings, those buildings' residents have every right to assert their existence with violence.

> We certainly have little-to-no say in how the states in which we live behave, but we're all subject to their whims.

I can't say for certain how much my state listens to me specifically, but if they do, I'd tell them their whims should be such that other states can't do the roll-across-border-blow-up-apartments thing. (to anyone, not just me.) Those are good whims. Everyone should be subject to those whims.


> I'm comfortable saying if a foreign nation rolls military vehicles across a border and starts shelling apartment buildings, those buildings' residents have every right to assert their existence with violence.

Of course, I have nothing against sovereignty. I just think states greatly overestimate what they're owed for simply not mowing down their own citizens. If my state causes another state to invade me, of course I'll side with the invader—my own state has failed me.


the author is pretty up front about this: "I would like to live in a world where professors consider Prolog worth knowing and put in the effort. That probably isn't realistic." He clearly wishes he were wrong, which makes three of us.


Right, but thinking you can change how people teach a subject without their believing it's worth knowing seems unrealistic, too!


there are competing specs, holy wars, etc., but POSIX is kind of like what you're describing. popular distros are usually mostly-but-not-completely compliant.


There was also the Linux Standard Base (https://en.wikipedia.org/wiki/Linux_Standard_Base), but unfortunately it wasn't particularly successful.


apt won't upgrade anything on its own, but if you're using the official images there's probably a service running that calls it for you, typically for security patches and the like.
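
A quick way to confirm that's what's happening (a sketch, assuming a stock Debian/Ubuntu image where the usual culprit is unattended-upgrades; your image may wire this up differently):

    # is automatic upgrading enabled and running?
    systemctl status unattended-upgrades.service
    # the APT::Periodic switches that control it usually live here
    cat /etc/apt/apt.conf.d/20auto-upgrades
    # turn it off if you'd rather drive upgrades yourself
    sudo systemctl disable --now unattended-upgrades.service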

The bigger problem is upgrading packages deliberately and still being surprised by the results. My team's current favorite is the upgrade process itself suddenly growing new interactive prompts that break our scripts.


> My team's current favorite is the upgrade process itself suddenly growing new interactive prompts that break our scripts.

This is how dpkg and apt have worked in Debian and Ubuntu pretty much since their inception. Look into debconf, dpkg and ucf configuration to learn how to integrate these with your automation. The mechanisms for this have existed for decades now and have not substantially changed in that time.
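
A minimal sketch of what that looks like in practice (the tzdata question is only an example; debconf-get-selections from the debconf-utils package shows what your own packages actually ask):

    # (as root) answer debconf questions ahead of time instead of at upgrade time
    echo "tzdata tzdata/Areas select Etc" | debconf-set-selections
    # then run the upgrade with the noninteractive frontend so nothing prompts
    DEBIAN_FRONTEND=noninteractive apt-get -y upgrade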


dpkg grew knowledge of Source lists suddenly?

If you're installing software from Debian/Ubuntu repos, you can only use aptitude or apt to my knowledge. Other tools give you the ability to install DEB files you already have, and manage what's on your system currently. And aptitude and apt are both well known for never having had a "stable" scriptable interface. In fact they themselves tell you that their commands are not stable and should not be used for scripting, despite no alternative mode or application existing.

Recently Ubuntu moved to apt 3 as well, which massively overhauled the tool from apt 2. All those scripts people wrote to use apt 2 (because there was no alternative) broke recently when they now had to use apt 3.


Your understanding is just outright wrong. The `apt` command has an unstable interface so that it can improve the CLI without worrying about breaking scripts. The `apt-get` command is the stable interface for scripts. `apt` was created after `apt-get` became ossified exactly because the developers work hard to keep the interface for scripts stable.

> In fact they themselves tell you that their commands are not stable and should not be used for scripting, despite no alternative mode or application existing.

No, that's just the apt command, not the apt-get command, and the manpage for apt tells you exactly how to do this instead. It's clearly documented, so your "despite no alternative mode or application existing" claim is simply ignorant.

Please read the documentation and learn how to use the tooling before criticizing it and misleading others with claims that are outright wrong.
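
To make the split concrete, here's a sketch (foo is a placeholder package name; apt(8) and apt-get(8) have the authoritative wording):

    # interactive use: friendlier output and progress bars, but no promise of
    # CLI stability; pipe it and it warns that it does not have a stable CLI
    # interface and should be used with caution in scripts
    apt install foo
    # scripts and automation: apt-get is the interface that is kept stable
    apt-get update
    apt-get -y install foo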


I thought DEBIAN_FRONTEND=noninteractive was supposed to avoid that?


That and a couple of -o's, like conf old (not sure, on my phone)
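
From memory (double-check the exact spellings against dpkg(1) and apt.conf(5)), the usual incantation is something like:

    # --force-confdef: take the maintainer's default action when one exists
    # --force-confold: otherwise keep the locally modified conffile
    # the noninteractive frontend stops debconf from prompting at all
    DEBIAN_FRONTEND=noninteractive apt-get -y \
        -o Dpkg::Options::="--force-confdef" \
        -o Dpkg::Options::="--force-confold" \
        dist-upgrade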


> wifi for servers

I urge you to reconsider. Potential customers and applicants will immediately go all in and the "but this doesn't make any sense" step will never happen. We'll be reading blog posts like "I'm hooked on PCP (the Post-Cabling Paradigm)" from Google SREs with comments praising the downfall of "Big Ethernet" while the emissions from datacenters vaporize flocks of migrating birds.


Yes! The servers will look lit without those ugly cables. Zero-downtime cross-rack migrations will become possible.


Cannot tell you how many uncomfortable conversations I have had trying to explain that you still need to run cables for power to wireless cameras (the outdoor cams work off of solar quite well, though).


They're wireless, not wirefree.


None of the wireless cameras at my house have power cables. They run for months on a charge.


Our bullet cams pull a steady 30W as they record 24/7 (factory floor). Our newer 4K and PTZ cams are coming in at nearly 60W.


Same with game cameras.

But resolution + recording time matter.


But - hear me out - if all of your servers did have wifi - and it was usually disabled - but you could enable it, move the server, then disable it - that might be something?

I know having a redundant server is better - but there's something to it.

Also, this reminds me of a post I read a while ago about them moving a server from one building to another without unplugging it or something.

[Found it: https://news.ycombinator.com/item?id=24059243]


We should apply to the next YC batch as cofounders.


Why just Wifi? With ARM chips you can also have LTE in your servers! (Better check with the NSA first - maybe all the existing Intel and AMD SoCs already have some form of wireless comm built in?)


thank you for digging up the link, it was everything I hoped it would be. that one's going in my bookmarks.

obviously a second server and a reverse proxy or something would be less jank, but your idea definitely has merit. it's kind of like using two points of contact while climbing something.



It would take some downtime (or at least a really long extension cord and redundant power supplies for swapping one-by-one) because you'd still need to connect it to power.

The solution is obvious: Qi inductive wireless charge coils on the outside of the server cases.


You swap it over to a desktop UPS that can come along with the server during the physical move. No downtime as long as you have redundant PSUs and you can walk fast enough to beat the battery draining.


Or integrated miniature diesel generators.


VW makes an excellent 2.0L TDI that would be a good fit. It would take up about half a rack. You could integrate a fuel cell (say 5 gallons) so that it can function when it is detached from the fuel delivery fans.

If you don't want to worry about piping diesel around your server farm, we could go with compressed air and use air turbines with generators on them. Clean and efficient energy transmission without wires!


Does it automatically detect when you are running a benchmark and switch to a special benchmark mode?


That’s what the gearbox is for! It will automatically kick into high gear


The idea of having a minuscule diesel generator in the server chassis is almost worth trying right now. Still have fond memories of our miniature steam engine, but I figure that would be a bit impractical.


Or nuclear reactor


what I wouldn't give for a 1U RTG...


Wireless fiber is the way of the future


Put a micro UPS in the server and just haul it around campus still online


> I'm guessing you can't just make an air-submarine.

I hadn't realized I wanted this until I found out I couldn't have it :(


programmers ("developers," if you prefer) have trouble with "second order" thinking. we integrate X technology in Y way, maybe with some Z optimization, and that'll solve the problem.

okay, but, like... will it?

is there new maintenance stuff you've completely ignored? (I've noticed this is more common when maintenance is someone else's job.) is it completely new and none of us know about it so we get blindsided unless everything goes exactly right every time? do we get visibility into what it's doing? can we fix it when (not if, when) it breaks? can everyone work on it or is it impossible for anyone but the person who set it up? they're good at thinking up things that should fix the problem but less good at things that will.

I'm a fan of cross-functional feature teams because others in the software engineering ecosystem like QA, systems people, ops, etc. tend not to have this problem. programmers are accountable to the other stakeholders up front, bad ideas are handled in real time, and - this is the most important part - everyone learns. (I won't say all systems people are cantankerous bastards... but the mistakes they've harangued me for are usually the mistakes I don't make twice.)

