Hacker News | speed_spread's comments

If you run it on someone else's computer it becomes Lemon Stealing

How long till Canada wires up goose brains and straps them with bombs for the ultimate biodrones? They already swarm naturally in attack formation!

We already have rocket powered autonomous drones capable of destroying fighter planes, they're called missiles.

Ooh, that's super interesting. I assume you shared the recipe with the IRIX community? I remember keeping Netscape up to date on my Indy was already a struggle in 2002.

The biggest selling point of Java is that you can easily find programmers that know it. They will need some training to do HFT style code but you'll still pay them less than C++ prima donnas and they'll churn out reasonably robust code in good time.

In theory the water stays clean and can be reused. But I assume these cheapskates will go for evaporative cooling every time? Then yeah, we need laws against that.


I guess when you're dissipating upwards of a gigawatt of power at a single site boiling water starts to look attractive. It's a pretty impressive curveball; I definitely would never have predicted "an evil corporation boils off all the local drinking water" to be a legitimate concern. I'm pretty sure that's too absurd a plot point for even a children's movie.


> an evil corporation boils off all the local drinking water

Nestle jumps into my mind whenever I want to think of an evil corporation and water together.


I keep hearing people claiming that water is just as much of an issue as energy for operating these DCs, but that just doesn't make any sense to me. However, I haven't had to step inside a DC for almost two decades.


Continuously dissipating 1 gigawatt of energy by boiling room temperature water would require approximately 1.38 million liters of water per hour.
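For what it's worth, that figure checks out on the back of an envelope. A quick sketch, using textbook constants for water's specific heat and latent heat of vaporization (the exact result shifts slightly with the assumed feed temperature):

```python
# Sanity check: water needed to carry away 1 GW by evaporation,
# assuming room-temperature (20 C) feed water heated to boiling.
C_P = 4186        # specific heat of water, J/(kg*K)
L_VAP = 2.257e6   # latent heat of vaporization at 100 C, J/kg
DT = 80           # heating from 20 C to 100 C, in K
P = 1e9           # heat load, W

energy_per_kg = C_P * DT + L_VAP      # J to boil off 1 kg
kg_per_s = P / energy_per_kg
liters_per_hour = kg_per_s * 3600     # 1 kg of water ~ 1 liter

print(f"{liters_per_hour / 1e6:.2f} million liters per hour")
```

This lands at roughly 1.4 million liters per hour, in line with the number above.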

Seems like the environmentally responsible thing to do would be to build the datacenter near the coast and use the waste heat to desalinate water. Or at least dissipate the heat into the ocean rather than boiling off an inland freshwater supply.


And kill the local aquatic life as you raise the temp beyond their happy place?


Setting aside a small patch of ocean for the task seems like a much better plan than the current practice. Provided you dump it in a place with a decent current any adversely affected area should be exceedingly small.

Keep in mind that the sun is constantly dumping energy on us. Absorption averaged across the entire Earth is ~200 W/m^2. Assuming I didn't misplace some zeros somewhere, a gigawatt corresponds to ~5 km^2 of ocean surface at that average flux. Penetration falls off exponentially, so ~75% of it only ever makes it ~10 m down.

I think the takeaway here is the utterly incomprehensible scale of the ocean.
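The zeros above are easy to double-check. A minimal sketch, assuming the ~200 W/m^2 average absorbed solar flux from the comment:

```python
# Ocean area whose average absorbed solar flux equals a 1 GW heat load.
SOLAR_ABSORBED = 200   # W/m^2, rough global-average absorption
P_DATACENTER = 1e9     # W, the datacenter's waste heat

area_m2 = P_DATACENTER / SOLAR_ABSORBED
area_km2 = area_m2 / 1e6   # convert m^2 to km^2

print(f"{area_km2:.0f} km^2")   # -> 5 km^2
```

So no misplaced zeros: a gigawatt really is about 5 km^2 worth of sunshine.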


Run it through a turbine and generate electricity to power the datacenter - infinite energy and infinite AI unlocked.


This idea is probably more worthwhile in Middle Eastern countries, given that 90% of their water comes from desalination plants. But given the recent war in the region, I don't really expect datacenters to be built there for quite a long time.


Yeah, they go evaporative because it's cheaper. That's an easy fix: just make it more expensive.

People talk about the water usage like it's an intrinsic feature of datacenters; it's not. You just make it more expensive so they're forced to conserve. But you wait until you have to, so you don't push them to build elsewhere.


You left out overthrowing governments with customized targeted propaganda, jamming citizen discussion with noise, artificially creating and nourishing contrarian cells in democratic societies. The machines will now be programming people.


But is a lawnmower a predator to the grass?


A joke that got a lot of us high paying jobs later on.


Cool kids had C64s. I had every other boring, flawed model: Tandy MC-10, TI-99, ZX80 (not even an 81!), and some other CoCo with chiclet keys. Now I know the 6809 is actually pretty interesting, but back then, without video or graphics chips, there wasn't much you could do as a 12 year old.

Weirdly the most fun I had was with the BASIC programmable SHARP PC-xxxx line. I still have my PC-1350 somewhere.


That 6809 bewitched my middle school self. Having already learnt Z80 assembly language, the 6809 just looked so much more elegant. It had index registers that were actually useful! It had position independent code! It could do multiplication in one instruction! So when faced with choosing between a CoCo and a C64... of course I chose the machine with the MUL instruction. Naturally, within mere months, that horrid 32x16 black-on-green display forced the harsh realization that a computer is more than just the CPU, and that the support chips could actually be far more interesting. Who cares about a multiply instruction when you could have sprites and 3-voice sound?


My worst hardware choice (later) was settling for a monochrome VGA screen to afford a 24-pin Fujitsu dot matrix instead of the 9-pin Epson. It forged the person I am today.

