3 years from now: Was this written by a reputable AI? Yes? Great - let's publish now. No? Sorry - we don't have time to do QA on this - please resubmit with an AI-edited version.
# The Belt.
def belt():
    # return itertools.cycle([1, 2, 3, 4, 5])
    x = 1
    while True:
        yield x
        x += 1
        if x > 5:
            x = 1

# The +1
def add1(iterable):
    for x in iterable:
        yield x + 1

# The +y
def add(y):
    def _(iterable):
        for x in iterable:
            yield x + y
    return _

# The Oddifier
def oddonly(iterable):
    for x in iterable:
        if x % 2 == 1:
            yield x

# The Reaper
def partition(n):
    assert n > 0
    def _(iterable):
        batch = []
        for x in iterable:
            batch.append(x)
            if len(batch) == n:
                yield batch
                batch = []
        if batch:
            yield batch
    return _

# The Composer
def compose(iterable, t1, t2):
    yield from t2(t1(iterable))

# The Plumber
def pipe(iterable, p):
    for xform in p:
        iterable = xform(iterable)
    yield from iterable

# The Press.
def printall(iterator):
    import time
    for x in iterator:
        print(x, end=", ", flush=True)
        time.sleep(0.3)

# printall(pipe(belt(), [add(1)]))  # 2, 3, 4, 5, 6, 2, 3, 4...
# printall(pipe(belt(), [add(1), oddonly]))  # 3, 5, 3, 5, ...
printall(pipe(belt(), [add(1), oddonly, partition(3)]))  # [3, 5, 3], [5, 3, 5], ...
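The same plumbing can also be built almost entirely from the standard library. This is a sketch, not part of the original snippet: it uses the `itertools.cycle` call the commented-out line in `belt` hints at, folds the transforms with `functools.reduce`, and slices the infinite stream with `itertools.islice` instead of printing forever.

```python
import functools
import itertools

def belt():
    # The infinite 1..5 conveyor, via itertools instead of a hand-rolled loop.
    return itertools.cycle([1, 2, 3, 4, 5])

def pipe(iterable, transforms):
    # Fold each transform over the stream, left to right.
    return functools.reduce(lambda it, xform: xform(it), transforms, iterable)

def add(y):
    def _(iterable):
        return (x + y for x in iterable)
    return _

def oddonly(iterable):
    return (x for x in iterable if x % 2 == 1)

# Take a finite slice of the infinite stream.
out = list(itertools.islice(pipe(belt(), [add(1), oddonly]), 4))
print(out)  # [3, 5, 3, 5]
```

Since everything stays lazy, nothing is computed until `islice` pulls values through the chain.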
The human brain consumes about 20% of the body's calories - a balance of brain vs. brawn that has proven effective. I'd argue we'd be much better off with radically more energy (as a percentage of global energy) going to data centers. I'd also argue that this equilibrium will find itself.
Planetary economies are just a different kind of organism, one that at this point is a cyborg.
Thinking about the ideal energy balance devoted to planetary cognition vs planetary kinetics seems like a fascinating way to model the world. The main thing that makes humans so dominant is that we took the risk of devoting more of our energy towards cognition and as a result discovered magical ways to leverage and exponentially increase our kinetic power (e.g. bow and arrow).
What happens when we start devoting more resources towards cognition at a planetary scale?
It's an interesting possibility for sure, but to me these concepts are not linked. With the recent LK-99 craze, I learned that the theoretical optimal efficiency for computing is many, many orders of magnitude higher than today's. So: chips theoretically can get much more efficient. If we find a 1000x more efficient computer, do you still think we need to throw the same 20% of our resources at it? What would we let those 1000x more capable computers do? The question we need to ask is: what can we do with computing, what would it give us, and how much energy would that cost?
I don't want to sound like "384kb is enough for everybody" but saying there's a fixed percentage of energy that should go towards computing is weird to me.
There's not a fixed percentage. There's an optimal balance that likely changes depending on the environment.
But you do sound like you're saying "384kb is enough for everybody". The reason to devote more resources to cognition is precisely that we can't imagine the possibilities that exist with more clever thinking applied to our limited resources. In the same way, an ape with 10% of its energy allocated towards cognition (guessing) can't even begin to imagine the magic that gets unlocked by its descendants that gambled on 20%. Hell, apes can look at us now and still can't understand us.
In this conversation, you're the ape who's blindly suggesting there's little worthwhile in expending resources on global computation, and I'm the ape who's blindly suggesting there probably is. Neither of us can honestly predict what might happen, good or bad.
Long-running processes have their uses, but they are not the only tool, nor the end of the evolution. "Serverless lambda" is one example of how moving back to one-off processes provides a different kind of value.
I did know of those; I'm just assuming it's going to cause some sort of problem with network calls, sockets, and whatnot (they mention this specifically as part of the caveats).
Replying to myself. I guess I read that a bit too quickly:
> In Node.js, the answer involves a WebSocket to TCP socket proxy, Asyncify, and patching deep PHP internals like php_select. It's complex, but there's a reward. The Node.js-targeted PHP build can request web APIs, install composer packages, and even connect to a MySQL server.
- "Vital" and "crucial" both indicate a required element.
- "Key" and "important" both indicate notable elements.
- "Key" can also be used to indicate a required element, i.e. "keystone", however it is not always used in this way.
- "Key" can also be used to indicate "there is only one" though it is not always used in this way.
When "key" is meant as "singular & required & notable" it probably carries the most weight of all these words. However, since it is not always used in that way, I personally tend to give it less weight.
In casual use I would order them like this (strongest to weakest): crucial, vital, key, important.
In formal use I would order them like this (strongest to weakest): key, vital, crucial, important.