pdmccormick's comments | Hacker News

"Become close friends with the software you use all the time." That is a beautifully evocative phrase for a lovely idea, thank you for sharing.

If I may share an idea for which I don't have nearly as nice or succinct a summation: I've come to view my personal computing environments through the lens of a garden. I spend so much time within them, working, learning, playing and writing. I can see the different seasons of my life reflected in naming conventions, directory structures, scripts I've written and bookmarks I've long ignored. There are new things I want to try and explore in the spring, when I hopefully have a bit more free time. I have planted seeds while children slept in my arms or in the next room, and I have enabled their dreams with the fruits of my labour. I would even say I have occasionally communed with the close and holy darkness on long, late nights.

In time everything I have created will return to dust, and probably no one will ever know this garden as I have. But it has still been a place of growth and blessing.


That is a very nice metaphor indeed!


Conceptually, can you break your processing up into a more or less "pure" functional core, surrounded by gooey, imperative, state-dependent input-loading and output-effecting stages? For each processing stage, implement functions of well-defined inputs and outputs, with any global side effects clearly stated (e.g. updating a customer record, sending an email). Then factor all the imperative-ish querying (that is to say, anything dependent on external state, such as what is stored in a database) into the earlier phases, recognizing that some of the querying is going to be data-dependent ("if customer type X, fetch the limits for type X accounts"). The output of these phases should be a sequence of intermediate records that contain all the necessary data to drive the subsequent ones.

Whenever there is an action decision point ("we will be sending an email to this customer"), instead of actually performing that step right then and there, emit a kind of deferred-intent action data object, e.g. "OverageEmailData(customerID, email, name, usage, limits)". Finally, the later phases are also highly imperative, and actually perform the intended actions that have global visibility and mutate state in durable data stores.
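A minimal sketch of that shape, with made-up record and function names (Customer, plan_overage_emails, etc. are illustrative assumptions, not anything from a real system):

```python
from dataclasses import dataclass

@dataclass
class Customer:
    id: int
    email: str
    usage: int
    limit: int

@dataclass
class OverageEmailData:
    """Deferred-intent action object: data describing the email to send,
    rather than the side effect itself."""
    customer_id: int
    email: str
    usage: int
    limit: int

def plan_overage_emails(customers):
    """Pure core: decide which actions to take from already-loaded data.

    No database access, no network calls, no mutation -- it just turns
    inputs into a list of deferred-intent action objects.
    """
    return [
        OverageEmailData(c.id, c.email, c.usage, c.limit)
        for c in customers
        if c.usage > c.limit
    ]

def perform_emails(actions):
    """Imperative edge: actually perform the side effects."""
    for a in actions:
        print(f"would send overage email to customer {a.customer_id} ({a.usage}/{a.limit})")

# Imperative edge: loading inputs (hard-coded here in place of real queries).
customers = [
    Customer(1, "a@example.com", usage=120, limit=100),
    Customer(2, "b@example.com", usage=50, limit=100),
]
perform_emails(plan_overage_emails(customers))
```

Because the middle phase is pure, it can be unit-tested exhaustively without any database or mail server in sight.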

You will need to consider some transactional semantics, such as: what if the customer records change during the course of running this process? Or, what if my process fails halfway through sending customer emails? It is helpful if your queries can be point-in-time based, as in "query customer usage as of the start time for this overall process". That way you can update your process, re-run it with the same inputs as of the last time you ran it, and see what your updates changed in terms of the output.
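One way to sketch an "as-of" query, assuming (purely for illustration) an append-only history of usage readings:

```python
from datetime import datetime

# Hypothetical append-only history table; the schema is an assumption
# for the example, not any real system's.
usage_history = [
    {"customer_id": 1, "recorded_at": datetime(2024, 1, 1), "usage": 80},
    {"customer_id": 1, "recorded_at": datetime(2024, 1, 15), "usage": 130},
]

def usage_as_of(history, customer_id, as_of):
    """Return the latest usage row for a customer at or before `as_of`.

    Because the answer depends only on `as_of`, re-running the process
    with the same timestamp reproduces the same inputs.
    """
    rows = [
        r for r in history
        if r["customer_id"] == customer_id and r["recorded_at"] <= as_of
    ]
    return max(rows, key=lambda r: r["recorded_at"], default=None)

run_started = datetime(2024, 1, 10)
row = usage_as_of(usage_history, 1, as_of=run_started)
print(row["usage"])  # the Jan 1 reading, even though a newer row exists
```

Pinning every query to the run's start time is what makes a re-run reproducible rather than racing against live writes.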

If those initial querying phases take a long time to run because they are computationally or database-query heavy, then during your development, run them once and dump the intermediate output records. Then you can reload them to use as inputs into an isolated later phase of the processing. Or you can manually filter those intermediates down to a more useful representative set (e.g. a small number of customers of each type).
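For example, dumping intermediates to a JSON file and reloading a filtered sample (file name and record shape are made up for the sketch):

```python
import json
from pathlib import Path

# Intermediate records produced by the expensive query phase.
intermediates = [
    {"customer_id": 1, "type": "pro", "usage": 120, "limit": 100},
    {"customer_id": 2, "type": "free", "usage": 10, "limit": 25},
]

# Dump once, after the slow phase has run.
dump_path = Path("intermediates.json")
dump_path.write_text(json.dumps(intermediates))

# Later (or in a separate script): reload, and filter down to a small
# representative set -- here, one customer per type.
records = json.loads(dump_path.read_text())
by_type = {}
for r in records:
    by_type.setdefault(r["type"], r)
sample = list(by_type.values())
print(len(sample))  # one record per customer type
```

Now the later phases can be developed and debugged against `sample` in seconds instead of re-running the whole pipeline.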

Also, it's really helpful to track the stateful processing of the action steps (e.g. for an email, track state as Queued, Sending, Success, Fail). If you have a bug that only bites during a later step in the processing, you can fix it and resume from where you left off (or only re-run the affected failed actions). Also, by tracking the globally affecting actions you can take the results of previous runs into account during subsequent ones ("if we sent an overage email to this customer within the past 7 days, skip sending another one for now"). You now have a log of the stateful effects of your processing, which you can also query ("how many overage emails have been sent, and what numbers did they include?").
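A toy version of that state tracking, with an in-memory log standing in for what would really be a durable store (the state names and log shape are illustrative assumptions):

```python
QUEUED, SENDING, SUCCESS, FAIL = "queued", "sending", "success", "fail"

# Action log from a previous, partially failed run.
action_log = [
    {"action_id": 1, "customer_id": 1, "state": SUCCESS},
    {"action_id": 2, "customer_id": 2, "state": FAIL},
    {"action_id": 3, "customer_id": 3, "state": QUEUED},
]

def actions_to_run(log):
    """On resume, pick up only actions that never succeeded."""
    return [a for a in log if a["state"] in (QUEUED, FAIL)]

def run(log, send):
    for a in actions_to_run(log):
        a["state"] = SENDING
        try:
            send(a)          # the real side effect goes here
            a["state"] = SUCCESS
        except Exception:
            a["state"] = FAIL  # left in the log for the next resume

run(action_log, send=lambda a: None)  # stand-in for the real email send
print(sum(a["state"] == SUCCESS for a in action_log))  # 3
```

The same log doubles as the queryable audit trail ("how many overage emails have we sent?") and as the input to dedup rules like the 7-day skip above.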

Good luck! Don't go overboard with functional purity, but just remember, state mutations now can usually be turned into data that can be applied later.


Bravo lol


"Eldritch vectors" is a perfect descriptor, thank you.


That seems like a lot of money. How quickly can sustainable capacity be built up in terms of building power plants, data center construction, silicon design and fabrication, etc.? Are these industries about to experience stratospheric growth, followed by a massive and painful adjustment, or does this represent a printing press or industrial revolution like inflection point?

Would anyone like to found a startup doing high-security embedded systems infrastructure? Peter at my username dot com if you’d like to connect.


Almost nothing in tech is sustainable outside of gold recycling.


HTTPSSH.

Why not just SSH/QUIC, what does the HTTP/3 layer add that QUIC doesn’t already have?


QuickShell - it should be called


Quicshell*


QSH?


At least that isn’t an existing ham radio Q-code!


That's already a project (library for building a desktop environment).


The ability to use HTTP authentication methods, HTTP headers, etc?


easy access to reverse proxies


“It sounds like a bunch of DJ’s dared each other to set their drum machines to BPM=1000”

That has been my favorite line from this for decades (at least that’s how I remember it going).


Was that his gabber jab? His happy hardcore description was quite funny too, but I can't remember it.


Could you be more specific about where you think the brain rot is? I thought the issues regarding bcachefs and Linux kernel development revolved around respecting conventions for code freezes and release candidates. It seemed more about getting along socially than technical objections to the technology.


I love projects like these. They touch upon so many low level aspects of Unix userlands. I appreciate how systemd ventured beyond classical SysV and POSIX, and explored how Linux kernel specific functionality could be put to good use. But I also hope that it is not the last word, and that new ideas and innovations in this space can be further explored.

Recently I implemented a manufacturing-time device provisioning process that consisted of a Linux kernel (with the efistub), netbooted directly from the UEFI firmware, with a compiled-in bundled initramfs with a single init binary written in Go as the entire userland. It's very freeing when the entire operating environment consists of code you import from packages and directly program in your high level language of choice, as opposed to interacting with the system through subprocesses and myriad whacky and wonderfully different text configuration files.


I wonder how many more orders of magnitude of precision will be realistically possible. I wonder if we'd ever be able to use gravity to "see" things at non-cosmological scales, like if you could resolve the gravitational waves and interference patterns caused by a person walking by.

