> it took until 2005 or so for Ericsson to confess that they had made a mistake
Impressive that someone was able to make that call and accept the situation after investing half a decade in moving to Java. It also says something about the staying power of Erlang and its paradigm that the company was able to re-adopt it.
Well, the IT bubble had burst and Sun was basically two thirds down the sewer at the time. Re-adopting something you had built and proven in the early days of cell phones probably looked like very reasonable risk management.
I'm surprised more people haven't pointed to this simpler explanation. It's not about providing a faster, better service to customers and citizens. It's about using their privileged position as the dominant players in the market and in government to secure their power and control by taking over a competing paradigm (cryptocurrency) that threatens their dominance. In the process they will neuter the concept to fit the existing power structure.
Yup. As someone who generally believes in a lot of cryptocurrency ideals while acknowledging the vast amounts of mess: right now, "widespread adoption of Bitcoin, an inferior technology, by banks and other established players" is the precise playbook for destroying what crypto ideally stands for.
In the README they have example usage ranging from 8-bit emulators, games, and command-line tools to integration with Dear ImGui and other libraries, plus bindings to languages like Zig, Rust, Odin, and Nim.
> WebAssembly is a 'first-class citizen', one important motivation for the Sokol headers is to provide a collection of cross-platform APIs with a minimal footprint on the web platform while still being useful.
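For a sense of how small the API surface is, here's a rough sketch of the sokol_app.h callback model. This is written from memory rather than copied from the README, so field names may differ slightly from the current header:

```c
// Minimal sokol_app skeleton: the library owns the main loop and calls
// back into these functions. One translation unit must also define
// SOKOL_IMPL plus a backend (e.g. SOKOL_GLCORE33 on desktop GL).
#include "sokol_app.h"

static void init(void)    { /* one-time setup, e.g. sokol-gfx init */ }
static void frame(void)   { /* called once per display frame */ }
static void cleanup(void) { /* teardown before exit */ }

// sokol_app provides main(); the application just describes itself.
sapp_desc sokol_main(int argc, char* argv[]) {
    (void)argc; (void)argv;
    return (sapp_desc){
        .init_cb = init,
        .frame_cb = frame,
        .cleanup_cb = cleanup,
        .width = 640,
        .height = 480,
        .window_title = "sokol-app sketch",
    };
}
```

The same callback-style source is what gets compiled to WebAssembly via Emscripten, which is where the "minimal footprint on the web platform" point comes in.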
The next logical step, perhaps ethically questionable, seems to be growing human brains for computational purposes (parallel or quantum) with high bandwidth and very efficient power consumption.
I don't know where I got such a dark sense of humor. I find it deeply troubling that scientists are growing "organoids", little human brains, for computational purposes. Headline from 2023:
> Computer chip with built-in human brain tissue gets military funding
The project, called DishBrain, was spun out into the startup Cortical Labs.
> World's first 'body in a box' biological computer uses human brain cells with silicon-based computing
> Cortical Labs said the CL1 will be available from June, priced at around $35,000.
> The use of human neurons in computing raises questions about the future of AI development. Biological computers like the CL1 could provide advantages over conventional AI models, particularly in terms of learning efficiency and energy consumption.
> Ethical concerns also arise from the use of human-derived brain cells in technology. While the neurons used in the CL1 are lab-grown and lack consciousness, further advancements in the field may require guidelines to address moral and regulatory issues.
Unfortunately, we are much farther from growing a human brain than we are from scaling up an LLM into a 29-megawatt behemoth.
With growing a brain, we barely know where to begin. I don't mean growing a few neurons in a petri dish; nourishing the complex interconnected structure of neurons that is a human brain is nowhere even on the horizon, much less growing that structure from cells in the first place. At least with LLM/AI techniques we have control over the entire processing pipeline.
You are confusing organoids with "growing a brain". Organoids are a handful of cells of a given type, derived from pluripotent stem cells and grown together. A neural organoid is nothing at all like a brain -- not even a brain slice. It is a loose collection of cells with just enough context to somewhat behave natively, or at least not croak immediately (which is what most individual stem cells do when they differentiate in a petri dish).
It's like calling a 1-bit half-adder circuit a computer.
Organoids are very interesting scientifically, because we will need to start with organoids to grow any sort of biological system. And since they behave closer to native tissue than individual cells do, they can be used to research things like cell metabolism and drug response. But they are nowhere close to an organ. And unfortunately they aren't even close enough to replace animal testing, yet.
Welcome to the flight, this is your captain speaking. Just want to let you know our entire flight system was vibe coded to the strict standards you expect from our industry, iterated and refined in a virtual environment over twenty virtual-years, with no fallible human eyes reviewing it - even if it were possible to review the mountain of impenetrable machine-generated code. The pilot will be controlling the plane via a cutting-edge LLM interface, prompt-engineering our way to our overseas destination. Relax, get comfortable, and pray to the collective intelligence distilled from Reddit posts.
I fell off your train of thought about halfway through, but I agree with the main point: there's way too much unnecessary churn in the web dev world, and 90% is about right. Just busy work: changing APIs, forcing new and untested paradigms onto library users, major version upgrades that expect everyone to rewrite their code...
Intentionally or unconsciously, much of the work is about ensuring there will always be demand for more work; otherwise the whole thing risks naturally winding down over time. Why would you build it that way?!