> A spec is an envelope that contains all programs that comply. Creating this spec is often going to be harder than writing a single compliant program.
This perfectly explains the feeling I had when, 20 years into my career, I had to start writing specs. I could never quite put my finger on why it was harder than coding. My greater familiarity with coding didn't seem a sufficient explanation.
When writing a line of spec, I had to consider how it might be interpreted in the context of different implementations of other lines of spec - combinatorial nightmare.
But code is just a spec as far as, say, a C compiler is concerned. The compiler is free to implement the assembly however it likes. Writing that spec is definitely easier than writing the assembly (Fred Brooks said this, so it must be true).
So why the difference?
C has a simpler mapping to assembly than most languages, so you are doing most of the high level translation when writing C. But even C compilers have considerable scope for weirdness, hence projects like CompCert.
But much of the code we run today is JIT executed, and that leaves ample room for exploitation via weird machines, e.g. the TOCTOU in the Corina exploit.
Even at this very low level, full-coverage specs require years of careful formal methods work. We have no hope of doing that for vibe coding; everything will be iterative, and if TDD helps then good, but specs are by no means easier than code.
> But code is just a spec as far as, say, a C compiler is concerned. The compiler is free to implement the assembly however it likes.
Not at all. Code is formal, and going from C to assembly is deterministic. While the rules may be complex, they are still rules, and the compiler can't stray from them.
Writing C code is easier than writing assembly because it’s an easier notation for thinking. It’s like when hammering a nail. Doing it with a rock is hard. But using a hammer is better. You’re still hitting the nail with a hard surface, but the handle, which is more suitable for a human hand, makes the process easier.
So programming languages are not about machine performance, they are about easier reasoning. And good programmers can get you to a level above that with proper abstraction and architecture, and give you concepts that directly map to the problem space (GUI frameworks instead of directly rendering to a framebuffer, OS syscalls instead of messing with hardware, …).
> Code is formal, and going from C to assembly is deterministic.
OK, this is the main thing. Going from C to assembly is not deterministic in a sense because different compilers can produce different output. But the behaviour of the generated assembly is always the same. This isn't true of a spec.
Why is this a weird definition of determinism? Could you please define what you mean when you say deterministic?
A C program does not identify a single assembly program. It identifies a set of assembly programs. This fits the pretty standard definition of non-determinism.
A difference between natural language and C code is that natural language does not have a formal semantics. Having no formal semantics is a very different problem from having a semantics that admits a well-defined set of interpretations.
Determinism implies that the same input will result in the same output.
I agree with you that for a single C program there’s a set of assembly code that satisfies it. But by choosing a compiler, an architecture and a set of flags,… you will always get the same assembly code. If you decide to randomize them, then you can no longer guarantee a specific result, but you can still guarantee sets of result. Which is the definition of non-determinism.
Formalism is orthogonal, as it's about having well-defined sets and transformations. LLMs are formal because it's a finite set of weights and tokens and the operations are well defined. But the prompt -> tokens -> tokens -> code transformation is non-deterministic in most tools (Claude, ChatGPT). And the relation between the input and the output is a mathematical one, not a semantic one.
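The set-vs-element view of compilation above can be sketched in a few lines of Python. Everything here is made up for illustration: the compiler names, flags, and assembly strings are toy stand-ins, not real compiler output.

```python
import random

# A "C program" identifies a *set* of conformant assembly outputs.
# Fixing a compiler + flags deterministically picks one member of the set;
# randomizing the choice is non-deterministic but stays within the set.
compiled = {
    ("gcc", "-O0"): "mov eax, edi; add eax, edi; ret",
    ("gcc", "-O2"): "lea eax, [rdi+rdi]; ret",
    ("clang", "-O2"): "add edi, edi; mov eax, edi; ret",
}

def compile_fixed(compiler, flags):
    # Deterministic: same configuration -> same output, every time.
    return compiled[(compiler, flags)]

def compile_random():
    # Non-deterministic over the set: output varies, but never leaves the set.
    return random.choice(list(compiled.values()))

assert compile_fixed("gcc", "-O2") == compile_fixed("gcc", "-O2")
assert compile_random() in compiled.values()
```

All three toy outputs compute the same function of their input, which is the point: the behaviour is fixed even though the text is not.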
I see. Then we’re on the same page. My follow up question is: why do we care if the LLM is deterministic?
Hypothetically, if we could guarantee a semantic relationship between the input and output we wouldn't care if the LLM was deterministic. For instance, if I give the LLM a Lean theorem and it instantiates a program and a mechanical proof that the program conforms to the Lean theorem, I just don't care about determinism. Edit: this is equivalent to me not caring very much about which particular conformant C compiler I pick
And my understanding of LLMs is that they actually are functions and the observed randomness is an artifact of how we use them. If you had the weights and the hardware, you could run the frontier models deterministically. But I don’t think you’d be satisfied even if you could do that. Edit: this is maybe analogous to picking a particular C compiler that does not promise conformance
There are valid concerns with LLMs but I’m not convinced non-determinism is the thing we should care about.
Non-determinism is not what people really care about.
If you remove randomness (temperature) from an LLM, it's going to be deterministic. But the relation between inputs and outputs is inscrutable (too many parameters) and there's no practical way to prove the relation between a certain prompt and the output unless you run it.
Then you add randomness on top of that and the whole thing is a chaotic mess. Due to being formal, I believe generated code has a high probability of being correct (syntax), and generic patterns can be replicated easily. But the higher-level concerns (the domain) and nebulous concepts like maintainability, security, ... are harder to replicate. Also, correctness (logic) is hard to prove as you're unfamiliar with the code.
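The temperature point can be illustrated with a toy decoder. The logits here are invented for the example, not taken from any real model: greedy argmax is a pure function of its input, while temperature sampling only guarantees the output lands somewhere in the vocabulary.

```python
import math
import random

# Hypothetical next-token scores for a single decoding step.
logits = {"cat": 2.0, "dog": 1.5, "fish": 0.1}

def greedy(logits):
    # Temperature removed: always the highest-scoring token. Deterministic.
    return max(logits, key=logits.get)

def sample(logits, temperature=1.0):
    # Softmax over logits / temperature, then a random draw. Non-deterministic:
    # repeated calls can return different tokens from the same input.
    weights = {tok: math.exp(score / temperature) for tok, score in logits.items()}
    total = sum(weights.values())
    r = random.random() * total
    for token, w in weights.items():
        r -= w
        if r <= 0:
            return token
    return token  # float rounding fallback

assert greedy(logits) == "cat"   # same input, same output, every run
assert sample(logits) in logits  # varies run to run, but stays in the set
```

Even the deterministic greedy path doesn't help with the inscrutability complaint above: you still can't predict the output without running it.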
So do you disagree with the parent that “code is just a spec as far as the C compiler is concerned”?
Maybe it’s important to agree on what a spec is. For instance, do you agree that a spec can be just as formal as code implementing the spec? For instance, if the spec is written in a machine parseable logic do you take that to be a spec? Or are you taking a spec to be written in natural language?
I suspect some of your disagreement with the parent is about this definition. I’m just trying to understand because for me it’s uncontroversial to claim C code is a spec for assembly code, but maybe the issue is that I am not thinking of specs in the colloquial way.
The abstract begins, "Growing evidence supports early eating to control appetite and energy balance". What does that mean? My unskilled reading of it is that there is recent evidence that eating breakfast helps with weight loss. But I'm confused because there was a 2019 meta-analysis that found that eating breakfast does NOT help with weight loss. https://www.bmj.com/content/364/bmj.l42
You might want to read the rest of the studies - or at least try not to misrepresent them before commenting. There are elementary differences in the studies.
Study 1: "what to eat" — specific demographics, primary clinical trial, mechanistic/physiological outcome. Conclusion: a high-protein breakfast is superior for suppressing appetite and maintaining satiety, while a high-fiber breakfast promotes better weight loss and a healthier gut microbiome.
Study 2: "whether to eat" — broad demographics, systematic review (meta-analysis), broad clinical outcomes. Conclusion: eating breakfast increases total daily energy intake compared to skipping it, and skipping breakfast resulted in slightly greater weight loss.
I wasn't comparing the two studies. I was just asking what the first sentence of the abstract of study 1 meant when it appears to be a false statement given study 2's result. Usually this is because I don't understand something and someone around here explains what I'm missing.
Even on a specific STM microcontroller (STM32G031), the LLM tools invent non-existent registers and then apologize when I point it out. And conversely, they write code for an entire algorithm (CRC, for example) when hardware support already exists on the chip.
Think of the answer to "What opcode has the value 0x3c on the Intel 8048?" as a PNG image, but the LLM as storing a very compressed JPEG of it. It will only give a very approximate answer. But you can give it explicit tools to look things up.
In the UK the wholesale price was about £80/MWh in 2025. The retail price was about £270/MWh + a standing charge. If you factor in the standing charge, an average user paid about £344/MWh. So the cost of generation was only about 23% of the retail price. I believe the green levies + CfDs accounted for about another 15% of the retail price.
Does this mean that if generation was free, and there were no green policy costs, our electricity would still be expensive?
edit: "Network and Distribution" appears to contribute about 23% of the retail price. I guess green energy increased that cost because wind/solar are more spread out and sometimes off-shore.
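A quick sanity check of the figures quoted above, taking the comment's 2025 numbers at face value (they are rough estimates, not audited data):

```python
# UK electricity figures as stated in the comment, all in £/MWh.
wholesale = 80.0      # wholesale generation price
retail_unit = 270.0   # retail unit rate, excluding standing charge
effective = 344.0     # effective rate for an average user, standing charge included

generation_share = wholesale / effective          # generation's slice of the bill
standing_equiv = effective - retail_unit          # standing charge, per-MWh equivalent

print(f"generation share: {generation_share:.0%}")
print(f"standing charge ~ £{standing_equiv:.0f}/MWh for an average user")
```

This reproduces the ~23% generation share claimed, and implies the standing charge adds roughly £74/MWh for the average user.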
We're seeing this galaxy as it was 280 million years after the Big Bang. But the universe didn't become transparent to photons until 100 million years after that (https://en.wikipedia.org/wiki/Recombination_(cosmology)). So that's impossible. Who's wrong, Recombination theory or this paper?
Yep, I think it is. The point is there's almost no history of oral peptides, other than stomachs destroying them.
FTA: "So to summarize the state of the art in oral peptide delivery: there are exactly two FDA-approved products that use permeation enhancers to get peptides into your bloodstream through your GI tract. Both achieve sub-1% bioavailability. Both required over a decade of development, thousands of clinical trial participants, and hundreds of millions of dollars."
Would a sublingual dose be possible/more effective? Research in other (um, yeah, medicinal!) compounds shows that it can be an effective pathway to the bloodstream rather than trying to survive the digestive system.
Sublingual is even harder. The sublingual mucosa is thin but selective. It strongly favors molecules that are small, lipophilic and uncharged. Semaglutide is about 8-10x too big, highly polar and charged.
Injection is really the only method with any substantial bioavailability. BUT, low (<1%) bioavailability does not necessarily mean useless.
If the drug has a relatively low marginal cost of production, and the stomach just breaks down 99% of it without side effects, you can just manufacture 100x more, give it orally, and eat the cost of the 99% that gets lost along the way.
Injectable Semaglutide/Tirzepatide (>99.8% pure) are currently sold at a profit from China for around $2-3/weekly dose. Rybelsus (oral semaglutide) is sold at roughly the same cost per milligram, even though it's made in FDA-approved facilities (you just need to take >= 40x more milligrams per month, bringing it to $1000/month in the USA)
So manufacturing oral doses 100x higher than injectable seems to be economically viable.
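The rough arithmetic behind that claim, using the figures quoted in the comments above (the per-dose cost and the 40x multiplier are the commenter's estimates, not verified numbers):

```python
# Figures as stated above: injectable semaglutide reportedly ~$2-3/weekly
# dose at commodity prices; oral (Rybelsus) reportedly needs >= 40x the
# milligrams for comparable effect due to low bioavailability.
injectable_cost_per_week = 2.5   # USD, midpoint of the quoted $2-3 range
oral_multiplier = 40             # extra milligrams needed for the oral route

# At the same cost per milligram, the oral form costs ~40x more per dose:
oral_cost_per_week = injectable_cost_per_week * oral_multiplier
print(f"oral: ~${oral_cost_per_week:.0f}/week at injectable cost-per-mg")
```

So at commodity manufacturing cost, eating the 99% loss works out to on the order of $100/week of raw material, which is why the economics can still close.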
Anecdotal, but it's really hard for me to do insufflation because of the discomfort. Of course, if my life depended on it I could probably do it, but otherwise I'd rather not.
Thank you, your comment made me aware of this event, which I didn't know about. [1] I have found at least one piece of concrete evidence that your assertion is correct [2]: the Düsseldorf Agreement of March 16, 1939.
> The British historian Martin Gilbert believes that "many non-Jews resented the round-up", his opinion being supported by German witness Dr. Arthur Flehinger who recalls seeing "people crying while watching from behind their curtains". Rolf Dessauer recalls how a neighbor came forward and restored a portrait of Paul Ehrlich that had been "slashed to ribbons" by the Sturmabteilung. "He wanted it to be known that not all Germans supported Kristallnacht."
This passage is particularly eerie IMHO, since I've been reading "I don't condone this" about current world events over and over.
> In 1938, just after Kristallnacht, the psychologist Michael Müller-Claudius interviewed 41 randomly selected Nazi Party members on their attitudes towards racial persecution. Of the interviewed party members, 63% expressed extreme indignation against it, 5% expressed approval, and the remaining 32% were noncommittal.
Also particularly eerie to me. Yet the regime went on.
If you're looking for a source on the landslide, another commenter here posted this, which seems more reliable than Wikipedia. Searching for Kofel's impact, rather than the landslide, brings up nonsense because there's only pseudo-evidence for that.
It dates the landslide to about 9400 years ago (BP), so this article about the star map putting it at 5500 years ago seems to be a colourful fabrication (my bad). The author of the meteor theory apparently even tries to connect it to Sodom and Gomorrah being hit by the passing heat! Lol
Finding reliable info on this "planisphere" tablet isn't easy. As far as I can tell it was untranslated and kept a low profile until this impact story
>> It dates the landslide to about 9400 years ago (BP), so this article about the star map putting it at 5500 years ago seems to be a colourful fabrication (my bad).
Don't feel bad. Genuinely exciting if it were true.
Yeah, it was quite a compelling story, and it's at least a genuinely beautiful and intriguing tablet. The author Hempsell does have some talent though, in seemingly getting a reputable university to publish his book... I'm thinking he was quite canny in finding this attractive untranslated tablet with little else written about it, and then employing enough knowledge about a combination of different subjects (ancient Sumerian, asteroid orbits, Alpine geology) that no single reviewer was able or motivated to properly evaluate all the arguments. Or he just had a friend at the press.
That wouldn't make me happy. If the sharpie on the tape said it was bad, I'd still look at it, sniff it and probably eat it. Certain foods scare me though. eg there's a common claim that boiled rice shouldn't be kept for more than a day and then re-heated. I follow this received wisdom even though it never seems bad and I don't know anyone who got ill from eating re-heated boiled rice. On the other hand, raw chicken does not scare me because I have an uncontrollable revulsion to it when it has actually gone bad. And of course, Camembert isn't worth eating until at least a fortnight after the expiry date.
It doesn't tell you if it's bad, it only tells you how old it is. You get to decide if you want to eat it. It makes the decision process easier and helps you to select the older leftovers that are still good but pushing it, age-wise.
Pounding stone seems reasonable to me. Obviously I don't have any proof or even strong evidence but I saw a video that changed my perception of what is possible. It showed two old men making a millstone with hand tools: https://www.youtube.com/watch?v=lscs5ZgNQrE. The amount of labour involved and quality of the finished item was astonishing to me. Maybe you'll think that the hideous amount of labour needed to make a simple geometric shape makes you even more convinced the Inca has some other way to achieve their even harder task. But it is a fun video anyway.
The video does not counter the parent's argument about measuring fit.
What the masons in the video do is certainly impressive. Cutting organic shapes that fit perfectly together, as if they once were elastic, is another level.
Perhaps they did something similar to what dentists do when building up teeth, so that the added material is not the only contact point when the jaws are closed. That is, a contact sheet that leaves contact marks.
> The video does not counter the parent's argument about measuring fit.
I know. I mainly just wanted to link that video because it is awesome.
The article does explain how the Inca did it - only the front edges are tight fitting. The gaps between the inside surfaces are filled with mortar. They sat the stone where it was to be placed, but with the front edge raised up by resting on some spacers, then just incrementally improved the fit of the edge and re-tried the fit. I'd have still thought that was impossible without seeing something like the video I linked - my intuition of what can be achieved with hammer and chisel was wrong.
Edit: I think that was too strong. I don't have any real knowledge of this subject. The explanation in the article seemed reasonable to me. That is all.
> Perhaps they did something similar to what dentists do when building up teeth, so that the added material is not the only contact point when the jaws are closed. That is, a contact sheet that leaves contact marks.
The article linked in this post mentions the possibility of "red clay" being used for this purpose, as well as being a mortar.