
  > Most coding doesn't need much of any math past boolean logic and very basic set operations
Coding IS math.

Not "coding uses math", I mean it is math.

  Mathematicians do not deal in objects, but in relations among objects; they are free to replace some object by others so long as the relations remain unchanged. Content to them is irrelevant; they are interested in form only.
  - Poincare[0]
I don't know how you code, but I don't think I'm aware of code that can't be reasonably explained as forming relationships between objects. The fact that we can trace a program seems to necessitate this.

[0] https://philosophy.stackexchange.com/questions/22440/what-di...






> Coding IS math.

Programming is an expression of logic, which is absolutely mathematics.

But then we also have to think about naming variables and classes, structuring our code so that it is more readable by other developers, and so on. That's less about formal reasoning and more about communication.

There is an engineering aspect to programming (prototyping, architecture, optimization, etc). It's a combination of mathematics and engineering. Software Engineering.


> Coding IS math.

The murky world of software patents would like a word about that.

For me, coding also feels artistic at times. But we see artistic beauty in mathematics all the time, so no wonder. Often I look at ugly code, and maybe that is subjective, but fixing that code also makes it feel prettier.

It is clear all developers have their own artistic style, and probably that is why there is so much disagreement in the industry. Maybe we are lacking the pure mathematical language to describe our intent in a more beautiful and precise way that is more clearly the-right-way. As in how we find beauty in simple physics equations.


Do you think mathematicians don’t have to think about naming variables and structuring proofs?

The presumably-mathematicians who wrote my Algorithms textbook (CLRS) didn't seem to think it worth giving the variables in their algorithms useful names. They just use i, x, etc. all over the place and don't even consider giving things actual names. This makes the book much less accessible than it would be if they treated it more like most people write code.

  > They just use i, x, etc. all over the place
I do agree with your point btw, but I did want to note that there are good conventions around symbols. The brevity is heavily influenced by the medium. Variable names sucked when you had punch cards. It's still burdensome to write long names when using paper, chalkboard, whiteboard, or any system that doesn't have autocomplete.

In general, lower-case letters are used as constants, excluding x, y, z, t, i, j, k (and sometimes u, v, w). It isn't a hard rule, but there is a strong preference to begin at the beginning of the alphabet for these. Capital letters are usually reserved for things like variable sets (e.g. random variables). Greek letters need context to distinguish constants from variables. BB and Cal typefaces are for sets (e.g. the real numbers, the integers). And much more.
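
For instance, a line like this (made up purely to show the conventions at work) reads instantly to anyone fluent in the notation:

  f : \mathbb{R}^n \to \mathbb{R}, \qquad f(x) = \sum_{i=1}^{n} a_i x_i + c

where a_i and c are constants (start of the alphabet), x is the variable, i is an index, and \mathbb{R} is the BB typeface marking a set.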

I think a lot of the difficulty is that these "rules" or patterns are generally learned through usage and often not explicitly stated. But learning them can really help when reading unfamiliar topics, and it is why "notation abuse" leads to confusion. After all, math is all about abstraction, so technically any symbol will do, but no doubt some are (significantly) better than others for communicating.

  There are two hard things in Computer Science:
    - Cache Invalidation
    - Naming Things
    - Off-by-One Errors

Picking what letters to use for what things can still be a struggle.

Mathematicians do have to deal with difficulties in naming things.


They just invent new characters.

What do you mean? Most letters used in math are just Latin and Greek letters in different fonts or stylizations.

That's how it is in applied mathematics. That's a hardcore computer science textbook. If your job is computer science research, inventing new algorithms, that is the kind of book you will get used to. There are better options for practical learning.

Of course they do, but that part is not "mathematics". It is communication. If they use the English language to write the proof that doesn't mean English is also mathematics.

Writing proofs is just as important to mathematics as writing code is to programming.

Yes it is important but that's not the point.

Structuring sentences and naming variables so that it is easier for other people to understand is less about formal mathematical reasoning, and more about communication.

You could name a variable x, y, or Banana, but it doesn't change the logic.


Neither is it "the point" in programming. You should be concerned with communication and have every right to get upset when someone is being needlessly convoluted but that's as much of a point in programming as it is in math, physics, or any domain.

I mean, the reason we get mad at this is because it is someone destroying "society" in some sense, even if that society is your team or just the set of programmers. It would be a pretty dick move were I to use a word that significantly diverged from its conventional meaning and expect you to mull it over. Similarly if I drop an unknown, out-of-context math equation. It would be meaningless.

And I'm on your side, really! I strongly advocate for documenting. And let's be real, the conclusion of your argument argues more strongly for documentation than for good variable names, because variable names are much more constrained and much more easily misinterpreted, considering any word has multiple definitions. Surrounding code is often insufficient to derive the necessary context.

https://news.ycombinator.com/item?id=43874738


>Programming is an expression of logic, which is absolutely mathematics.

and also philosophy.


As in GNU vs. Microsoft? Or something more foundational in logic?

Classes are a platonic ideal representation of reality?

That's another gap: mathematical classes are an ideal representation of fantasy; programming classes are a leaky representation of reality.

I don't think that quote really supports coding and math being equivalent. To me, the quote provides an abstraction of math through a structuralist perspective. Language can also be viewed through this abstraction. I think coding could share the abstraction, but that doesn't make the three of these fields equivalent.

  > coding and math being equivalent
Please see lambda calculus. I mean equivalent in the way mathematicians do: that we can uniquely map everything from one set to another.
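
To make that concrete, here's a quick sketch (in Scala, purely for illustration): Church numerals from the lambda calculus, where numbers and arithmetic are nothing but functions, so the same term is simultaneously a program and a mathematical object.

  // A Church numeral n is "apply f, n times".
  type Church[A] = (A => A) => A => A

  def zero[A]: Church[A] = f => x => x
  def succ[A](n: Church[A]): Church[A] = f => x => f(n(f)(x))
  def add[A](m: Church[A], n: Church[A]): Church[A] = f => x => m(f)(n(f)(x))

  // Interpret back into ordinary Ints to check that 1 + 2 == 3:
  def toInt(n: Church[Int]): Int = n(_ + 1)(0)
  val three: Int = toInt(add[Int](succ(zero), succ(succ(zero))))  // == 3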

This seems like the setup to a joke about how mathematicians don't know how to communicate to ordinary folks

Well, a lot of people did wildly misunderstand the Poincare quote. To me it is obviously about abstraction, and I think this is true for any mathematician. I thought it would also be natural for programmers, considering we use "object" quite similarly, if not identically. So... maybe it is, or maybe this is the joke.

"There are those who tell us that any choice from among theoretically-equivalent alternatives is merely a question of taste. These are the people who bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans. They are malicious idiots. The only punishment which could stand a chance at reforming these miscreants into decent people would be a year or two at hard labor. And not just any kind of hard labor: specifically, carrying out long division using Roman numerals." — Stanislav Datskovskiy

No one is suggesting programming with lambda calculus. But it would be naïve to think lambda calculus isn't important. They serve different purposes.

We didn't:

  bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans.

> Not "coding uses math", I mean it is math

> I mean equivalent in the way mathematicians do

That sounds like you're backing off from your original claim, probably because it is impossible to defend.

That you can use mathematics to describe code doesn't seem very different from using math to describe gravity, or the projected winner in an election, or how sound waves propagate.

Isn't the primary purpose of math to describe the world around us?

Then it shouldn't be surprising that it can also be used to describe programming.

In the real world, however, software engineering has nothing to do with mathematical abstractions 99% of the time


A programmer constructs a function from some data type to another while a mathematician constructs a function from witnesses of some proposition to another?

Though interpreting a CRUD app as a theorem (or collection of theorems) doesn’t result in an interesting theorem, and interpreting a typical theorem as a program… well, sometimes the result would be a useful program, but often it wouldn’t be.


Interpreting CRUD apps (or fragments of them) as theorems is interesting (given a programming language and culture that doesn't suck)! e.g. if you have a function `A => ZIO[Any,Nothing,B]`, then you have reasonable certainty that, barring catastrophic events like the machine going OOM or encountering a hardware failure (essentially, things that happen outside of the programming model), given an A, you can run some IO operation that will produce a B and will not throw an exception or return an error. If you have an `A => B`, then you know that given an A, you can make a B. Sounds simple enough, but in practice this is extremely useful!

It's not the type of thing that gets mathematicians excited, but from an engineering perspective, such theorems are great. You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
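
As a minimal sketch of what that looks like (this assumes the zio library; the domain types are made up):

  import zio._

  final case class User(name: String)
  final case class Profile(bio: String)

  // Read the signature as a theorem: given a User, here is an IO
  // operation that cannot fail (error type Nothing) and, barring
  // catastrophe outside the programming model, yields a Profile.
  def loadProfile(u: User): ZIO[Any, Nothing, Profile] =
    ZIO.succeed(Profile(s"bio for ${u.name}"))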

It's actually the halting problem that I find is not relevant to practical programming; in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data. The hard parts have been neatly tidied away into databases and operating systems (which for practical purposes, you can usually import as "axioms").


  > It's not the type of thing that gets mathematicians excited
Says who? I've certainly seen mathematicians get excited about these kinds of things. Frequently they study Programming Languages and will talk your ear off about Category Theory.

  > You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
Sounds like math to me. A simple and imprecise math, but still math via Poincare's description.

  > in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data
In common settings. But those settings also change. You may see those uncommon settings as not practical or useful but I'd say that studying those uncommon settings is necessary for them to become practical and useful (presumably with additional benefits that the current paradigm doesn't have).

I think we're in agreement. My comment about the halting problem was meant to refer to Rice's theorem (the name was slipping my mind), which I occasionally see people use to justify the idea that you can't prove interesting facts about real-world programs. In practice, real-world programming involves constantly proving small, useful theorems. Your useful theorem (e.g. `Map[User,Seq[Account]] => Map[User,NetWorth]`) is probably not that interesting to even the category theorists, but that's fine, and there's plenty you can learn from the theorists about how to factor the proof well (e.g. as _.map(_.map(_.balance).sum)).
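
Spelled out as a self-contained sketch (the domain types are the hypothetical ones from above, with the one-liner in its more explicit mapValues form):

  final case class User(id: Long)
  final case class Account(balance: BigDecimal)
  final case class NetWorth(total: BigDecimal)

  // The "theorem" Map[User, Seq[Account]] => Map[User, NetWorth],
  // proved by doing little more than following the types:
  def netWorths(byUser: Map[User, Seq[Account]]): Map[User, NetWorth] =
    byUser.view.mapValues(as => NetWorth(as.map(_.balance).sum)).toMap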

  > Isn't the primary purpose of math to describe the world around us?
No, that's Physics[0]. I joke that "Physics is the subset of mathematics that reflects the observable world." This is also a jab at String Theorists[1].

Physicists use math, but that doesn't mean it is math. It's not the only language at their disposal nor do they use all of math.

  > software engineering has nothing to do with mathematical abstractions 99% of the time
I'd argue that 100% of the time it has to do with mathematical abstractions. Please read the Poincare quote again. Take a moment to digest his meaning. Determine what an "object" means. What he means by "[content] is irrelevant" and why only form matters. I'll give you a lead: a class object isn't the only type of object in programming, nor is a type object. :)

[0] Technically a specific (class of) physics, but the physics that any reasonable reader knows I'm referencing. But hey, I'll be a tad pedantic.

[1] String Theory is untestable, therefore doesn't really reflect the observable world. Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so. But we're getting too meta and this joke is rarely enjoyed outside mathematician and physicist communities.


> No, that's Physics

Going on a total tangent, if you'll forgive me, and I ask purely as a curious outsider: do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?

What would have been the very beginning of math, the first human thought, or word or action, that could be called "math"? Are you able to picture this?


  > do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
I'm a bit confused. What exactly is the counterfactual[0] here? If it is hyper-specific to categorizing and describing then I think yes, those creatures could still invent math.

But my confusion is because I'm having a difficult time thinking where such things aren't also necessary consequences of just being a living being in general. I cannot think of a single creature that does not also have some world model, even if that model is very poor. My cat understands physics and math, even though her understandings are quite naive (also Wittgenstein[1] is quite wrong. I can understand my cat, even if not completely and even though she has a much harder time understanding me). More naive than say the Greeks, but they were also significantly more naive than your average math undergrad and I wouldn't say the Greeks "didn't do math".

It necessitates a threshold value and I'm not sure that this is useful framing. At least until we have a mutual understanding of what threshold we're concerned with. Frankly, we often place these contrived thresholds/barriers in continuous processes. They can be helpful but they also lead to a lot of confusion.

  > What would have been the very beginning of math
This too is hard to describe. Mull over the Poincare quote a bit. There's many thresholds we could pick from.

I could say when some of the Greeks got tired of arguing with people who were just pulling shit out of their asses, but that'd ignore the many times other civilizations independently did the same.

I could say when the first conscious creature arose (I don't know when this was). It needed to understand itself (an object) and its relationship to others. Other creatures, other things, other... objects.

I could also say the first living creature. As I said above, even a bad world model has some understanding that there are objects and relationships between them.

I could also say it always was. But then we get into a "tree falls in a forest and no one is around to hear it" type of thing (also with the prior one). "Acoustic vibrations" is a fine definition, but so is "what one hears".

I'd more put the line closer to "Greeks" (and probably conscious). The reason for this is formalization, and I think this is a sufficient point where there's near universal agreement. In quotes because I'll accept any point in time that can qualify with the intended distinction, which is really hard to pin-point. I'm certainly not a historian nor remotely qualified to point to a reasonable time lol. But this also seems to be a point in history often referenced as being near "the birth" and frankly I'm more interested in other questions/topics than really getting to the bottom of this one. It also seems unprovable, and I'm okay with that. I'm not so certain it matters when that happened.

To clarify, I do not think life itself necessitates this type of formalization though. I'm unsure what conditions are necessary for this to happen (as an ML researcher I am concerned with this question though), but it does seem to be a natural consequence of a sufficient level of intelligence.

I'll put it this way, if we meet an alien creature I would be astonished if they did not have math. I have no reason to believe that their math would look remotely similar to ours, and I do think there would be difficulties in communicating, but if we both understand Poincare's meaning then it'll surely make that process easier.

Sorry, I know that was long and probably confusing. I just don't have a great answer. Certainly I don't know the answer either. So all I can give are some of my thoughts.

[0] https://www.inference.vc/causal-inference-3-counterfactuals/

[1] https://existentialcomics.com/comic/245


>Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so.

That's not unique; all quantitative theories allow small modifications. Then you select the most parsimonious theory.


Coding is math in the sense that coding is a subset of math.

Mathematics is a very extensive field, and covers a vast amount of subjects.

For the same reason it can't be said that mathematics is equivalent to coding, as there are many things in mathematics that are not relevant for coding.

However, by far the most interesting parts of coding are definitely related to mathematics.


Sure, lambda calculus is math. To call assembly or typical imperative C math, at least in the same sense, is a bit of a stretch.

Working with different objects doesn't make it any less math. Just because you can derive calculus from set theory (analogous to assembly or even binary here) doesn't make calculus "not math".

Math is about abstractions and relations. See the Poincare quote again.

Plus, the Programming Languages people would like to have a word with you. Two actually: Category Theory. But really, if you get them started they won't shut up. That's either a great time or a terrible time, but I think for most it is the latter.


Wheeler: It from bit. Otherwise put, every it—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.

That is not in contention with what I said, now is it?

Wheeler is arguing that if a tree falls in the forest, and there is nobody to hear it, that it still makes a sound because there are things that interact with the sound. But if the tree fell in a forest and there was nothing else in the universe then there is no sound because there is no observation.

It helps to read the whole thing[0] and to understand the context of the discussion. This is metaphysics and a deep discussion of what the nature of reality is. Ian Hacking has a good introduction to the subject, but I find people develop grave misunderstandings when they do not also have the strong math and physics background necessary to parse the words. Even people who understand the silliness of "The Secret", and that an observer need not be human, often believe that this necessitates a multiverse. A wildly convoluted solution to the problem of entropy not being invertible. Or, closer to computer terms, a solution that insists that P = NP. There is information lost.

If you wanna argue that there's no difference between the word cup and a cup itself because there is no word without the observer who has the language, then yeah.

[0] https://historyofinformation.com/detail.php?id=5041


If nature is information theoretic, then physics is mathematics.

But that doesn't necessarily mean successful programmers are good at conventional math. This is why certain people in the department are identified as "math people".

I'm not sure why you'd think I disagree. It seems you understood I argued that it's unhelpful to make the distinction between math and "conventional" math

But I'll refer you to a longer conversation if it helps https://news.ycombinator.com/item?id=43872687


> Coding IS math.

> Not "coding uses math", I mean it is math.

Arguably writing a novel is math, if you use the right definition of math. But sometimes it's more helpful to use more informal definitions that capture what people mean than what is technically accurate.


By that same logic you could also say that language is math. In fact, I think your quote kind of disproves your point, because in coding the content/state of a program is more important than the form.

Coding used to be very close to pure math (many early computer science classes were taught in the Math Department in universities) but it has been so far abstracted from that to the point that it is its own thing and is as close to math as any other subject is.


  > By that same logic you could also say that language is math
Not quite, but the inverse is true. The language-to-math direction doesn't work because of a lack of formalism. I can state incomprehensible sentences or words. (There's an advantage to that in some cases!) But when you do that with code you get errors, and even if you do it with math, it's just that there's no compiler or interpreter that yells at you.

>I can state incomprehensible sentences or words.

since you can express paradoxes with math, perhaps it's not that different.


I think you misunderstand what "paradox" means. While it can mean "self-contradictory" it can also mean "contrary to one's expectation." Math uses both, but in very different contexts.

The contradiction is used in proof formulation, specifically to invalidate some claim. I don't think this is what you're implying.

The latter is what it contextually sounds like you're stating; things like the Banach-Tarski Paradox. There's no self-contradiction in that, but it is an unexpected result and points to the need to refine certain things like ZFC set theory.

I'd also stress that there are true statements which cannot be proven within axiomatic systems. The Halting Problem is an example of what Gödel proved. But that's not contradictory, even if unexpected or frustrating.


forming relationships between objects

This is vague and doesn't mean anything. People can't even agree what 'objects' are and people did a lot of programming before the term 'object' was invented.

Programming is fundamentally about instructions and data. A yin and yang of two things that are completely different, even if people can occasionally mix them and conflate them.


Category theory is often jokingly called the study of dots and arrows. What's a dot? Anything. What's an arrow? A relationship.

I'm surprised I hit a nerve with so many people. I'm quoting someone who's considered one of the greatest mathematicians. Obviously I don't know the backgrounds of people here, but it seems like programmers have strong opinions on what math is that disagree with what mathematicians say they do.

https://en.wikipedia.org/wiki/Category_theory


Now it's not math it's 'category theory' and that's dots and arrows and dots are anything.

At what point does the abstract description just not offer any useful insight anymore?

I'm surprised I hit a nerve with so many people.

A lot of people had very good explanations for why you're pretty far off in trying to say two things are the same.

Programming is much more like building a machine than any sort of straight math. There is state, there is interactivity, there is performance, there is IO and there are real world implications to all of it.

Saying they are the same is like saying gardening is plumbing just because you sometimes use a hose.


  > [No,] it's not math it's 'category theory' 
That's a wild claim considering Category Theory is a branch of mathematics

  | Category theory is a general theory of mathematical structures and their relations.
  - https://en.wikipedia.org/wiki/Category_theory
It is necessary that you provide an alternative definition as to what "category theory" is, though I suspect it will make many category theorists and mathematicians upset.

  > A lot of people had
A lot of non-mathematicians disagreed with mathematicians. https://news.ycombinator.com/item?id=43882197

That's a wild claim considering Category Theory is a branch of mathematics

It's not a wild claim since you misquoted me.

A lot of non-mathematicians disagreed with mathematicians.

Mathematicians can claim whatever they want, when it comes to programming, programmers understand it better and they're trying to explain to you why this is nonsense. Vanderbilt claims to be "the harvard of the south" but wouldn't you know it, harvard doesn't claim to be "the vanderbilt of the north".

Show me programming languages designed by mathematicians to be 'mathematically pure' and I'll show you a language that hasn't been used to ship software that people want to use.


It's math the way everything is physics, as well as the way physics is philosophy (as the natural sciences started out as branches of philosophy).

In other words: It's math in a sense that for most of us with a computer science background is often not very relevant to how we work.


>Coding IS math.

No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations, which account for a lot of programming work. Sure, the underlying machine code is all based on math, but the higher level programming doesn't need to involve a single math equation for it to be useful. Let's see where the goalposts move now...


Mathematics is not "equations". Most of mathematics is not related to calculus either.

For anything CRUD, you put the data into a database. The underlying database is relational, or a key->value store.

If it's relational, that's one branch of mathematics. If it's another kind of database, it's another branch of mathematics. Mathematics is extensive, and covers more things you can imagine at a glance.

The main difference between writing mathematics and programming, and this applies to any form of programming, is that in mathematics writing a formal proof is amazingly close to this: you write the program, you are also the compiler, and you are the CPU, performing all operations yourself. With programming you only have to write the program.

Source: On one hand I have studied pure mathematics (not the simplified applied mathematics that are taught in engineering, which is mostly equations), on the other hand I have been working as a software developer for over 15 years.


  > It doesn't take math to perform CRUD operation
Yes it does. Just because the objects you are working with aren't numbers doesn't mean it isn't math. In fact, that's my entire point. It is why I quoted Poincare in the first place. He didn't say "numbers" he said "objects".

Sorry but you are wrong. "CREATE, READ, UPDATE, DELETE" operations are not mathematical in nature. It doesn't matter what the data is or how esoteric you want to get with it - a programmer doesn't need any math at all for simple CRUD. You're trying to move the goalposts all the way back to the 1800s to win a pointless internet argument.

A while back I made this[0] as a quick demo of generic batching for CRUD requests. It's very much practically oriented (it's for performance optimization while making the code nicely reusable/not muddying up the business logic), but also felt quite a bit like the same sort of things I did in my math degree.

Actually I'm starting to wonder whether the thing that made university math easy for me was I quickly developed a good internal "type checker"/"type inference engine" in my head, and that helped make the next steps of proofs seem straightforward.

[0] https://gist.github.com/ndriscoll/881c4f5f0398039a3a74543450...


I understand you have strong opinions, I just don't understand why.

In math we are highly concerned with structures and abstraction. We have things called operators; they aren't just addition and multiplication, and we also use those words to describe different operations. There are things like groups, rings, fields, and algebras. Yes, plural.

The purpose of these things is to create logical frameworks. It matters not what the operators are. Nor does it matter what objects we operate on. Poincaré is explicitly saying this.

The part you're not understanding is the abstraction. This is what math is about. It is also why the Programming Language people are deeper in the math side and love Category Theory (I have a few friends whose PL dissertations are more math heavy than some math dissertations I've seen). It's no surprise. What's a function? How do you abstract functions? How do you define structure? These are shared critical questions. PL people are more concerned with types, but I wouldn't say that makes them any less of a mathematician than a set theorist.

We can perfectly describe the CRUD operations with set theory. Do most programmers need to concern themselves with this? Absolutely not. But is it something the people designing those operations and systems are thinking about? Yes.
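
A toy sketch of that description (the names are mine): a table as a set of rows, and CRUD as plain set operations.

  final case class Row(id: Long, payload: String)
  type Table = Set[Row]

  def create(t: Table, r: Row): Table = t + r                      // union with {r}
  def read(t: Table, p: Row => Boolean): Table = t.filter(p)       // selection by predicate
  def update(t: Table, p: Row => Boolean, f: Row => Row): Table =
    t.map(r => if (p(r)) f(r) else r)                              // image under f
  def delete(t: Table, p: Row => Boolean): Table = t.filterNot(p)  // set difference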

I'd encourage you to learn some set theory, abstract algebra, and maybe a bit of cat theory. It'll make these things pretty clear. But I'd caution about having strong opinions on topics you aren't intimately familiar with. Especially when arguing with those that are. Frankly, CRUD is a terrible example. I'm confident a quick google search (or asking a GPT) would quickly point you to relational algebra. It's discussed in most database classes. It certainly was in the one I was an assistant for.


> I understand you have strong opinions, I just don't understand why.

As someone that is "on the other side of the fence", and getting flamed for it, maybe I can shed some light since I have the other perspective as well. IMO, The reason for not seeing eye to eye is (for example) akin to saying "word problems are math". (Thinking of a grade school word problems for common reference). Yes, they are readily mapped to mathematical models that can solve the word problem & perhaps almost indistinguishably so. Though no - word problems are not math. Word problems are a series of phrases and words. That's where the talking past each other comes in... Different interpretations of "word problems are math", or "code is math". It's seemingly not clear whether we are talking about logical 'implies', 'element of', or 'equals'.

Which goes to "We can perfectly describe the CRUD operations with set theory.", we all agree there. That is not readily conveyed though when writing things like "code is math".


> word problems

That is a good analogy, except programming languages are formal languages, not natural languages.

> they are readily mapped to mathematical models

With code we are not talking about something that is mapped to mathematical models. Code is not modelled by mathematics, it is defined by mathematics (denotational semantics and operational semantics).

"Code is math" is true in a theoretical sense. Practically, coding doesn't always feel like what is commonly thought of as "doing mathematics", but it is essentially mathematical.

https://en.wikipedia.org/wiki/Formal_language

https://en.wikipedia.org/wiki/Semantics_(computer_science)


  > That's where the talking past each other comes in...
Then sorry, but that's your own damn fault. I was clear about my definition and quoted a famous mathematician to give some authority, to not be "trust me even though I'm some rando". The way to respond to that is not "you're wrong, trust me, I'm some rando".

Yes, I agree we're misunderstanding each other because we're using different definitions, but "you've" rejected "mine" without defining "yours" while expecting everyone to understand. Of course that'll lead to confusion. You can reject the mathematicians' definition of math, but you sure gotta say more than "trust me", and it's a pretty wild thing to do, especially as non-mathematicians.

The problem here is one side who's dabbled in cooking says "a chef makes desserts" and chefs are responding "we do a lot more than that". Maybe there are a few chefs that go "yeah, all I do is dessert", and everyone points to that while ignoring that the second part of their sentence is "but that's just my speciality." Wouldn't you think that conversation is insane? The only reason it's obviously so is because we all know what a chef is and agree on the definition. But who is better qualified to define the chef's job? The chef or the consumer?


The way I'm responding I'd more characterize as: "wait, if what you are saying is true, then this other thing should be true too, but it does not seem to be. That would indicate what you are saying is not true."

In another thread, you characterized my response as stating: " ¬(A ↦ B) ⟹ ¬(B ↦ A)" (and this is a great example of language not being math, but math being language!). That was not at all my claim.

My claim is "I believe you are saying 'A = B'. It appears that 'B != A', therefore 'A != B'." My only claims are

(1) I believe you are writing to convey that you mean Math IS Language in the sense they are equal, identical, interchangeable, and fully equivalent, and bi-directionally so

(2) that: B != A

The only results can either be:

- "yeah, because B != A, the statement A = B is not true"

- Your claim (1) is false, I'm not actually saying "A = B"

- Your claim (2) is false, "B = A" is in fact true. I would find that to be an interesting assertion and would have loved to explore more why you think that.


You could say that math is behind the structure of a leaf, but the farmer doesn't care about that. Keep moving the goalpost all you want, I don't have time to argue these pointless things, and I stopped reading after the first sentence of your comment. I'm done here.

You seem really angry at the field of Computer Science for some reason.

We are talking about textbook CS fundamentals.

https://en.wikipedia.org/wiki/Semantics_(computer_science)


CS isn't programming, it doesn't write code.

CREATE, READ, UPDATE, DELETE are fundamentally mathematical in nature.

In a typical implementation these are database operations. That involves relational algebra operations, state transitions, boolean logic.

The READ part can be a very complex SQL query (composed of many algebraic operations) but even the simplest query (SELECT * from t where ID = 1) is filtering a set based on a predicate. That is mathematics.
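
In relational-algebra notation, that simplest query is a selection (the same statement, just in math dress):

  \sigma_{\mathrm{ID}=1}(t) \;=\; \{\, x \in t \mid x.\mathrm{ID} = 1 \,\}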

No one is moving goalposts. Set theory and logic are at the foundations of mathematics.


It's almost the argument of programming vs computer science coming out here.

This is math:

{x | x.id = 1}

OTOH, a SQL query is a SQL query.

This thread is hilarious though. It's like

- Cashier: here is your change.

- Customer: you did math!

- Cashier, no, I gave you change.

- Customer: that IS math!

- Cashier: You mean, I used math to give you change?

- Customer: No, giving change doesn't use math, it IS math!!!!" [2]

=D

Moving along... FWIW, indeed SQL was created to model set theory (relational algebra and tuple calculus); the close relationship is no accident, of course [0][1].

> No one is moving goalposts

I feel too they are.

First goal post:

> Coding IS math.

>> No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations

Second goal post:

> CREATE, READ, UPDATE, DELETE are fundamentally mathematical in nature

CRUD is closely related to SQL, and SQL is closely related to various mathematics. Are they identical and therefore equivalent? No - because your database is not going to like it when you write "{x | x.id = 1}", and the Oracle DB might not like something that you can write for your Postgres DB.

[0] https://simpleprogrammer.com/mastering-sql/

[1] https://en.wikipedia.org/wiki/SQL

[2] To quote: """Not "coding uses math", I mean it is math""" @ https://news.ycombinator.com/item?id=43872771


The problem you're running into is that people who have some "mathematical maturity" don't get bogged down in notation, so it's difficult for them to see the distinction you're trying to draw between e.g. `{ x∈S | x.id = 1}` and `select x from S where x.id = 1`[0]. You say "a SQL query is a SQL query" and they just think "yes, which is also obviously a mathematical expression".

Computer programs are proofs[1]. This is intuitively and also formally true. You would agree writing proofs is doing math, yeah? Then obviously writing a computer program is also doing math.

Like I have a degree in math and have been a software engineer for over a decade. I do not know what distinction people are trying to get at. It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.

[0] Modulo details like NULLs and multi-set semantics, but surely that's not the distinction?

[1] Up to isomorphism
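
To make [1] concrete with a small sketch (the names are mine): under Curry-Howard, writing a total function is literally writing a proof of the implication its type expresses.

  // A => B read as "A implies B": composing functions proves
  // transitivity of implication.
  def trans[A, B, C](f: A => B, g: B => C): A => C = a => g(f(a))

  // (A, B) read as "A and B": projection proves conjunction elimination.
  def fst[A, B](p: (A, B)): A = p._1

  // Either[A, B] read as "A or B": case analysis proves disjunction
  // elimination.
  def orElim[A, B, C](e: Either[A, B], f: A => C, g: B => C): C = e.fold(f, g)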


Code is logical in nature and is defined by mathematics.

I'd agree code is usually governed by mathematics, not defined by it though.

Goes back to this ridiculous proposition:

- Cashier: You mean, I used math to give you change?

- Customer: No, giving change doesn't use math, it IS math!!!!" [2]

The proposition is that "code IS math", not defined by, not uses, not inspired by, not relies on, not modeled after, but IS.


Just because what you're working on specifically is an equivalent of middle-school math doesn't tell us much about the field as a whole.

Though middle-school or not, it's still math.


All of that code is a series of logical statements and expressions. Mathematical logic.

But the CRUD logic is so basic and boring, so obvious, that it doesn't require any thought.


>All of that code is a series of logical statements and expressions. Mathematical logic.

Which code? The machine code that underlies everything? Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function? Not all programming has to be mathematical or even logical at a high enough level, and I say this as someone who's been coding assembly language for 40 years.


> Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function?

Those lines ARE mathematical logical statements.

Each line defines logical operations that are executed by the computer.

Same for high level or low level programming. It's all logic.


Storing a string in some abstraction is not a mathematical operation. I'm done with this thread, it's going way too far down too many rabbit holes. The quarks that make up the matter that the computers are made of are pure "math". There, now I've moved the goalposts.

That is a narrow perspective of mathematics and computer science.

Assigning a variable is a mathematical operation. It is a logical state transition. This is formalized in Programming Language Theory.
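
A minimal sketch of what that formalization looks like (a toy language of my own, not any particular textbook's): an assignment denotes a function from states to states.

  // A state maps variable names to values; running x := v takes
  // state sigma to sigma updated at x, i.e. sigma[x -> v].
  type State = Map[String, Int]
  final case class Assign(name: String, value: Int)

  def step(sigma: State, a: Assign): State = sigma.updated(a.name, a.value)

  // step(Map("x" -> 1), Assign("x", 2)) == Map("x" -> 2)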

We are not talking about quarks. We are talking about lines of code which are objectively mathematical statements, no matter how non-mathematical they seem to you subjectively.

You start with a practical problem to solve, and a computing machine that can perform logical operations on data. Your job is to figure out a correct sequence of logical operations that will solve the problem. The program that you have written is a mathematical structure. It is mathematics.

https://en.wikipedia.org/wiki/Semantics_(computer_science)


You confuse the map with the territory. By the same logic you can say quarks are mathematics, because they are modeled by a theory that has some mathematics in it.

Our mathematical models of reality (i.e. Physics) are not the same as reality. So yes, by the same logic "quarks" are mathematics, but only in the same way that a "cup" is English. The quark was a great example, considering we can't observe it directly and rely purely on mathematical models, but the same argument would still hold true for a bowling ball. Physics is our description of reality, not reality itself. Our description of reality heavily leverages the language of math, as that language was developed to enforce rules of consistency. Much the same way we design our programming languages, which is why programming language people study so much more math than your typical CS person.

If you're going to accuse someone of confusing the map with the territory, you really should make sure you aren't making the same error.


How does math help with programming languages? What does math say about zero-based indexes? How do you prevent off-by-one errors? How do you prevent buffer overflows? These are ergonomics problems.

It is hard to answer because of exactly what ndriscoll said[0]

  > It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.
As ndriscoll suggests, it is tautological. I mean, look at what I said. I really need you to hear it. I said that coding is math. So what I hear is "How do programming languages help with programming languages?" Why are you expecting me to hear anything different?

  > What math says about zero based indexes?
Start at 0? Start at 1? Who cares, it is the same thing. The natural numbers, non-negative integers, integers, even integers, who cares? They're the same thing. And who cares about indexing at 0 or 1 in programming? That's always been a silly argument that's inconsequential.

  > How do you prevent off by one errors?
By not being off by one? What's the question? Like being confused about whether you start at 0 or at 1 and how to get the right bound? It is a shift from one to the other, but they are isomorphic; we can map perfectly between them. But I really don't get the question. You can formalize these relationships with equations, you know. I know it isn't "cool", but you can grab a pen and paper (or a whiteboard) and write down your program structure if you often fall for these mistakes. This seems more about the difficulty of keeping track of a lot of things in your head all at once.
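
The isomorphism in question, as a trivially small sketch:

  // 0-based and 1-based indexing carry exactly the same information;
  // the two conventions are inverse shifts of each other.
  def toOneBased(i: Int): Int = i + 1
  def toZeroBased(j: Int): Int = j - 1
  // toZeroBased(toOneBased(i)) == i for every i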

  > How do you prevent buffer overflows?
By not going over your bounds? I'm so confused. I mean you are asking something like "if f(x) = inf when x > 10, how does math help you prevent the output of the function from being infinite?"

Maybe what will help is seeing what some of the Programming Languages people do and why they like Haskell[1].

Or maybe check out Bartosz Milewski[2,3]. His blog[2] is titled "Bartosz Milewski's Programming Cafe: Category Theory, Haskell, Concurrency, C++". It may look very mathy, and you'd be right(!), but it is all about programming! Go check out his Category Theory course[3]; it is _for programmers_.

Don't trust me, go look at papers published in programming language conferences [4]. You'll find plenty of papers that are VERY mathy as well as plenty that are not. It really depends on the topic and what is the best language for the problems they're solving. But you'll certainly find some of the answers you're looking for.

Seriously, don't trust me, verify these things yourself. Google them. Ask an LLM. I don't know what to tell you because these are verifiable things (i.e. my claims are falsifiable!). The only thing you need to do is look.

[0] https://news.ycombinator.com/item?id=43882197

[1] https://excessivelyadequate.com/posts/isomorphisms.html

[2] https://bartoszmilewski.com/

[3] https://www.youtube.com/watch?v=I8LbkfSSR58&list=PLbgaMIhjbm...

[4] https://www.sigplan.org/Conferences/


Nonmathematical problems are difficult to answer when you try to find mathematical answers. It should be obvious why it doesn't work.

Mentioning Scala is ironic: it's very light on math-speak, and to begin with it was created to unify object-oriented and functional programming, which is mathematically meaningless, because both are Turing complete and thus equivalent, tautological.

>So what I hear is "How programming languages helps with programming languages?"

Oh, right: in mathematics an axiom is an argument, but in reality it isn't. In programming you should assert what you assume, otherwise your assumptions can be wrong due to divergence from reality. But there is no reality in mathematics, only fantasy, so you can't understand this with mathematics alone.


> but there no reality in mathematics, only fantasy, so you can't understand this with mathematics alone

No. Code is an abstraction. It exists as a logical structure, grounded in mathematical logic.

https://news.ycombinator.com/item?id=43888917


That may be true to some extent, but I think you are missing the point. Quarks are physical in nature, and code is logical in nature. Programs themselves are formal systems. Code isn’t just modelled by mathematics, it is defined by mathematics.

In the case of code, we can argue that the map is the territory.


Code is logical if you define logic as reasoning in general, broader than mathematics, and since it runs in a physical environment, it interacts with a messy world. Code is defined by business requirements, and there's no mathematics for that.

Now you’re talking about the human activity of writing code, not the code itself.

Those business requirements are inputs to the process of writing the code.

Once the code is actually written, that exists as a formal logical system, defined by mathematics, not business requirements.


The discussion started about whether a human needs math skills to write code. That's what I mean when I say programming isn't mathematics. The meaning of code is defined by humans; how do you intend code to be defined by mathematics? The human first imagines mathematical formulas, and then by some unknown process they become code? I don't think anybody does it like that. You start with requirements, intuitively guess an architecture, then decompose it. It works more like a Cauchy problem (but less numeric, more conceptual: you have an owl, now draw it), but I don't think anybody models it like that.

>Once the code is actually written, that exists as a formal logical system, defined by mathematics

I still think that's not code, but your favorite model of code. For a spellchecker, language is defined by mathematics too: it splits text into words by whitespace, then for each word not found in the dictionary it selects the best matches and sorts them by relevance. Oh, and characters are stored as numbers.


> The discussion started about whether human needs math skills to write code.

Writing code IS a math skill. When writing code you are writing logic in a formal system. Logic is mathematics.

You may be thinking that mathematics is just like doing arithmetic or solving equations. It is way deeper and broader than that.

> I still think that's not code, but your favourite model of code

Code is not just modelled through mathematics, it is actually defined by mathematics. It is fundamentally a mathematical construct, grounded in formal semantics.

https://en.wikipedia.org/wiki/Semantics_(computer_science)


That's because mathematics doesn't need to match reality, so it's happy being ignorant about modelling? Anything you think is automatically true.

You are missing the point.

Code is not modelled mathematically, it is defined mathematically.

It exists as an abstraction which is fully defined by operational semantics and denotational semantics, not modelled or approximated.

In the counter example of a quark, that exists in nature and is modelled by mathematics, but not defined by mathematics.

https://en.wikipedia.org/wiki/Formal_language

https://en.wikipedia.org/wiki/Semantics_(computer_science)



