
Good code doesn’t just solve a problem, it solves it in a way that’s readable and modular.

I think the problem-solving part of coding requires math skills, while the organization part requires writing skills. The organization part affects the problem-solving part, because if you write messy code (code you can't reread once you've forgotten it, or can't extend without rewriting) you'll quickly get overwhelmed.

Writing large math proofs also requires organization skills, since you’ll refer to earlier sections of your proof and may have to modify it when you encounter issues. But to me, math seems to have more “big steps”: sudden insights that can’t be derived from writing (“how did you discover this?”), and concepts that are intrinsically complicated so one can’t really explain them no matter how well they can write. Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.




Most coding doesn't need much of any math past boolean logic and very basic set operations. I'm much more likely to spend my time studying DB and interface schemas to understand how something works than doing a lot of mathy fiddling. Sure, some people write game engines and such, but even much of 3D graphics doesn't need anything more complicated than the first half of a linear algebra course.


  > Most coding doesn't need much of any math past boolean logic and very basic set operations
Coding IS math.

Not "coding uses math", I mean it is math.

  Mathematicians do not deal in objects, but in relations among objects; they are free to replace some object by others so long as the relations remain unchanged. Content to them is irrelevant; they are interested in form only.
  - Poincare[0]
I don't know how you code, but I don't think I'm aware of code that can't be reasonably explained as forming relationships between objects. The fact that we can trace a program seems to necessitate this.

[0] https://philosophy.stackexchange.com/questions/22440/what-di...


> Coding IS math.

Programming is an expression of logic, which is absolutely mathematics.

But then we also have to think about naming variables and classes, structuring our code so that it is more readable by other developers, and so on. That's less about formal reasoning and more about communication.

There is an engineering aspect to programming (prototyping, architecture, optimization, etc). It's a combination of mathematics and engineering. Software Engineering.


> Coding IS math.

The murky world of software patents would like a word about that.

For me, coding also feels artistic at times. But we see artistic beauty in mathematics all the time, so no wonder. Often I look at ugly code, and maybe that is subjective, but fixing that code also makes it feel prettier.

It is clear all developers have their own artistic style, and probably that is why there is so much disagreement in the industry. Maybe we are lacking the pure mathematical language to describe our intent in a more beautiful and precise way that is more clearly the-right-way. As in how we find beauty in simple physics equations.


Do you think mathematicians don’t have to think about naming variables and structuring proofs?


The presumably-mathematicians who wrote my Algorithms textbook (CLRS) didn't seem to think it worth giving the variables in their algorithms useful names. They just use i, x, etc. all over the place and don't even consider giving things actual names. This makes the book much less accessible than it would be if they treated it more like most people write code.


  > They just use i, x, etc. all over the place
I do agree with your point btw, but I did want to note that there are good conventions around symbols. The brevity is heavily influenced by the medium. Variable names sucked when you had punch cards. It's still burdensome to write long names when using paper, chalkboard, whiteboard, or any system that doesn't have autocomplete.

In general, lower case letters are used as constants, excluding x,y,z,t,i,j,k (sometimes u,v,w). It isn't a hard rule, but there's a strong preference to begin at the beginning of the alphabet for these. Capital letters are usually reserved for things like Variable Sets (like random variables). Greek letters need context for constants or variables. BB and Cal typefaces for sets (e.g. Real Numbers, Integers). And much more.

I think a lot of the difficulty in it is that these "rules" or patterns are generally learned through usage and often not explicitly stated. But learning them can really help read unfamiliar topics and is why "notation abuse" leads to confusion. But after all, math is all about abstraction so technically any symbol will do, but no doubt some are (significantly) better than others for communicating.

  There are two hard things in Computer Science:
    - Cache Invalidation
    - Naming Things
    - Off-by-One Errors


Picking what letters to use for what things can still be a struggle.

Mathematicians do have to deal with difficulties in naming things.


They just invent new characters.


What do you mean? Most letters used in math are just Latin and Greek letters in different fonts or stylizations.


That's how it is in applied mathematics. That's a hardcore computer science textbook. If your job is computer science research, inventing new algorithms, that is the kind of book you will get used to. There are better options for practical learning.


Of course they do, but that part is not "mathematics". It is communication. If they use the English language to write the proof that doesn't mean English is also mathematics.


Writing proofs is just as important to mathematics as writing code is to programming.


Yes it is important but that's not the point.

Structuring sentences and naming variables so that it is easier for other people to understand is less about formal mathematical reasoning, and more about communication.

You could name a variable x, y, or Banana, but it doesn't change the logic.


Neither is it "the point" in programming. You should be concerned with communication and have every right to get upset when someone is being needlessly convoluted but that's as much of a point in programming as it is in math, physics, or any domain.

I mean the reason we get mad at this is because it is someone destroying "society" in some sense. Even if that society is your team or just the set of programmers. It would be a pretty dick move were I to just use a word that significantly diverged from conventional meaning and expect you to mull it over. Similarly if I dropped an unknown, out-of-context math equation. It would be meaningless.

And I'm on your side, really! I strongly advocate for documenting. And let's be real, the conclusion of your argument more strongly argues for documentation than good variable names. Because variable names are much more constrained and much more easily misinterpreted considering how any word has multiple definitions. Surrounding code is often insufficient to derive necessary contextualization.

https://news.ycombinator.com/item?id=43874738


>Programming is an expression of logic, which is absolutely mathematics.

and also philosophy.


As in GNU vs. Microsoft? Or something more foundational in logic?


Classes are a platonic ideal representation of reality?


That's another gap: mathematical classes are ideal representation of fantasy, programming classes are leaky representation of reality.


I don't think that quote really supports coding and math being equivalent. To me, the quote provides an abstraction of math through a structuralist perspective. Language can also be viewed through this abstraction. I think coding could share the abstraction, but that doesn't make the three of these fields equivalent.


  > coding and math being equivalent
Please see lambda calculus. I mean equivalent in the way mathematicians do: that we can uniquely map everything from one set to another
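To make that concrete, here is a minimal sketch (plain Scala, names illustrative) of Church numerals: the natural numbers encoded purely as functions, so arithmetic literally becomes program composition.

  // Church numerals: naturals encoded as functions. Computation here is
  // nothing but "relations among objects" in Poincare's sense.
  object ChurchSketch {
    type Church[A] = (A => A) => A => A

    def zero[A]: Church[A] = _ => x => x
    def succ[A](n: Church[A]): Church[A] = f => x => f(n(f)(x))
    def add[A](m: Church[A], n: Church[A]): Church[A] = f => x => m(f)(n(f)(x))

    // Interpret a numeral back into an ordinary Int.
    def toInt(n: Church[Int]): Int = n(_ + 1)(0)

    def main(args: Array[String]): Unit = {
      val two = succ(succ(zero[Int]))
      println(toInt(add(two, succ(two)))) // 5
    }
  }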


This seems like the setup to a joke about how mathematicians don't know how to communicate to ordinary folks


Well a lot of people did wildly misunderstand the Poincare quote. To me it is obviously about abstraction and I think this is true for any mathematician. I thought it would also be natural for programmers considering we use "object" quite similarly, if not identically. So... maybe it is or maybe this is the joke.


"There are those who tell us that any choice from among theoretically-equivalent alternatives is merely a question of taste. These are the people who bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans. They are malicious idiots. The only punishment which could stand a chance at reforming these miscreants into decent people would be a year or two at hard labor. And not just any kind of hard labor: specifically, carrying out long division using Roman numerals." — Stanislav Datskovskiy


No one is suggesting programming with lambda calculus. But it would be naïve to think lambda calculus isn't important. They serve different purposes.

We didn't:

  bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans.


> Not "coding uses math", I mean it is math

> I mean equivalent in the way mathematicians do

That sounds like you're backing off from your original claim, probably because it is impossible to defend.

That you can use mathematics to describe code doesn't seem very different from using math to describe gravity, or the projected winner in an election, or how sound waves propagate.

Isn't the primary purpose of math to describe the world around us?

Then it shouldn't be surprising that it can also be used to describe programming.

In the real world, however, software engineering has nothing to do with mathematical abstractions 99% of the time


A programmer constructs a function from some data type to another while a mathematician constructs a function from witnesses of some proposition to another?

Though interpreting a CRUD app as a theorem (or collection of theorems) doesn’t result in an interesting theorem, and interpreting a typical theorem as a program… well, sometimes the result would be a useful program, but often it wouldn’t be.


Interpreting CRUD apps (or fragments of them) as theorems is interesting (given a programming language and culture that doesn't suck)! e.g. if you have a function `A => ZIO[Any,Nothing,B]`, then you have reasonable certainty that, barring catastrophic events like the machine going OOM or encountering a hardware failure (essentially, things that happen outside of the programming model), given an A, you can run some IO operation that will produce a B and will not throw an exception or return an error. If you have an `A => B`, then you know that given an A, you can make a B. Sounds simple enough but in practice this is extremely useful!

It's not the type of thing that gets mathematicians excited, but from an engineering perspective, such theorems are great. You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.

It's actually the halting problem that I find is not relevant to practical programming; in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data. The hard parts have been neatly tidied away into databases and operating systems (which for practical purposes, you can usually import as "axioms").
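To make the "type signatures as small theorems" idea above concrete, here is a minimal plain-Scala sketch. The IO type is a toy stand-in for ZIO so the example stays self-contained; the point is that when the error channel is Nothing, the types alone promise that running the effect yields a value.

  // Toy effect type standing in for ZIO[Any, E, A]; not the real library.
  sealed trait IO[+E, +A] { def run(): Either[E, A] }
  object IO {
    def succeed[A](a: => A): IO[Nothing, A] = new IO[Nothing, A] {
      def run(): Either[Nothing, A] = Right(a)
    }
    def fail[E](e: => E): IO[E, Nothing] = new IO[E, Nothing] {
      def run(): Either[E, Nothing] = Left(e)
    }
  }

  object TypesAsTheorems {
    // "Theorem": for every String we can produce an Int, with no error case.
    def length(s: String): IO[Nothing, Int] = IO.succeed(s.length)

    def main(args: Array[String]): Unit =
      println(length("crud").run()) // Right(4)
  }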


  > It's not the type of thing that gets mathematicians excited
Says who? I've certainly seen mathematicians get excited about these kinds of things. Frequently they study Programming Languages and will talk your ear off about Category Theory.

  > You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
Sounds like math to me. A simple and imprecise math, but still math via Poincare's description.

  > in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data
In common settings. But those settings also change. You may see those uncommon settings as not practical or useful but I'd say that studying those uncommon settings is necessary for them to become practical and useful (presumably with additional benefits that the current paradigm doesn't have).


I think we're in agreement. My comment about the halting problem was meant to refer to Rice's theorem (the name was slipping my mind), which I occasionally see people use to justify the idea that you can't prove interesting facts about real-world programs. In practice, real-world programming involves constantly proving small, useful theorems. Your useful theorem (e.g. `Map[User,Seq[Account]] => Map[User,NetWorth]`) is probably not that interesting to even the category theorists, but that's fine, and there's plenty you can learn from the theorists about how to factor the proof well (e.g. as _.map(_.map(_.balance).sum)).
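Spelled out as compilable code (User, Account, NetWorth are illustrative stand-ins), that little theorem and its proof look roughly like this:

  object NetWorthSketch {
    final case class User(name: String)
    final case class Account(balance: BigDecimal)
    final case class NetWorth(total: BigDecimal)

    // The "theorem": Map[User, Seq[Account]] => Map[User, NetWorth].
    def netWorths(accounts: Map[User, Seq[Account]]): Map[User, NetWorth] =
      accounts.map { case (user, accts) =>
        user -> NetWorth(accts.map(_.balance).sum)
      }

    def main(args: Array[String]): Unit =
      println(netWorths(Map(User("ada") -> Seq(Account(10), Account(5)))))
      // Map(User(ada) -> NetWorth(15))
  }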


  > Isn't the primary purpose of math to describe the world around us?
No, that's Physics[0]. I joke that "Physics is the subset of mathematics that reflects the observable world." This is also a jab at String Theorists[1].

Physicists use math, but that doesn't mean it is math. It's not the only language at their disposal nor do they use all of math.

  > software engineering has nothing to do with mathematical abstractions 99% of the time
I'd argue that 100% of the time it has to do with mathematical abstractions. Please read the Poincare quote again. Take a moment to digest his meaning. Determine what an "object" means. What he means by "[content] is irrelevant" and why only form matters. I'll give you a lead: a class object isn't the only type of object in programming, nor is a type object. :)

[0] Technically a specific (class of) physics, but the physics that any reasonable reader knows I'm referencing. But hey, I'll be a tad pedantic.

[1] String Theory is untestable, therefore doesn't really reflect the observable world. Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so. But we're getting too meta and this joke is rarely enjoyed outside mathematician and physicist communities.


> No, that's Physics

Going on a total tangent, if you'll forgive me, and I ask purely as a curious outsider: do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?

What would have been the very beginning of math, the first human thought, or word or action, that could be called "math"? Are you able to picture this?


  > do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
I'm a bit confused. What exactly is the counterfactual[0] here? If it is hyper-specific to categorizing and describing then I think yes, those creatures could still invent math.

But my confusion is because I'm having a difficult time thinking where such things aren't also necessary consequences of just being a living being in general. I cannot think of a single creature that does not also have some world model, even if that model is very poor. My cat understands physics and math, even though her understandings are quite naive (also Wittgenstein[1] is quite wrong. I can understand my cat, even if not completely and even though she has a much harder time understanding me). More naive than say the Greeks, but they were also significantly more naive than your average math undergrad and I wouldn't say the Greeks "didn't do math".

It necessitates a threshold value and I'm not sure that this is useful framing. At least until we have a mutual understanding of what threshold we're concerned with. Frankly, we often place these contrived thresholds/barriers in continuous processes. They can be helpful but they also lead to a lot of confusion.

  > What would have been the very beginning of math
This too is hard to describe. Mull over the Poincare quote a bit. There's many thresholds we could pick from.

I could say when some of the Greeks got tired of arguing with people who were just pulling shit out of their asses, but that'd ignore many times other civilizations independently did the same.

I could say when the first conscious creature arose (I don't know when this was). It needed to understand itself (an object) and its relationship to others. Other creatures, other things, other... objects.

I could also say the first living creature. As I said above, even a bad world model has some understanding that there are objects and relationships between them.

I could also say it always was. But then we get into a "tree falls in a forest and no one is around to hear it" type of thing (also with the prior one). Acoustic vibrations is a fine definition, but so is "what one hears".

I'd more put the line closer to "Greeks" (and probably conscious). The reason for this is formalization, and I think this is a sufficient point where there's near universal agreement. In quotes because I'll accept any point in time that can qualify with the intended distinction, which is really hard to pin-point. I'm certainly not a historian nor remotely qualified to point to a reasonable time lol. But this also seems to be a point in history often referenced as being near "the birth" and frankly I'm more interested in other questions/topics than really getting to the bottom of this one. It also seems unprovable, and I'm okay with that. I'm not so certain it matters when that happened.

To clarify, I do not think life itself necessitates this type of formalization though. I'm unsure what conditions are necessary for this to happen (as an ML researcher I am concerned with this question though), but it does seem to be a natural consequence of a sufficient level of intelligence.

I'll put it this way, if we meet an alien creature I would be astonished if they did not have math. I have no reason to believe that their math would look remotely similar to ours, and I do think there would be difficulties in communicating, but if we both understand Poincare's meaning then it'll surely make that process easier.

Sorry, I know that was long and probably confusing. I just don't have a great answer. Certainly I don't know the answer either. So all I can give are some of my thoughts.

[0] https://www.inference.vc/causal-inference-3-counterfactuals/

[1] https://existentialcomics.com/comic/245


>Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so.

That's not unique, all quantitative theories allow small modifications. Then you select the parsimonious theory.


Coding is math in the sense that coding is a subset of math.

Mathematics is a very extensive field, and covers a vast amount of subjects.

For the same reason it can't be said that mathematics is equivalent to coding, as there are many things in mathematics that are not relevant for coding.

However, by far the most interesting parts of coding are definitely related to mathematics.


Sure, lambda calculus is math. To call assembly or typical imperative C math, at least in the same sense, is a bit of a stretch.


Working with different objects doesn't make it any less math. Just because you can derive calculus from set theory (analogous to assembly or even binary here) doesn't make calculus "not math".

Math is about abstractions and relations. See the Poincare quote again.

Plus, the Programming Languages people would like to have a word with you. Two actually: Category Theory. But really, if you get them started they won't shut up. That's either a great time or a terrible time, but I think for most it is the latter.


Wheeler: It from bit. Otherwise put, every it—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.


That is not in contention with what I said, now is it?

Wheeler is arguing that if a tree falls in the forest, and there is nobody to hear it, that it still makes a sound because there are things that interact with the sound. But if the tree fell in a forest and there was nothing else in the universe then there is no sound because there is no observation.

It helps to read the whole thing[0] and to understand the context of the discussion. This is meta-physics and a deep discussion into what the nature of reality is. Ian Hacking has a good introduction to the subject, but I find people develop grave misunderstandings when they do not have the strong math and physics background necessary to parse the words. Even people who understand the silliness of "The Secret" and that an observer need not be human often believe that this necessitates a multi-verse. A wildly convoluted solution to the problem of entropy not being invertible. Or closer to computer terms, a solution that insists that P = NP. There is information lost.

If you wanna argue that there's no difference between the word cup and a cup itself because there is no word without the observer who has the language, then yeah.

[0] https://historyofinformation.com/detail.php?id=5041


If nature is information theoretic, then physics is mathematics.


But that doesn't necessarily mean successful programmers are good at conventional math. This is why certain people in the department are identified as "math people".


I'm not sure why you'd think I disagree. It seems you understood that I argued it's unhelpful to make the distinction between math and "conventional" math.

But I'll refer you to a longer conversation if it helps https://news.ycombinator.com/item?id=43872687


> Coding IS math.

> Not "coding uses math", I mean it is math.

Arguably writing a novel is math, if you use the right definition of math. But sometimes it's more helpful to use more informal definitions that capture what people mean than what is technically accurate.


By that same logic you could also say that language is math. In fact I think your quote kind of disproves your point, because the content/state of a program is super important in coding, more so than the form.

Coding used to be very close to pure math (many early computer science classes were taught in the Math Department in universities) but it has been so far abstracted from that to the point that it is its own thing and is as close to math as any other subject is.


  > By that same logic you could also say that language is math
Not quite, but the inverse is true. The language-to-math direction doesn't work because of a lack of formalism. I can state incomprehensible sentences or words. (There's an advantage to that in some cases!) But when you do that with code you get errors, and even if you do it with math it's just that there's no compiler or interpreter that yells at you.


>I can state incomprehensible sentences or words.

since you can express paradoxes with math, perhaps not that different.


I think you misunderstand what "paradox" means. While it can mean "self-contradictory" it can also mean "contrary to one's expectation." Math uses both, but in very different contexts.

The contradiction is used in proof formulation, specifically to invalidate some claim. I don't think this is what you're implying.

The latter is what it contextually sounds like you're stating; things like the Banach-Tarski Paradox. There's no self-contradiction in that, but it is an unexpected result and points to the need to refine certain things like ZFC set theory.

I'd also stress that there are true statements which cannot be proven through axiomatic systems. The Halting Problem is an example of what Godel proved. But that's not contradictory, even if unexpected or frustrating.


  > forming relationships between objects

This is vague and doesn't mean anything. People can't even agree what 'objects' are and people did a lot of programming before the term 'object' was invented.

Programming is fundamentally about instructions and data. Yin and yang of two things that are completely different, even if people can occasionally mix them and conflate them.


Category theory is often jokingly called the study of dots and arrows. What's a dot? Anything. What's an arrow? A relationship.

I'm surprised I hit a nerve with so many people. I'm quoting someone who's considered one of the greatest mathematicians. Obviously I don't know the backgrounds of people, but it seems like programmers have strong opinions on what math is that disagree with what mathematicians say they do.

https://en.wikipedia.org/wiki/Category_theory


Now it's not math, it's 'category theory', and that's dots and arrows, and dots are anything.

At what point does the abstract description just not offer any useful insight anymore?

  > I'm surprised I hit a nerve with so many people.

A lot of people had very good explanations for why you're pretty far off in trying to say two things are the same.

Programming is much more like building a machine than any sort of straight math. There is state, there is interactivity, there is performance, there is IO and there are real world implications to all of it.

Saying they are the same is like saying gardening is plumbing just because you sometimes use a hose.


  > [No,] it's not math it's 'category theory' 
That's a wild claim considering Category Theory is a branch of mathematics

  | Category theory is a general theory of mathematical structures and their relations.
  - https://en.wikipedia.org/wiki/Category_theory
It is necessary that you provide an alternative definition as to what "category theory" is, though I suspect it will make many category theorists and mathematicians upset.

  > A lot of people had
A lot of non-mathematicians disagreed with mathematicians. https://news.ycombinator.com/item?id=43882197


  > That's a wild claim considering Category Theory is a branch of mathematics

It's not a wild claim since you misquoted me.

  > A lot of non-mathematicians disagreed with mathematicians.

Mathematicians can claim whatever they want; when it comes to programming, programmers understand it better and they're trying to explain to you why this is nonsense. Vanderbilt claims to be "the Harvard of the south" but wouldn't you know it, Harvard doesn't claim to be "the Vanderbilt of the north".

Show me programming languages designed by mathematicians to be 'mathematically pure' and I'll show you a language that hasn't been used to ship software that people want to use.


It's math the way everything is physics, as well as the way physics is philosophy (as the natural sciences started out as branches of philosophy).

In other words: It's math in a sense that for most of us with a computer science background is often not very relevant to how we work.


>Coding IS math.

No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations, which account for a lot of programming work. Sure, the underlying machine code is all based on math, but the higher level programming doesn't need to involve a single math equation for it to be useful. Let's see where the goalposts move now...


Mathematics is not "equations". Most of mathematics is not related to calculus either.

For anything CRUD, you put the data into a database. The underlying database is relational, or a key-value store.

If it's relational, that's one branch of mathematics. If it's another kind of database, it's another branch of mathematics. Mathematics is extensive, and covers more things you can imagine at a glance.

The main difference between writing mathematics and programming, and this applies to any form of programming, is that in mathematics writing a formal proof is amazingly close to: you write the program, and you are also the compiler, and you are the CPU, performing all operations yourself. With programming you only have to write the program.

Source: On one hand I have studied pure mathematics (not the simplified applied mathematics that are taught in engineering, which is mostly equations), on the other hand I have been working as a software developer for over 15 years.


  > It doesn't take math to perform CRUD operation
Yes it does. Just because the objects you are working with aren't numbers doesn't mean it isn't math. In fact, that's my entire point. It is why I quoted Poincare in the first place. He didn't say "numbers" he said "objects".


Sorry but you are wrong. "CREATE, READ, UPDATE, DELETE" operations are not mathematical in nature. It doesn't matter what the data is or how esoteric you want to get with it - a programmer doesn't need any math at all for simple CRUD. You're trying to move the goalposts all the way back to the 1800s to win a pointless internet argument.


A while back I made this[0] as a quick demo of generic batching for CRUD requests. It's very much practically oriented (it's for performance optimization while making the code nicely reusable/not muddying up the business logic), but it also felt quite a bit like the same sort of things I did in my math degree.

Actually I'm starting to wonder whether the thing that made university math easy for me was that I quickly developed a good internal "type checker"/"type inference engine" in my head, and that helped make the next steps of proofs seem straightforward.

[0] https://gist.github.com/ndriscoll/881c4f5f0398039a3a74543450...
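For anyone curious what "generic batching" can look like, here is a hypothetical sketch (not the linked gist; the types and `fetchMany` are assumptions for illustration): collect independent lookups, issue one bulk fetch, then fan the results back out to each request.

  object BatchingSketch {
    final case class UserId(value: Int)
    final case class UserRow(id: UserId, name: String)

    // Stand-in for one round trip that serves many keys at once.
    def fetchMany(ids: Set[UserId]): Map[UserId, UserRow] =
      ids.map(id => id -> UserRow(id, s"user-${id.value}")).toMap

    // Batch N independent reads into one call, preserving per-request results.
    def batched(requests: Seq[UserId]): Seq[Option[UserRow]] = {
      val rows = fetchMany(requests.toSet)
      requests.map(rows.get)
    }

    def main(args: Array[String]): Unit =
      println(batched(Seq(UserId(1), UserId(2), UserId(1))))
  }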


I understand you have strong opinions, I just don't understand why.

In math we are highly concerned with structures and abstraction. We have things called operators. They aren't just addition and multiplication; we also use those words to describe different operations. We have things like groups, rings, fields, and algebras. Yes, plural.

The purpose of these things is to create logical frameworks. It matters not what the operators are. Nor does it matter what objects we operate on. Poincaré is explicitly saying this.

The part you're not understanding is the abstraction. This is what math is about. It is also why the Programming Language people are deeper in the math side and love Category Theory (I have a few friends whose PL dissertations are more math heavy than some math dissertations I've seen). It's no surprise. What's a function? How do you abstract functions? How do you define structure? These are shared critical questions. PL people are more concerned with types but I wouldn't say that makes them any less of a mathematician than a set theorist.

We can perfectly describe the CRUD operations with set theory. Do most programmers need concern themselves with this? Absolutely not. But is it something the people designing those operations and systems are thinking about? Yes.

I'd encourage you to learn some set theory, abstract algebra, and maybe a bit of cat theory. It'll make these things pretty clear. But I'd caution about having strong opinions on topics you aren't intimately familiar with. Especially when arguing with those that are. Frankly, CRUD is a terrible example. I'm confident a quick google search (or asking a GPT) would quickly point you to relational algebra. It's discussed in most database classes. It certainly was in the one I was an assistant for.


> I understand you have strong opinions, I just don't understand why.

As someone that is "on the other side of the fence", and getting flamed for it, maybe I can shed some light since I have the other perspective as well. IMO, the reason for not seeing eye to eye is (for example) akin to saying "word problems are math". (Thinking of grade school word problems for common reference). Yes, they are readily mapped to mathematical models that can solve the word problem & perhaps almost indistinguishably so. Though no - word problems are not math. Word problems are a series of phrases and words. That's where the talking past each other comes in... Different interpretations of "word problems are math", or "code is math". It's seemingly not clear whether we are talking about logical 'implies', 'element of', or 'equals'.

Which goes to "We can perfectly describe the CRUD operations with set theory.", we all agree there. That is not readily conveyed though when writing things like "code is math".


  > That's where the talking past each other comes in...
Then sorry, but that's your own damn fault. I was clear about my definition and quoted a famous mathematician to give some authority, to not be "trust me even though I'm some rando". The way to respond to that is not "you're wrong, trust me, I'm some rando".

Yes, I agree we're misunderstanding each other because we're using different definitions, but "you've" rejected "mine" without defining "yours" while expecting everyone to understand. Of course that'll lead to confusion. You can reject the mathematicians' definition of math, but you sure gotta say more than "trust me", and it's a pretty wild thing to do, especially as non-mathematicians.

The problem here is one side who's dabbled in cooking says "a chef makes desserts" and chefs are responding "we do a lot more than that". Maybe there's a few chefs that go "yeah all I do is dessert" and everyone points to that while ignoring that the second part of their sentence is "but that's just my speciality." Wouldn't you think that conversation is insane? The only reason it's obviously so is because we all know what a chef is and agree on the definition. But who is better qualified to define the chef's job? The chef or the consumer?


The way I'm responding I'd more characterize as: "wait, if what you are saying is true, then this other thing should be true too, but it does not seem to be. That would indicate what you are saying is not true."

In another thread, you characterized my response as stating: " ¬(A ↦ B) ⟹ ¬(B ↦ A)" (and this is a great example of language not being math, but math being language!). That was not at all my claim.

My claim is "I believe you are saying 'A = B'. It appears that 'B != A', therefore 'A != B'." My only claims are

(1) I believe you are writing to convey that you mean Math IS Language in the sense they are equal, identical, interchangeable, and fully equivalent, and bi-directionally so

(2) that: B != A

The only results can either be:

- "yeah, because B != A, the statement A = B is not true"

- Your claim (1) is false, I'm not actually saying "A = B"

- Your claim (2) is false, "B = A" is in fact true. I would find that to be an interesting assertion and would have loved to explore more why you think that.


> word problems

That is a good analogy, except programming languages are formal languages, not natural languages.

> they are readily mapped to mathematical models

With code we are not talking about something that is mapped to mathematical models. Code is not modelled by mathematics, it is defined by mathematics (denotational semantics and operational semantics).

"Code is math" is true in a theoretical sense. Practically, coding doesn't always feel like what is commonly thought of as "doing mathematics", but it is essentially mathematical.

https://en.wikipedia.org/wiki/Formal_language

https://en.wikipedia.org/wiki/Semantics_(computer_science)


You could say that math is behind the structure of a leaf, but the farmer doesn't care about that. Keep moving the goalpost all you want, I don't have time to argue these pointless things, and I stopped reading after the first sentence of your comment. I'm done here.


You seem really angry at the field of Computer Science for some reason.

We are talking about textbook CS fundamentals.

https://en.wikipedia.org/wiki/Semantics_(computer_science)


CS isn't programming, it doesn't write code.


CREATE, READ, UPDATE, DELETE are fundamentally mathematical in nature.

In a typical implementation these are database operations. That involves relational algebra operations, state transitions, boolean logic.

The READ part can be a very complex SQL query (composed of many algebraic operations) but even the simplest query (SELECT * from t where ID = 1) is filtering a set based on a predicate. That is mathematics.

No one is moving goalposts. Set theory and logic are at the foundations of mathematics.
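The same selection written as a predicate filter over a collection, mirroring both the set-builder form {x | x.id = 1} and the SQL above (a hedged sketch; the Row type is illustrative):

  object SelectionSketch {
    final case class Row(id: Int, name: String)

    val t: Set[Row] = Set(Row(1, "a"), Row(2, "b"))

    // SELECT * FROM t WHERE id = 1, i.e. { x in t | x.id = 1 }
    val selected: Set[Row] = t.filter(_.id == 1)

    def main(args: Array[String]): Unit =
      println(selected) // Set(Row(1,a))
  }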


It's almost the argument of programming vs computer science coming out here.

This is math:

{x | x.id = 1}

OTOH, a SQL query is a SQL query.

This thread is hilarious though. It's like

- Cashier: here is your change.

- Customer: you did math!

- Cashier: no, I gave you change.

- Customer: that IS math!

- Cashier: You mean, I used math to give you change?

- Customer: No, giving change doesn't use math, it IS math!!!!" [2]

= D

Moving along.. FWIW, indeed SQL was created to model set theory (relational algebra and tuple calculus); the close relationship is no accident of course [0][1].

> No one is moving goalposts

I feel too they are.

First goal post:

> Coding IS math.

>> No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations

Second goal post:

> CREATE, READ, UPDATE, DELETE are fundamentally mathematical in nature

CRUD is closely related to SQL, and SQL is closely related to various mathematics. Are they identical and therefore equivalent? No - because your database is not going to like it when you write "{x | x.id = 1}", and the Oracle DB might not like something that you can write for your Postgres DB.

[0] https://simpleprogrammer.com/mastering-sql/

[1] https://en.wikipedia.org/wiki/SQL

[2] To quote: """Not "coding uses math", I mean it is math""" @ https://news.ycombinator.com/item?id=43872771


The problem you're running into is that people who have some "mathematical maturity" don't get bogged down in notation, so it's difficult for them to see the distinction you're trying to draw between e.g. `{ x∈S | x.id = 1}` and `select x from S where x.id = 1`[0]. You say "a SQL query is a SQL query" and they just think "yes, which is also obviously a mathematical expression".

Computer programs are proofs[1]. This is intuitively and also formally true. You would agree writing proofs is doing math, yeah? Then obviously writing a computer program is also doing math.

Like I have a degree in math and have been a software engineer for over a decade. I do not know what distinction people are trying to get at. It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.
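A minimal propositions-as-types sketch of the "programs are proofs" point (the names are illustrative): each signature states a proposition and the body is its proof.

  object ProofsSketch {
    // "From A, and A implies B, conclude B" (modus ponens is function application).
    def modusPonens[A, B](a: A, f: A => B): B = f(a)

    // "A and B implies B and A" (conjunction commutes; pairs swap).
    def andComm[A, B](p: (A, B)): (B, A) = (p._2, p._1)

    // Transitivity of implication is just function composition.
    def trans[A, B, C](f: A => B, g: B => C): A => C = a => g(f(a))
  }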

[0] Modulo details like NULLs and multi-set semantics, but surely that's not the distinction?

[1] Up to isomorphism


A better analogy is if someone got upset that someone else said "the set of natural numbers is the set of real numbers." One is a subset of course, and when that is highlighted, the response is "yeah, by 'is', I actually mean subset", therefore indeed: the set of natural numbers is the set of real numbers.

This is an interesting example: "Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not."

The first part is everything we need to look at. Are we saying that writing C is equivalent and equal to the entirety of all programming? That if you're programming then you are writing C code. No, there is an implied "is a form of" in there. Given the other clarifications and that so many people are claiming to be mathematicians, I would have expected the precision to say exactly "C is a form of programming" rather than "C is programming."

Turns out, the analogy of saying "the set of reals is the set of naturals" is more fitting compared to sets that are actually equal.


Code is logical in nature and is defined by mathematics.


I'd agree code is usually governed by mathematics, not defined by it though.

Goes back to this ridiculous proposition:

- Cashier: You mean, I used math to give you change?

- Customer: No, giving change doesn't use math, it IS math!!!!" [2]

The proposition is that "code IS math", not defined by, not uses, not inspired by, not relies on, not modeled after, but IS.


Just because what you're working on specifically is an equivalent of middle-school math doesn't tell us much about the field as a whole.

Though middle-school or not, it's still math.


All of that code is a series of logical statements and expressions. Mathematical logic.

But the CRUD logic is so basic and boring, so obvious, that it doesn't require any thought.


>All of that code is a series of logical statements and expressions. Mathematical logic.

Which code? The machine code that underlies everything? Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function? Not all programming has to be mathematical or even logical at a high enough level, and I say this as someone who's been coding assembly language for 40 years.


> Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function?

Those lines ARE mathematical logical statements.

Each line defines logical operations that are executed by the computer.

Same for high level or low level programming. It's all logic.


Storing a string in some abstraction is not a mathematical operation. I'm done with this thread, it's going way too far down too many rabbit holes. The quarks that make up the matter that the computers are made of are pure "math". There, now I've moved the goalposts.


That is a narrow perspective of mathematics and computer science.

Assigning a variable is a mathematical operation. It is a logical state transition. This is formalized in Programming Language Theory.

We are not talking about quarks. We are talking about lines of code which are objectively mathematical statements, no matter how non-mathematical they seem to you subjectively.

You start with a practical problem to solve, and a computing machine that can perform logical operations on data. Your job is to figure out a correct sequence of logical operations that will solve the problem. The program that you have written is a mathematical structure. It is mathematics.

https://en.wikipedia.org/wiki/Semantics_(computer_science)
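As a small, hedged sketch of that claim (toy names, nothing more): model the store as a map from variable names to values, and each assignment as a pure function from one store to the next, which is essentially the denotational-semantics view.

  object StateTransitionSketch {
    type Store = Map[String, Int]

    // An assignment "x := e" denotes a state transition Store => Store.
    def assign(name: String, value: Store => Int): Store => Store =
      store => store + (name -> value(store))

    def main(args: Array[String]): Unit = {
      val program: Store => Store =
        assign("x", _ => 1) andThen assign("y", s => s("x") + 2)
      println(program(Map.empty)) // Map(x -> 1, y -> 3)
    }
  }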


You confuse the map with the territory. By the same logic you can say quarks are mathematics, because they are modeled by a theory that has some mathematics in it.


Our mathematical models of reality (i.e. Physics) are not the same as reality. So yes, by the same logic "quarks" are mathematics, but only in the same way that a "cup" is English. The quark was a great example, considering we can't observe it and rely purely on mathematical models, but the same argument would still hold true for a bowling ball. Physics is our description of reality, not reality itself. Our description of reality highly leverages the language of math, as that language was developed to enforce rules of consistency. Much the same way we design our programming languages, which is why programming language people study so much more math than your typical CS person.

If you're going to accuse someone of confusing the map with the territory, you really should make sure you aren't making the same error.


How does math help with programming languages? What does math say about zero-based indexes? How do you prevent off-by-one errors? How do you prevent buffer overflows? These are ergonomics problems.


It is hard to answer because of exactly what ndriscoll said[0]

  > It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.
As ndriscoll suggests, it is tautological. I mean look at what I said. I really need you to hear it. I said that coding is math. So what I hear is "How do programming languages help with programming languages?" Why are you expecting me to hear anything different?

  > What does math say about zero-based indexes?
Start at 0? Start at 1? Who cares, it is the same thing. The natural numbers, non-negative integers, integers, even integers, who cares? They're the same thing. And who cares about indexing at 0 or 1 in programming? That's always been a silly argument that's inconsequential.

  > How do you prevent off by one errors?
By not being off by one? What's the question? Like being confused about if you start at 0 or start at 1 and how to get the right bound? It is a shift from one to the other, but they are isomorphic. We can perfectly map. But I really don't get the question. You can formalize these relationships with equations you know. I know it isn't "cool" but you can grab a pen and paper (or a whiteboard) and write down your program structure if you are often falling for these mistakes. This seems more about the difficulties of keeping track of a lot of things in your head all at once.

  > How do you prevent buffer overflows?
By not going over your bounds? I'm so confused. I mean you are asking something like "if f(x) = inf when x > 10, how does math help you prevent the output of the function from being infinite?"

Maybe what will help is seeing what some of the Programming Languages people do and why they like Haskell[1].

Or maybe check out Bartosz Milewski[2,3]. His blog[2] is titled "Bartosz Milewski's Programming Cafe: Category Theory, Haskell, Concurrency, C++". It may look very mathy, and you'd be right(!), but it is all about programming! Go check out his Category Theory Course[3], it is _for programmers_.

Don't trust me, go look at papers published in programming language conferences [4]. You'll find plenty of papers that are VERY mathy as well as plenty that are not. It really depends on the topic and what is the best language for the problems they're solving. But you'll certainly find some of the answers you're looking for.

Seriously, don't trust me, verify these things yourself. Google them. Ask an LLM. I don't know what to tell you because these are verifiable things (i.e. my claims are falsifiable!). The only thing you need to do is look.

[0] https://news.ycombinator.com/item?id=43882197

[1] https://excessivelyadequate.com/posts/isomorphisms.html

[2] https://bartoszmilewski.com/

[3] https://www.youtube.com/watch?v=I8LbkfSSR58&list=PLbgaMIhjbm...

[4] https://www.sigplan.org/Conferences/


Nonmathematical problems are difficult to answer when you try to find mathematical answers. It should be obvious why it doesn't work.

Mentioning Scala is ironic; it's very light on math speak and to begin with was created to unify object-oriented with functional programming, which is mathematically meaningless, because both are Turing complete and thus equivalent, tautological.

>So what I hear is "How do programming languages help with programming languages?"

Oh, right, in mathematics an axiom is an argument, but in reality it isn't. In programming you should assert what you assume, otherwise your assumptions can be wrong due to divergence from reality, but there is no reality in mathematics, only fantasy, so you can't understand this with mathematics alone.


> but there is no reality in mathematics, only fantasy, so you can't understand this with mathematics alone

No. Code is an abstraction. It exists as a logical structure, grounded in mathematical logic.

https://news.ycombinator.com/item?id=43888917


That may be true to some extent, but I think you are missing the point. Quarks are physical in nature, and code is logical in nature. Programs themselves are formal systems. Code isn’t just modelled by mathematics, it is defined by mathematics.

In the case of code, we can argue that the map is the territory.


Code is logical if you define logic as reasoning in general, broader than mathematics, and since it runs in a physical environment, it now interacts with a messy world. Code is defined by business requirements, and there's no mathematics for that.


Now you’re talking about the human activity of writing code, not the code itself.

Those business requirements are inputs to the process of writing the code.

Once the code is actually written, that exists as a formal logical system, defined by mathematics, not business requirements.


The discussion started about whether a human needs math skills to write code. That's what I mean when I say programming isn't mathematics. The meaning of code is defined by a human; how do you intend code to be defined by mathematics? The human first imagines mathematical formulas, then by some unknown process they become code? I don't think anybody does it like that. You start with requirements, intuitively guess the architecture, then decompose it; it works more like a Cauchy problem (but less numeric, more conceptual: you have an owl, now draw it), but I don't think anybody models it like that.

>Once the code is actually written, that exists as a formal logical system, defined by mathematics

I still think that's not code, but your favorite model of code. For a spellchecker, language is defined by mathematics too: it splits text into words by whitespace, then for each word not found in the dictionary it selects the best matches and sorts them by relevance. Oh, and characters are stored as numbers.


> The discussion started about whether a human needs math skills to write code.

Writing code IS a math skill. When writing code you are writing logic in a formal system. Logic is mathematics.

You may be thinking that mathematics is just like doing arithmetic or solving equations. It is way deeper and broader than that.

> I still think that's not code, but your favourite model of code

Code is not just modelled through mathematics, it is actually defined by mathematics. It is fundamentally a mathematical construct, grounded in formal semantics.

https://en.wikipedia.org/wiki/Semantics_(computer_science)


Is that because mathematics doesn't need to match reality, so it's happy being ignorant about modelling? Anything you think is automatically true.


You are missing the point.

Code is not modelled mathematically, it is defined mathematically.

It exists as an abstraction which is fully defined by operational semantics and denotational semantics, not modelled or approximated.

In the counter example of a quark, that exists in nature and is modelled by mathematics, but not defined by mathematics.

https://en.wikipedia.org/wiki/Formal_language

https://en.wikipedia.org/wiki/Semantics_(computer_science)


It's definitely a gray area. Is a DAG traversal algo "math", or is it more computer-sciencey? What if you do it in SQL? Certainly there's a mix of more or less concentrated logic/math vs glue code, and most of that is very dependent on the domain you're working in.

I find this distinction useful in the abstract, that one can engage different parts of the brain for different components of development. This probably explains why a well-written DSL can be so powerful in the right context.
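For reference, here is what a bare-bones DAG traversal (depth-first, reverse postorder, i.e. a topological sort) looks like as a sketch, whichever side of the math/CS line one files it under; the toy graph is an assumption for illustration.

  object DagSketch {
    // Depth-first traversal producing a topological order of a DAG
    // given as an adjacency map.
    def topoSort[A](graph: Map[A, List[A]]): List[A] = {
      var visited = Set.empty[A]
      var order = List.empty[A]
      def visit(node: A): Unit =
        if (!visited(node)) {
          visited += node
          graph.getOrElse(node, Nil).foreach(visit)
          order = node :: order // prepend after children: reverse postorder
        }
      graph.keys.foreach(visit)
      order
    }

    def main(args: Array[String]): Unit =
      println(topoSort(Map("a" -> List("b", "c"), "b" -> List("c"), "c" -> Nil)))
      // List(a, b, c)
  }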


> Is a DAG traversal algo "math", or is it more computer-sciencey?

Firstly, computer science is math.

Secondly, I remember covering graphs in a discrete math course back when I was in college.

> What if you do it in SQL?

SQL is more-or-less a poor implementation of relational algebra. Ie, math.


Computer science is math in the same way that physics is philosophy, in that computer science certainly started as math, just like the natural sciences used to be subdisciplines of philosophy.

But it's hardly a useful grouping any more. You can study and do well in computer science with minimal knowledge of most of the core mathematical subjects.

While graph theory certainly crosses over into math, you can cover most of the parts of it relevant to most computer science as a discussion of algorithms that would not be the natural way of dealing with them for most mathematicians.


Is it possible your mental model of CS is more aligned with software engineering than with actual CS? Could you share some examples of what you consider to be CS but lacks any mathematical relation?

I agree it's not a useful grouping in practice. I'm just interested in what makes you think like you do.


I categorically did not claim, nor even suggest, that any CS "lacks any mathematical relation".

What I claimed was that in computer science we often discuss things in terms that would not be the natural way of dealing with it in maths. We do that because our focus is different, and our abstractions are different.

It doesn't mean it's not math. It means it's not useful to insist that it isn't a different field, and it's obtuse when people insist it's all the same.


Got it, thanks for the reply


> You can study and do well in computer science with minimal knowledge of most of the core mathematical subjects.

You will fail at Theoretical Computer Science without mathematical proficiency. Go read some textbooks and papers in theoretical CS. It is a subfield of mathematics. Theorems and proofs. Rigorous and difficult mathematics.

https://en.wikipedia.org/wiki/Theoretical_computer_science


I've read plenty of theoretical computer science papers over the last 30+ years, and while some of it requires "rigorous and difficult mathematics" that is by no means universal.

I wrote my MSc thesis on the use of statistical methods for reducing error rates for OCR, and most of the papers in my literature review hardly required more than basic arithmetic and algebra.

So I stand by my statement.

Sure, there are subsets of computer science where you need more maths, just like in any field there are sub fields where you will need to understand other subjects as well, but that does not alter what I claimed.

EDIT:

Some authors are quicker to pull out the maths than others, and frankly in a lot of CS papers maths is used to obscure a lack of rigor rather than to provide it. E.g. the problem I ran across when writing my thesis was that once you unpacked the limited math into code you'd often reveal unstated assumptions that were less than obvious if you just read their formulas.


Now we're getting into the "define maths" part of the discussion which is always where these discussions die. It can be argued that turning a kettle on and boiling some water is "maths" or it can be as narrow as "everything above basic arithmetic is logic, not maths."

So how much of programming is maths? Before we answer that, let's answer: How much of maths is actually maths? Because first we define maths, and then we define programming based on whatever that is, but until we have that first concrete definition this discussion cannot occur.

I will add that "it is taught by the maths department in college" is a flimsy argument, and frankly one the Physics department in particular would mock.


I think for this discussion, "math is stuff you'd learn in a math department" is a pretty useful definition, even if it's not a very good one. There's a lot of math involved in the design and manufacture of the kettle, electrical grid, and water utilities, but a person's ability to put a kettle on isn't going to be improved by math classes. In that way, programming probably is a bit mathy, but good programming is more like good technical writing than it is like math.


> I think for this discussion, "math is stuff you'd learn in a math department" is a pretty useful definition

That means that definition shifts over time. For example, courses on numerical analysis, graph algorithms, programming, and on compilers used to be part of “what you’d learn in a math department”.

It likely also even today will show geographical variation.


The overlap between programming and math is quite small, e.g. the halting problem is solvable in programming (by flow analysis) but isn't solvable in math. Programming deals only with practical problems and can be completely guided by practical considerations, while math requires abstract outlandish skills - the exact opposite. Why talk about math at all if it's already well known that programming is engineering?


Probability is math, and there you have caching, hashing, etc. Then there are permutations, combinations, etc., with a lot of use cases in software. Distributing graphs across nodes? More math.
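
For a concrete example of where the probability shows up: the chance that a hash table or cache sees at least one collision is just the birthday bound. A rough sketch in Python (the function name and the numbers are mine, purely for illustration):

  import math

  def collision_probability(num_keys, table_size):
      # Birthday bound: P(at least one collision) = 1 - prod_{i=0..k-1} (N - i) / N
      p_none = 1.0
      for i in range(num_keys):
          p_none *= (table_size - i) / table_size
      return 1.0 - p_none

  # Even a nearly empty table collides surprisingly often:
  print(collision_probability(1000, 2**20))       # ~0.38 at roughly 0.1% load
  print(1 - math.exp(-1000**2 / (2 * 2**20)))     # the usual approximation, also ~0.38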


State machines too
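
Right, and a state machine is nothing more than a function from (state, event) pairs to states. A minimal sketch in Python (the states and events here are invented for illustration):

  # Transition relation as a plain dict: (state, event) -> next state.
  TRANSITIONS = {
      ("idle", "start"): "running",
      ("running", "pause"): "paused",
      ("paused", "start"): "running",
      ("running", "stop"): "idle",
  }

  def step(state, event):
      # Undefined transitions leave the state unchanged.
      return TRANSITIONS.get((state, event), state)

  state = "idle"
  for event in ["start", "pause", "start", "stop"]:
      state = step(state, event)
  print(state)  # idle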


Boolean logic and set theory are very important to computer logic, and graph theory is not far behind. But you can also learn Boolean logic in the philosophy dept. Which I unfortunately learned the hard way by taking Philosophy 101 after already having taken the CS class on logic. It took a semester to go over what we'd covered for the midterm in the CS class. Got a lot of naps that semester.


The difficulty is in how many relationships you need to keep in mind, not in how hard each of them is.

Just like in math.

BTW relational DBs are math.
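
They really are: relational algebra is just set theory plus a handful of operators. A toy sketch in Python (table contents invented for illustration), with selection and a natural join written as plain set comprehensions:

  # Relations as sets of tuples.
  employees = {("alice", "eng"), ("bob", "sales"), ("carol", "eng")}
  managers  = {("eng", "dave"), ("sales", "erin")}

  # Selection (sigma): employees in engineering.
  eng_only = {(name, dept) for (name, dept) in employees if dept == "eng"}

  # Natural join (bowtie) on the shared department attribute.
  joined = {(name, dept, mgr)
            for (name, dept) in employees
            for (d, mgr) in managers
            if dept == d}

  print(sorted(joined))
  # [('alice', 'eng', 'dave'), ('bob', 'sales', 'erin'), ('carol', 'eng', 'dave')]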


> BTW relational DBs are math.

It's funny, reading the post you're replying to, I basically read it as

> I don't need math, I need <math, but by another name>

My teenage daughter used to complain about math, and I spent some time trying to explain to her that we use math every day... EVERY day. Now, when I see her do something that was math (even if it's not obvious it was math), I say "Math... every day". I say that a lot.

Also, yes, my daughter finds me annoying. But also funny; but likely not for the math thing.


  > I don't need math, I need <math, but by another name>
This seems to be how it always goes. I think we've confused a lot of people by conflating math with arithmetic.

https://news.ycombinator.com/item?id=43872687


DBs aren't math. In math it's perfectly OK for a computation to take infinite time; you can just assume it has already completed. DBs don't work like that.


> Good code doesn’t just solve a problem, it solves it in a way that’s readable and modular.

Strongly disagree.

There are plenty of cases where non-modular code is more readable and faster than modular code (littered, presumably, with invocations of modular logic).

There are also countless cases, particularly in low-level languages or languages with manual memory management, where the best solution -- or the correct solution -- is far from readable.

Readability is in any case in the eye of the beholder (my code is readable. Isn't yours?) and takes a back seat to formal requirements.

Just as a matter of personal style, I prefer long, well-organized functions that are minimally modular. In my experience, context changes -- file boundaries, new function contexts, etc. -- are the single greatest source of complexity and bugs once one has shipped a piece of code and needs to maintain it or build on it. Modular code, by multiplying those, tends to obscure complexity and compound organizational problems by making them harder to reason about and fix. Longer functions certainly feel less readable initially, but I'd wager they produce better, clearer mental models, leading to better solutions.


I think the important concept that "readable and modular" is trying to get at is how easy it is to continue working on the code in the future. There are definitely codebases that are easier to work on than others, even when the domain is the same.

I'd say that readability, which often boils down to consistency, and modularity are ways to do this, but they aren't the only ways. And as you say, sometimes there's a need for "unreadable" code, so not everything can be easy.


Regarding mathematical

> concepts that are intrinsically complicated,

I'm not a mathematician, but I figure mathematicians aim for clean, composable abstractions the same way programmers do. Something complicated, not just complex in its interactions with other things, seems more useful as a bespoke tool (e.g. in a proof) than as a general purpose object?

> Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.

This is well put. I often wonder if a merely average working memory might be a benefit to (or at least may place a lower bound on the output quality of) a programmer tasked with writing maintainable code. You cannot possibly deliver working spaghetti if you can't recall what you wrote three minutes ago.

This is a baldly self-serving hypothesis.


> You cannot possibly write or deliver spaghetti, working or otherwise, if you're not capable of remembering what you wrote three minutes ago.

Forth programmers make a similar point. Forth is stack based; you typically use stack operations rather than local variables. This is ok when your 'words' (analogous to functions/procedures) have short and simple definitions, but code can quickly become unreadable if they don't. In this way, the language strongly nudges the programmer toward developing composable words with simple definitions.

(Of course, Forth sees little use today, and it hasn't won over the masses with its approach, but the broader point stands.)
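
A loose analogy in Python rather than Forth (the config-parsing example and all names are invented): the same pressure toward short, composable definitions shows up whenever each piece does one obvious thing.

  # Small, single-purpose definitions compose cleanly:
  def strip_comments(line):
      return line.split("#", 1)[0]

  def normalize(line):
      return strip_comments(line).strip().lower()

  def parse_config(text):
      return [normalize(line) for line in text.splitlines() if normalize(line)]

  print(parse_config("HOST = example.org  # prod\n\n  PORT = 8080"))
  # ['host = example.org', 'port = 8080']
  # One long definition doing all of this inline forces the reader to track
  # every intermediate value by hand -- the textual analogue of a deep stack.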


> Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.

... In my experience, learning to write one component at a time (and try the code, and make sure it works before proceeding) is itself a skill that many struggle to develop. Similarly for avoiding unnecessary dependencies between components. Oh, and also being able to analyze the problem and identify separable components.

One of the most frustrating things about teaching programming, for me, is the constant insistence from other teachers that you have to maintain an "absolutely everyone can learn to program" attitude at all times. Many people who start to learn programming have misguided or confused reasons for doing so and - to say the least - could make much more effective use of their time developing other skills. (It's not a question of elitism; I'd surely flounder at some tasks that others find natural.)


I dislike "everyone can learn to program" because it conflates many, many levels of skills and expertise.

I very much think many people could learn the more advanced Excel Formulas, Power Automate and even simple Bash/PowerShell scripting to make their work more effective. I've met quite a few folks who had been intimidated out of trying who could do it.

On the other hand, how many people on this site could bootstrap a linux kernel on either very new or very old hardware? I know there are some, but they are certainly not the majority. I certainly won't be the first person to get linux and doom to run on a quantum computer.

But that is similar to other professions. Everyone with a largely functioning body can learn to turn a few planks and some metal parts into a functional shed door with some basic tools or to put up a decent brick wall that won't topple over in a month.

That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.


> Everyone with a largely functioning body can learn to turn a few planks and some metal parts into a functional shed door

It's shocking how little physical and spatial ability some people have - that is definitely not true. Sometimes it might come down to people discounting themselves or lacking confidence, but this remains true regardless of the cause.


> "everyone can learn to program"

> That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.

Exactly!

I think statements like that are more concerned with philosophy than reality. Any discussion surrounding topics like this typically ends up being a discussion around definitions.

I believe the vast majority of human beings are capable of learning how to program in the most elementary sense of the word. That is, outside of severe disabilities or complete and utter inaccessibility to circumstances in which one could learn to program, I think the remaining population could learn to program to some degree. Obviously, not everyone will learn to program, for a near-infinite number of reasons.

I would argue it's like music. Anyone can make 'music.' Just make a sound -- any sound. The difference between noise and music is subjective. I would not argue that everyone could be the next Lovelace, Turing, Ritchie, Thompson, Torvalds, etc.

Now, for my jaded opinion, I think a lot of the "everyone can learn to program" talk does not come from a place of desire to share the gift of knowledge and joy of programming. I think it's more of a subtle way to encourage people to go into programming so that they may be hired by mega corps. In order to keep the Capitalist machine running. It's like the National Hockey League's slogan, "Hockey is for everyone." That is just a fancy way of saying, "everyone's money can be spent on the NHL."


I recall reading about a professor at a university who carefully designed his tests throughout the year so he could determine whether a student had grasped a certain programming topic. He also tracked effort, through hand-ins and such.

After collecting data for a few semesters, he concluded his students could be clearly divided into three categories: those who just "got" programming, those who understood it after working hard, and a small group that just didn't grasp it regardless of effort.


I guess it depends on what is meant by "everyone can learn to program". It's a bit like saying everyone can learn to write, or to do math, or play tennis, etc.

I'm sure everyone is capable of learning some basic level of programming, just as they are able to learn a basic (high school) level of any subject. However, not everyone is going to have the aptitude to take that to an advanced professional level, however hard they try. We're not all cut out to be artists, or writers, or doctors, or scientists, or developers, etc.


>"absolutely everyone can learn to program"

Personally I've always considered a solid grasp of algebra to be the minimum bar for being able to program, at least for anything that isn't utterly trivial. Being able to take a word problem and turn it into a system of equations and solve it is a pretty close analog to being able to take some sort of problem or business requirement and turn it into code.
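
To make the analogy concrete, here's a made-up word problem and its translation (a sketch in Python; the scenario, numbers, and function name are invented): "tickets cost $12 for adults and $8 for children; 40 tickets brought in $380 - how many of each?"

  # a + c = 40 and 12a + 8c = 380; substitute c = 40 - a:
  # 12a + 8(40 - a) = 380  =>  4a = 60  =>  a = 15, c = 25.
  def solve_tickets(total_tickets, total_revenue, adult_price, child_price):
      adults = (total_revenue - child_price * total_tickets) / (adult_price - child_price)
      children = total_tickets - adults
      return adults, children

  print(solve_tickets(40, 380, 12, 8))  # (15.0, 25.0)

Setting up the equations is the hard part; the code is just the same setup written down for a machine.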

And the sad truth is that a huge percentage of the population struggle with just arithmetic, let alone algebra.


I mean - not "absolutely everyone" can learn to do just about anything. There's a wide distribution of intelligence, aptitude, and desire among folks across all of the various things you might learn (not to mention learning disabilities, impairments, etc.).

I might be capable of learning advanced accounting, but that sounds like torture to me and I'll be damned if I'll ever take that on! I'm sure programming feels like that to a wide variety of people, and I don't see any need for us to try to pretend otherwise - outside of a bizarre ideological desire for equivalent outcomes from disparate groups.


What do you mean by math? Abstractions or calculations? The effort it takes me to do long division in my head matches the effort it takes me to follow some obtuse spaghetti code. See, I can get good and fast at long division in my head, but I may never understand the fundamental theorem of calculus. Some people are really good at mucking around garbage code (they have no choice, they get paid to), but what part of programming did they get good at? Obviously, some part of it, but nothing to write home about. Whenever I sense that I'm just getting practice at doing the equivalent of mental long division at work, that's when I always seek a new job. No amount of money is worth falling behind like that.


I'm thinking of "computation", "intuition", and "organization".

Computation is following an algorithm. e.g. long division or computing a derivative.

Intuition, AKA brilliance, is finding a non-obvious solution. Think "solving an NP problem without brute force"*. e.g. solving an integral (in a form that hasn't already been memorized) or discovering an interesting proof.

Organization is recording information in a way that a) is easy for you to recall later on (and get insights from) and b) is digestible by others**. e.g. explaining how to compute a derivative, solve an integral, or anything else.

Math, programming, and writing each require all skills. The kind of math taught in school (e.g. long division) and your boring jobs are primarily computation. I believe advanced math (e.g. calculus) is primarily intuition; it requires some organization because big theories are broken into smaller steps, but seems to mostly involve smart people "banging their head against the wall" to solve problems that are still quite unclear***.

Programming is primarily organization. It requires some intuition (I think this is why some people seemingly can't learn to code), but in contrast to math, most programs can be broken into many relatively-simple features. IMO implementing all the features and interactions between them without creating a buggy, verbose, and unmaintainable codebase is programming's real challenge.

Writing is also primarily organization, but finding interesting ideas requires intuition, and worldbuilding requires computation (even in fiction, there must be some coherence or people won't like your work).

> Some people are really good at mucking around garbage code (they have no choice, they get paid to), but what part of programming did they get good at? Obviously, some part of it, but nothing to write home about.

I agree that work you find boring should be avoided, and I also try to avoid it. But some people really seem to like working on esoteric code, and I think there are some skills (beyond computation) developed from it that even apply when working with good code. Building a mental model of a spaghetti codebase involves organization, and if the codebase uses "genius hacks", intuition. Moreover, the same techniques used to discern that two code segments in completely different locations are tightly coupled may also discern that two seemingly separate ideas have some connection, leading to an "intuitive" discovery. There's an MIT lecture somewhere that describes how a smart student found interesting work in a factory, and I think ended up optimizing the factory; the lesson was that you can gain some amount of knowledge and growth from pretty much any experience, and sometimes there's a lot of opportunity where you'd least expect it.

* Or maybe it is just brute force but people with this skill ("geniuses") do it very fast.

** These are kind of two separate skills but they're similar. Moreover, b) is more important because it's necessary for problems too large for one person to solve, and it implies a).

*** And whatever method solves these problems doesn't seem to be simplification, because many theories and proofs were initially written down in very obtuse form, then simplified later.


A lot of people in these types of threads make the same mistake: confusing what math is with branches of math, or rather, ways in which math is used. The way the education system is built certainly contributes to this.

I've always found the car metaphor to work very well for understanding this: a car is a machine that can transport itself from point A to B (some other rules apply). There are different types of cars, but you certainly haven't understood the definition if you say that something is not a car because it is not a Volvo, or because it doesn't look like a Ford, when it's clearly able to transport itself.

Math is the study of ideal objects and the way they behave or are related to each other. We have many branches of mathematics because people have invented so many objects and rules to play with them. Programming is nothing if not this very definition. The fact that you don't have to "use math" when programming is not really addressing the point; it's like saying a car is not a car because it has no discernible brand.


A declarative construct is made of relations, but imperative execution isn't; rather, it's a process in time, and time is not a thing in math.


Another misconception, I'd say.

"Time is not a thing in math" is not understanding what math is. Time is another ideal object following certain rules under a given domain. Programming is coming up with objects of different size, with different characteristics, with interact at different points in time, i.e. following certain rules.


It's very rare, IMO, that computational problems emerge fully formed & ready to be tackled like proofs.

Usually, even deciding what the problem is is partly an art; it requires an act of narrativization, shaping and forming concepts of origin, movement, and destination.

A good problem solver has a very wide range of abstract ideas, concepts, and concrete tools they can use to model and explain the problem, the solution, and the destination. Sometimes raw computational intellect can arrive at stunningly good proposals and can see brilliant paths through. But more often, my gut tells me it's about having a breadth of exposure to different techniques and tools, being someone who can see a vast number of ways to tackle a situation, being able to see the tradeoffs between approaches, and being able to weigh long- and short-term impacts.


    > It's very rare, IMO, that computational problems emerge fully formed & ready to be tackled like proofs.
In my generation, the perfect example is Python's Timsort. It is a modest improvement upon prior sorting algorithms, but it has come to dominate. And, frankly, in terms of computer science history, it was discovered very late. The paper was written in 1993, but the first major, high-impact open source implementation was not written until 2003. Ref: https://en.wikipedia.org/wiki/Timsort

It has been reimplemented in a wide variety of languages today. I look forward to the next iteration: WolfgangSort or FatimaSort or XiaomiSort or whatever.
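
For anyone curious about the core idea, here is a heavily simplified sketch in Python - not CPython's actual implementation, which adds minrun sizing, galloping mode, and a merge stack with balance invariants - just the "find the runs that already exist, then merge them" insight:

  def find_runs(a):
      # Split the input into maximal non-decreasing runs.
      runs, i = [], 0
      while i < len(a):
          j = i + 1
          while j < len(a) and a[j - 1] <= a[j]:
              j += 1
          runs.append(a[i:j])
          i = j
      return runs

  def merge(left, right):
      # Standard stable merge of two sorted lists.
      out, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              out.append(left[i]); i += 1
          else:
              out.append(right[j]); j += 1
      return out + left[i:] + right[j:]

  def simplified_timsort(a):
      runs = find_runs(a)
      while len(runs) > 1:
          runs = [merge(runs[k], runs[k + 1]) if k + 1 < len(runs) else runs[k]
                  for k in range(0, len(runs), 2)]
      return runs[0] if runs else []

  print(simplified_timsort([3, 4, 5, 1, 2, 8, 9, 0]))  # [0, 1, 2, 3, 4, 5, 8, 9]

On data that is already mostly sorted, the run detection does most of the work for free, which is exactly what real-world inputs tend to look like.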


A capital example of an exception that proves the rule.

I absolutely value & have huge respect for the deeply computational works that advance us all along!

But this is an exceedingly rare event. Most development is more glue work than advancing computational fundamentals. Very, very, very little of the industry is paid to work on honing data structures so generally.


+1. Excellent description of how the skills relate. I often read or listen to poetry when stuck on a math or programming problem. ... or just talk to the rubber duck. :-)


> Good code

"Good code" is very subjective. Even readability and modularity can be taken too far.


The problem with doing things right the first time is that some people look at it and just say, “well, of course it should work that way.” Yes, but did you think of doing it that way?


Organization may also benefit from spatial skills.



