YourLanguageSucks (theory.org)
57 points by curtis on July 25, 2019 | hide | past | favorite | 80 comments


The site lists Microsoft being the de facto contributor to the C# language spec as a negative. Honestly I think Microsoft has been an excellent steward of the language, and their opinionated choices over the years have helped keep C# succinct while also capable and modern.

I'm a bit of a fanboy, but when you compare it to the language that has been around longer and is a direct competitor (Java), C# really blows it out of the water for most common programming tasks (async, operator overloading, pair/triple returns in C# 7, var keyword, superior generics, to name a few.)


Yup, C# is definitely in the king position for application programming right now. Between .NET Core, ASP.NET and Xamarin you can target absolutely everything. The main impediment to better quality software is devs raised on Linux who still have an irrational aversion to Microsoft while thinking Google is either competent or benevolent.


I was very disheartened by the negative reaction of the Linux community to attempts to bring C#/Xamarin to Linux. If there was anything the Linux desktop really needed, it was simple ways to write GUIs for Linux, and semi-cross-platform GUIs. Both of which Xamarin could provide.


Agreed. And:

> C# sucks because: You can't assign a new value inside a foreach loop (e.g. foreach(int i in vec) { i = i+1; }).

LOL


You can't do this in a lot of languages. Python comes to mind too.
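For instance, a quick Python sketch of the same behaviour: rebinding the loop variable never touches the underlying list.

```python
vec = [1, 2, 3]
for i in vec:
    i = i + 1  # rebinds the local name i only; the list is untouched
assert vec == [1, 2, 3]

# to actually update elements you have to index explicitly
for idx in range(len(vec)):
    vec[idx] += 1
assert vec == [2, 3, 4]
```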


Well, exactly. It's a bad idea...


The go section is great, it's such a derpy 1980s language:

"Go's error type is simply an interface to a function returning a string.

Go lacks pattern matching & abstract data types.

Go lacks immutable variables.

Go lacks generics.

Go lacks exceptions"

I've been searching for as succinct a criticism as this and now have it, thanks.


As a go lover who programs in it daily... That section was fantastic, and I agree (almost) wholeheartedly.

And the select code [0] is kind of horrifying.

I am going to disagree with this, though:

> Go's native package system does not support specifying versions or commits in the dependency information. Instead, the go community recommends that each major release have its own separate repository; github.com/user/package/package-{v1,v2,v3}.

As far as I'm aware, you don't create a new repo for a new major version - just a new branch, or a subdirectory. This comment [1] from Russ Cox describes it pretty well IMO. (Though it appears that the page hasn't been updated since 2017, and Go Modules is more recent than that)

0: https://github.com/golang/go/blob/master/src/runtime/select....

1: http://disq.us/p/1qcpqao


This is literally a list of reasons why Go is wildly successful. Go is a language that does exactly what it needs to and no more. If you want more, there are plenty of languages that have more syntax than users.


And bless go for pulling all those users in!


The defer interaction with named return values is going to give me nightmares.

We have a senior dev whose software caused a memory leak in prod. One of our less senior devs (who has an engineering PhD in concurrent systems theory) immediately identified that the senior was trying to do clever optimizations for single-threadedness that simply cannot work when concurrent... I wouldn't be surprised if his code is causing a goroutine leak.

I am convinced that go is popular because it's comfortable to elder seniors and makes them think they can use their aging python/c/c++ skills in something flashy, new, concurrent, "supported by Google" and... Well at least it's genuinely safer.

Disclaimer: I'm an aging dev myself.


I actually think go is being propped up by young people with mediocre experience/jobs that work on glue/ops tooling and thinking they are leet. Go is such a primitive language it feels like systems programming (C) without actually being useful for systems programming and is wildly outclassed for applications programming. The perfect self selector for Google's SRE army.


One nice thing about Go is it produces cross-platform binaries without external dependencies, and the standard library supports Windows API stuff like system services and the registry.

Mono/C# is better, but the learning curve is pretty steep compared to Go, with all the missing reference assemblies, diverse runtime versions, etc. GOPATH sucks and sucks to configure, but it's less complicated than .NET.

So you can ask a junior web dev to learn a new language in a week and write a native windows application, and they might actually deliver. Whether this is empowering or signals the death of Quality remains a hotly contested issue.


I started writing Go at age 23, for what it's worth. (That was just a few years ago.) I like Go because it's simple. Obviously, that is also its downfall. Overall, I would still hold it as one of the best languages for writing various kinds of software (servers, CLI tools, and some odds and ends.)

It’s true that you can write bad Go by trying to do things that are not a good idea, but I’m not sure what you’d expect. The Go ecosystem is damning evidence of the ease of writing pretty good code in the language.

Disclaimer: I work for Google (but I do not work on Go and I did not work for Google when I first began writing Go.)


No doubt. I strongly believe go is a major advancement over c/c++ for routine work and an improvement over python.

There are, however, major footguns. I find it hard to read other people's, or even my own, code (parsing nested if conditionals is no fun when you're debugging under duress), and CSP is really just.... not the best concurrency paradigm.


I fully disagree. I find Go very nice to read.


There's no such thing as "C/C++".


Sorry c/c++ family. Explicit is better than implicit.

Pardon me. C was my second real language after Pascal, and C++ came immediately after that within a six-month span, so to my underperforming and sclerotic brain (aside from an LD_PRELOAD shim I wrote two years ago, I haven't touched them since the aughts) they are "together". Curly braces and all that.


I think that's plausible, even though Google's intentions with it were the opposite:

I've had the opportunity to talk with a few people who were involved in Go's design and they explicitly said the idea was to get fresh-out-of-school programmers to be productive on web systems immediately.


One of the idioms of Golang is not being clever but being verbose and clear.

Looks like your senior dev has completely missed that.


By clever I mean clever reuse of system resources. This is not something being more verbose would have helped with: race conditions and deadlocks were introduced.


While I don't like Go very much, I find this one unjustified:

> Similarly, the len function on strings returns the number of bytes in the string, which is not necessarily the number of characters.

Should len do Unicode normalization (NFC)? Then it's extremely complex. Or should it count code points? Then it's simply broken.

Counting bytes is fine, I'll do explicit normalization if I need to.

>can be conveniently concatenated with +, and can be used as map keys.

Does Go do a normalization for a concatenated string as it should?
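For comparison, here's the byte-count vs code-point vs normalization distinction in Python terms (Go's len on a string counts bytes, per the quoted criticism):

```python
import unicodedata

s = "é"                              # U+00E9, precomposed
assert len(s) == 1                   # code points
assert len(s.encode("utf-8")) == 2   # bytes, which is what Go's len counts

decomposed = "e\u0301"               # 'e' plus combining acute accent
assert decomposed != s               # comparison/concatenation is not normalized
assert unicodedata.normalize("NFC", decomposed) == s
```

As far as I know, no mainstream language normalizes on concatenation; it stays an explicit step.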




This page is actually really funny and talks about a lot of the quirks I have with the languages I've used extensively in the past (C#, JavaScript, Python, etc.)

But when it comes to Go, I absolutely adore its simplicity. I find myself wanting to write more Go code and I sometimes even catch myself trying to find reasons to write implementations in Go.

Anyways, that's just my two cents.


Add to that that it's damn fast. That's a pretty good reason to use Go for me.


Many of the criticisms listed are fair (even though quite nitpicky at times), but there are a few that I don't want to let stand unrebutted:

> Because strings are just slices of bytes, there is no simple way to index or slice a string if it contains non-ASCII characters.

Indexing into a UTF-8-encoded string is expensive -- O(n) instead of O(1) -- so it's good that it also looks expensive.

> Despite the above, Go has two competing types for representing text, string and []byte.

As has any other serious language. Byte arrays and strings are just not the same thing. My only real gripe with Go's treatment of strings is that converting from []byte to string is a simple cannot-fail cast. That should be an operation that checks for valid encoding, and returns an error otherwise, like Rust's str::from_utf8.

> Deleting the nth element from a slice sure doesn't look like deletion: [code snippet]

This is a fair criticism, but IMO way overblown. I cannot remember the last time I had to delete individual elements from slices. Most of the time I'm doing map() or filter(), or rather, the equivalent spelled-out for loops since Go does not have generics.

> If you import a library or declare a variable, but do not use it, your program will not compile even if everything else is valid.

And that's great. I don't want old libraries bloating my program needlessly. It's sometimes a bit annoying to have to write

  _ = foo
into unfinished functions because otherwise it will not compile due to "error: foo is never used", but I like my compilers strict.

Also to this point: You can set up your editor to run your code through goimports automatically, which will completely do all the adding, removing and ordering of imports for you, as well as format your code according to the standard style. More languages need this.

> If multiple channels are available to receive from or send to, the select statement picks a case at random, meaning that to prioritize one channel over another you have to write: [lengthy code snippet]

Having to prioritize channels manually is a bit of a code smell in my opinion. I would need a real-world example to judge, but it smells like the real problem is how you slice up your work into goroutines.

> Two implementations of random numbers, math/rand and crypto/rand.

Again, this is completely standard for any serious language. For mathematical simulations, you don't care about cryptographic properties of your RNG and want the extra speed that a simpler RNG like a Mersenne twister gives you.

> The errors package is twenty lines long, because Go's error type is simply an interface to a function returning a string.

You forgot the criticism here.

> Go's native package system does not support X

All of these are wrong. Go added a proper native package system (modules) in 1.11.

> The contributors don't listen to the community.

One of the two examples shows that the contributors do listen.

> Almost nothing in this list can be fixed, because of the Go 1 compatibility promise. Until Go 2.0, that is, but that may never happen.

That's the problem with crystal balls.


> That's the problem with crystal balls.

TIL, you need a crystal ball to foresee the features and design decisions invented many decades ago and polished by the industry.

Who knew that living without parametric polymorphism in a static language would be painful. Of course this requires a handful of RFCs because nobody knows how to implement parametric polymorphism right as if ML didn't do it decades ago with soundness proofs and all needed extensions.

Who knew that `if err != nil` is an error-prone, verbose, and noisy way of dealing with errors. Again, nobody knew how to do it right; there were no Common Lisp, Scheme, or Haskell that already did this right. There are monads, exceptions, effects, and conditions for error handling; let's throw all of this experience away and invent some obscure `try` keyword.

Reinventing the wheel is fun. Rooting your work in previous rigid and battle-tested (or even formally verified) experience is boring and complex.


Yet Go is more popular than Common Lisp, Scheme and Haskell combined.


Popularity is very different from quality.


So is PHP.


"Perl 5 sucks because ... the regular expression syntax is horrid"

So horrid that it's become essentially standard :eye-roll:


Why is it that these lists seemingly always give JS a hard time for function scoped variable declarations, but never give Python the same treatment?

Not to mention “nonlocal”...


Probably because everyone knows JavaScript, but not everyone uses it by choice, giving you a higher than average ratio of people who both dislike and know the language.

This is actually one of the reasons I think every language except for JavaScript sucks for open source web-development. Not that I particularly like JavaScript myself, but everyone knows how to use it. I work for a Danish municipality, we use a lot of open source software, but we’re roughly 10 developers and ops technicians that support 7000 employees and around 300-500 IT systems. As a result we can’t use every tech-stack. This is pretty standard for average sized municipalities by the way.

We don’t share tech stacks, but we do work together, and as you can imagine that leads to wasted resources. We’re a C# shop; our direct neighbour does PHP. Because of limited resources, we can’t utilise each other’s projects. That’s just a silly waste. Good luck convincing anyone to change their overall and often cohesive strategy for tech choices though.

The one technology we can all use, however, is JavaScript, and everything built within that ecosystem can be put to good use everywhere. So while the language sucks at a lot of things, it’s also the only universal web language that we’ve got.


> We don’t share tech stacks, but we do work together, and as you can imagine that leads to wasted resources. We’re a C# shop; our direct neighbour does PHP. Because of limited resources, we can’t utilise each other’s projects. That’s just a silly waste. Good luck convincing anyone to change their overall and often cohesive strategy for tech choices though.

One suggestion, if you don't mind. When I did work for the DoD as a contractor, because of small-business rules in government contracting we were pretty much forced to work with at least 10 other companies, and all of them had their own preferences for tech stacks, their own expertise, etc. So we tried to go down the road of using things like Thrift or Avro, so that regardless of what language / stack you used, as long as you could interface with those on the same network (since we're talking about inter-service communication here) it was still pretty quick.

It's not ideal but sometimes you gotta work with what you have. Sometimes you can find unique ways to bridge gaps across platforms to re-use work. One time a group we worked with tried to use a web app with multiple iframes as the communication bridge between stacks. It _worked_ but it was pretty hacky and slow.


We do have some solutions, if it’s meant to be run as a SaaS or it’s got a major supplier doing updates and support, then we can use whatever.

The problem comes when we want to contribute or run things ourselves, in which case I’m not sure stuff like Avro would help us, because we genuinely can’t keep X server technology secure or even updated with our available resources. I mean, I’m sure our Azure and ADFS technician could learn how to operate a JBoss server on Red Hat, but he doesn’t have the time to do that and do his regular job. He might not even be interested in doing it; being one of the most talented Microsoft technicians I’ve ever met, he might simply move to a place that lets him work with what he enjoys. Those aren’t great engineering arguments for not running JBoss, I know, but that doesn’t mean it’s not part of managing an IT team.


Python has UnboundLocalError. Javascript will use undefined, which may not surface as an issue until later down the line. It’s the combination of features that makes JS behaviour so surprising to people.
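A minimal sketch of the Python side of that comparison: a variable assigned anywhere in a function is local for the whole function body, so an earlier read fails loudly.

```python
x = 10

def broken():
    print(x)  # x is assigned below, so Python treats it as local here
    x = 5

caught = False
try:
    broken()
except UnboundLocalError:
    caught = True
assert caught  # fails at the read, not silently later
```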


I'm talking about accessing variables in parent scopes. In Python you have to explicitly declare the var as nonlocal, but in JS it just works. Now, I'm well aware that this was an active design decision made by both communities, and the "Pythonic" way wouldn't be to use `nonlocal` at all probably, but instead use a class or similar. However, I still consider its presence a wart of the language. (likely only because I am so much more familiar with JS!)

Silly micro example to explain what I'm talking about:

JS:

   const countUpFromN = (n) => () => n++

Py:

  def countUpFromN(n):
      def add():
          nonlocal n
          n += 1
          return n - 1
      return add
Keep in mind that you only need nonlocal to write; reads work fine. So you can use a variable a bunch in some scope, then suddenly have things break when you try to simply write to it.
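A small sketch of that read-vs-write asymmetry (names are illustrative):

```python
def counter(n):
    def peek():
        return n      # reading an enclosing variable just works
    def bump():
        n += 1        # assignment makes n local here -> UnboundLocalError
        return n
    return peek, bump

peek, bump = counter(0)
assert peek() == 0
caught = False
try:
    bump()
except UnboundLocalError:
    caught = True
assert caught
```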


Actually, I think an interesting exercise for evaluating a language could be determining "how much code change do small semantic changes require". For instance, if we were to start with passing a generator (in the "generates a value" sense of the word, not the other one), and try to convert it to passing a counter, how would that look?

JS:

   foo(x => () => x)
   bar(x => () => x++)
Py:

  foo(lambda x: lambda: x)

  def countUpFromN(n):
      def add():
          nonlocal n
          n += 1
          return n - 1
      return add

  bar(countUpFromN)
Fairly damning to python IMO. I'd be interested to see examples of where small changes in Python require large changes in JS.


The whole page, besides being really outdated, feels like it was written by multiple people, each of them disliking different aspects of different languages, and is thus really inconsistent.


Some are downright silly:

> C#: You can't perform any operations, even simple arithmetic, with objects inside a generic method (e.g. T plus<T>(T t1, T t2) { return t1+t2; })

Well... obviously.


Obviously? Surely that's just a type constraint.


C# doesn't claim to have powerful type inference à la Haskell, so there's no reason to expect it to extract type information from usage of parameters inside generic functions. If you need functionality inside the body, you gotta declare that functionality in the type constraints.


Yes, you have to specify the constraint for code that assumes the constraint to type-check.


Simple -- because basically everyone is forced to use JS. If 10% of the people who use JS complain about it, and 20% of the people who use Perl complain about it, there are still more people complaining about JS.


>basically everyone is forced to use JS

What about Wasm and transpiling?


WASM is interesting, but the market vastly favors JS. JS is simpler than WASM and transpiling, and any performance advantages WASM offers are relevant for edge case applications. A typical web-app will perform just fine with bloated JS these days.


not to mention JS has let/const since 2015


I get this is tongue-in-cheek, but I really wish this was a little more focused. Something more along the lines of:

(1) Language x claims these as its design principles/goals: P1, G1, P2...

(2) These y things subvert those aspirations.

(3) Modus ponens &x suxxxorzzz_Aslang__

I'm not the biggest fan of Go for its stripped-down nature, or Rust for the complex babysitting its type system needs, but I much prefer both to C, so that is a win for me and hence both achieve that goal.

Similarly, I love how expressive Scala is, that seems like an unspoken design goal of Odersky's. You can't get that without there being a million ways to do something.


Thanks for the read! I appreciate the thought and work you put into this. And now my obligatory gripes!

Under the PHP section you claim

  "if an included file must return a value but the file cannot be included, include statement returns FALSE and only generates a warning. If a file is required, one must use require()."
How is that a negative? That sounds like expected behavior to me and I can't imagine it working any other way.

  $error = include('script.php'); // Returns FALSE and continues anyway.
  
  require('script.php'); // Die with error.
How is that undesirable/wrong? It lets you do things like...

  $error = include('script-A.php');
  if ($error == FALSE) require('script-B.php');


I’m not going to defend other JS parts, but the complaints about 0.1 + 0.2 != 0.3 and the type of NaN present the author as a non-skilled programmer, at the least. Big numbers and exact decimal-point numbers are out of scope for almost any general-purpose language core for a reason. It is up to the programmer to decide whether that reason applies, not to blindly dismiss it. Math is a+b; engineering is “%.14g”.
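The point is easy to demonstrate in Python, which uses the same IEEE 754 doubles as JS:

```python
import math

assert 0.1 + 0.2 != 0.3                                 # binary doubles, not decimals
assert "%.17g" % (0.1 + 0.2) == "0.30000000000000004"   # the full stored value
assert "%.14g" % (0.1 + 0.2) == "0.3"                   # engineering: round for display
assert math.isclose(0.1 + 0.2, 0.3)
assert type(float("nan")) is float                      # NaN is a float, much like typeof NaN === "number"
```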


Yay, Ada doesn't suck! :)


In C++, std::map's operator[] will either return the element pointed to by the specified key, or if the key is not in the map, will default construct an object and insert it into that key's slot in the map.

I really want to know who thought this was a good idea, because I've only ever seen this behavior act as a footgun.


Since there's not a section for x, that must mean that x doesn't suck. /s


Clojure has the shortest section and is therefore, obviously, the best.


Whatever language the site was written in sucks, as the site is now down...


To be fair, a site can go down in any language...


python sucks because: default parameters are only evaluated once.
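For anyone who hasn't hit it, the classic demonstration:

```python
def append_to(item, bucket=[]):  # the default list is created once, at def time
    bucket.append(item)
    return bucket

assert append_to(1) == [1]
assert append_to(2) == [1, 2]  # surprise: the same list as the first call
```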


Yes, weirdly missing.

Also omitted:

  my_dict.get('existing_key', foo())
does not short-circuit foo(). That always felt like a bug to me.


Functions in most popular languages behave this way. `get` is just a method; its arguments will never be short-circuited. If you wanted short-circuiting, use the ternary operator:

  my_dict['existing_key'] if 'existing_key' in my_dict else foo()
You could also define a `get_lazy` method that does the above, taking a nullary function as its key and evaluating it if the key is missing. But then you would need to wrap your default values in a lambda, which would look absurd for the common use case of calling `get` with some constant default value. Seems unnecessary when the ternary operator works fine.
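A sketch of that hypothetical `get_lazy` helper (the name and signature are my own invention), with a counter to show the default really is lazy:

```python
def get_lazy(d, key, default_fn):
    # evaluate default_fn only when the key is missing
    return d[key] if key in d else default_fn()

calls = []
def expensive():
    calls.append(1)
    return 42

d = {"existing_key": 7}
assert get_lazy(d, "existing_key", expensive) == 7
assert calls == []                                # expensive() never ran
assert get_lazy(d, "missing", expensive) == 42
assert calls == [1]
```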


If I know my_dict doesn't hold values that are false-like, e.g. 0 or [], I use `or` to lazily alternate between options. This works because in Python, `or` returns the first value that is truthy. In this case it would be

    my_dict.get('existing_key') or foo()
This would also easily let you do multiple alternations (much more annoying with the ternary operator).


This is dangerous because sometimes you have legitimate values that are not truthy.

  print(player_data['score'] or 'does not exist')
This will print "does not exist" when the score is zero.


gp's approach works best when you know that the argument is a list or tuple (something which can be empty). Of course, due to python's lack of typing, you should only use that approach when the function is being used in well-understood contexts.


You can fix that/cheat around that by cleverly using a lambda as kwarg, I think... (Real hacky though)


The common-sense approach would be not to introduce such defaults.

  def foo(bazzes=None):
    _baz = bazzes or []
    ...
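A variant of that idiom, using an explicit None check instead of `or` so that empty-but-legitimate arguments are also honoured (the falsy trap mentioned upthread), behaves as expected across calls:

```python
def foo(bazzes=None):
    # a fresh list is built per call when no argument is given
    baz = list(bazzes) if bazzes is not None else []
    baz.append("item")
    return baz

assert foo() == ["item"]
assert foo() == ["item"]        # no state leaks between calls
assert foo([0]) == [0, "item"]  # an empty-ish argument is still respected
```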


Then why doesn't the language prohibit having a default of anything mutable?


Python is generally terrible at providing immutable types and workflows. No such thing as a constant, class variables are mutable by default, no built-in immutable dictionary class, etc.

I use a linter to catch this sort of thing and subclass `collections.namedtuple(...)` whenever I need to at least try to abolish mutability. It's one of the worst things about the language.

[edit] You can argue that there would be use cases for mutable defaults, and as with any edge use case you'd be right in asserting its existence. Unfortunately, I can't think of any solution (typing, changing function argument evaluation rules, etc) that wouldn't fundamentally change the language and break just about every sizable python library in existence. This looseness is baked deep into the language as far as I can see.
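A minimal sketch of that namedtuple-subclass workaround (names are illustrative):

```python
import collections

class Point(collections.namedtuple("Point", ["x", "y"])):
    __slots__ = ()  # also prevents attaching new attributes

p = Point(1, 2)
immutable = False
try:
    p.x = 3  # namedtuple fields cannot be reassigned
except AttributeError:
    immutable = True
assert immutable
assert p == (1, 2)
```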


- how does the language know a thing is (not) mutable?

- how does it know you're not doing that on purpose?


1) aren't there specific immutable pass-by-value forms in python? If you pass a default integer, for example, and change the value of that variable in the function body, the next call will respect the default parameter and not the mutated parameter.

2) good point.


The mutable default is sometimes useful as a quick way to memoize a function
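For example, a cache-in-a-default sketch (the `_cache` parameter is the trick being described):

```python
def fib(n, _cache={}):
    # the shared default dict persists across calls, acting as a memo table
    if n not in _cache:
        _cache[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
    return _cache[n]

assert fib(30) == 832040
```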


I would argue that that is an edge case that should be done with decorators. The whole mutable default thing bites everyone at least once, and at best it just means more code to do the same thing.
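The decorator route would look something like this, using the standard library's functools.lru_cache:

```python
import functools

@functools.lru_cache(maxsize=None)
def fib(n):
    # memoization lives in the decorator, not in a mutable default
    return n if n < 2 else fib(n - 1) + fib(n - 2)

assert fib(30) == 832040
```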


This is the idiomatic way to add mutable defaults.


> XSLT/XPath sucks because [...] It starts numbering from 1. Unlike every single other major programming language in use today.

Does Lua count?


I wonder why anyone would be so cruel as to put so much content in a single webpage.

Did they actually have the patience to write such a long article in a single file, or did they write it in multiple files and then compile it down to a single webpage for others to suffer?


What is Backbase? Google throws up a company and not a language.


As I thought, Swift doesn't suck at all :)


So basically clojure is awesome


Probably a similar effect that moksly said upthread about javascript vs python:

"Everyone knows JavaScript, but not everyone use it by choice, giving you a higher than average ratio of people who both dislike and know the language."

The only people using Clojure are those who really like it.

From my end, the worst things about Clojure were the JVM (making installation/configuration hellishly frustrating) and the poor error messaging (making debugging the configuration extremely difficult).


I've heard nothing bad about Clojure. Mostly because there is not enough syntax to complain about.


> Camel case sucks

Why?



