Working in the ML field, I can't hate Python. But the type system (pre-3.12, of course) cost me a lot of nerves. Hoping for a better post-3.12 experience once all libraries are usable in 3.12+. After that experience, I’ve come to truly appreciate TypeScript’s type system. Never thought I’d say that.
One of the most frustrating bugs I ever encountered came up while I was using memory-mapped CSC-format sparse arrays.
I needed the array indices to be int64 and specified them as such during initialization.
Downstream, however, the library would look at the actual index values and dynamically cast them to int32 if it judged there would be no loss of precision. This completely screwed up the round trip through a module implemented in C.
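A minimal sketch of the failure mode (assuming scipy.sparse, which isn't named above; whether and when the downcast happens depends on the scipy version and the operation):

import numpy as np
import scipy.sparse as sp

# CSC matrix with index arrays explicitly created as int64.
data = np.array([1.0, 2.0, 3.0])
indices = np.array([0, 2, 1], dtype=np.int64)   # row indices
indptr = np.array([0, 2, 3], dtype=np.int64)    # column pointers
m = sp.csc_matrix((data, indices, indptr), shape=(3, 2))

# scipy picks the narrowest index dtype that can hold the values, so
# the int64 request can be silently replaced by int32, either at
# construction or after an operation that rebuilds the index arrays.
print(m.indices.dtype)
print((m + m).indices.dtype)  # may print int32 for a small matrix

A C extension that expects 64-bit index buffers then reads garbage.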
Same experience here: Python’s typing experience is awful compared to TypeScript’s, even post-3.12. Mypy’s type inference is so dumb you have to write arguments like `i: int = 0`; `TypedDict`s seem promising at first and then end up as a nightmare where you have to `cast` everything. I miss TypeScript’s `unknown` as well.
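A contrived sketch of the `TypedDict` friction (the `parse` function is made up for illustration):

from typing import TypedDict, cast

class Point(TypedDict):
    x: int
    y: int

def parse(data: dict[str, int]) -> Point:
    # mypy rejects a plain `return data`: an ordinary dict is not
    # considered compatible with a TypedDict even when the shape is
    # provably right, so code like this fills up with casts.
    return cast(Point, data)

p = parse({"x": 1, "y": 2})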
Mypy was designed to enable gradual adoption. There is definitely Python code out there with `def f(i=0)` where `i` could be any numeric type, including floats, complex numbers, NumPy scalars, etc. This is called duck typing. It's wrong for a type checker to assume `i: int` in such a case.
Pyright probably works if you use it for a new project from the start or invest a lot of time "fixing" an existing project. But it's a totally different tool and it's silly to criticise mypy without understanding its use case.
It still needs a special case for dataclass-like things. I don't see how Python type checking (I haven't tried Red Knot) could support semi-magical things like Zod schema validation in TypeScript.
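To be fair, that special case has since been opened up to libraries via PEP 681's `dataclass_transform` (in `typing` from Python 3.11, in `typing_extensions` before that). A rough sketch with a made-up `model` decorator:

from typing import dataclass_transform

@dataclass_transform()
def model(cls):
    # Minimal runtime side: build an __init__ from the annotations.
    fields = list(cls.__annotations__)
    def __init__(self, **kwargs):
        for name in fields:
            setattr(self, name, kwargs[name])
    cls.__init__ = __init__
    return cls

@model
class User:
    name: str
    age: int

# A checker that supports PEP 681 synthesizes the signature
# User(name: str, age: int) and validates this call statically.
u = User(name="a", age=1)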
How would a typing system know if the right type is `int` or `Optional[int]` or `Union[int, str]` or something else? The only right thing is to type the argument as `Any` in the absence of a type declaration.
The typing system should use the most specific valid type, and the code author can broaden it with explicit typing if needed. No good typing system should ever infer a variable as `Any`: "I don’t know the type of this" (`unknown` in TypeScript) is not the same thing as "This function accepts anything". Conflating these things is one of the main reasons why Mypy is so annoying.
A typing system should only infer things that it knows are true, it should never invent restrictions. In a language like Python that is duck-typed, `Any` is the only reasonable choice in the absence of other real constraints like a type-annotation.
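Concretely, with mypy's defaults (toy function for illustration):

def f(i=0):
    # No annotations anywhere, so mypy treats the whole function as
    # dynamic: `i` is implicitly Any and the body isn't even checked
    # unless you pass --check-untyped-defs.
    return i + 1

f(1)        # fine
f(1.5)      # also fine; duck typing keeps working
# f("x")    # mypy accepts this call too -- it only fails at runtime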
def f(i=0) -> None:
    if i is None:
        do_something()
    else:
        do_something_else()
Yeah, I know it's horrible. I don't expect high-quality code in a code base that's missing type annotations like that. Assuming `i` is `int` or `float` just makes incremental adoption of a type checker harder.
No, it’s not. The typing system should use the most specific type available, and it’s your responsibility to broaden it if needed. That’s how it works in all statically-typed languages.
I want a typing system with inference good enough that I don’t have to annotate each and every variable, just like in any good statically-typed language such as OCaml or TypeScript. Strong typing and explicit typing are two very different things.
It's guaranteed to be correct if you use different operators for ints and floats, which is what at least some ML dialects (notably, OCaml) do precisely so that types can be inferred from usage.
That's the downside of operator overloading: since it relies on types to resolve, the types need to be known and can't be inferred.
I was merely giving an example showing that strong typing has nothing to do with having to write the types out. (And, obviously, the inferred type, `int -> int`, is correct.)
I believe mypy infers `i` as an integer in `i = 0`. I remember I had to write `i = 0.0` to make it accept `i += someFloat` later on. Or, of course, `i: float = 0`, but I preferred the former.
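As a sketch of what I mean (mypy's messages, as I remember them):

i = 0           # inferred as int from the literal
i += 0.5        # mypy: Incompatible types in assignment
                # (expression has type "float", variable has type "int")

j = 0.0         # inferred as float
j += 0.5        # fine

k: float = 0    # the explicit annotation also works; PEP 484 lets
k += 0.5        # an int literal satisfy a float annotation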
Because it shouldn’t, in function arguments. Whoever defines the function should be responsible enough to know what input they want and actually type it properly. Assuming an int or number type here is wrong (it could be `Optional[int]`, for example).
In TypeScript, arguments with a default value "inherit" the type of that value unless you explicitly mark them otherwise. I believe this is how Pyright works as well.
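A quick way to see the difference (Python 3.11+ for `typing.reveal_type`; earlier versions have it in `typing_extensions`):

from typing import reveal_type

def f(i=0) -> None:
    # Pyright infers `i: int` from the default value.
    # Mypy reveals `Any` here: parameters without an explicit
    # annotation stay dynamic even when the default is an int.
    reveal_type(i)

f()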
Because you provided a default value, so clearly it isn’t required to pass that parameter. It’s also wrong to assume `0` is an int; there are other valid types it could be. If the default were, say, `42`, I’d be pushing back a little less (outside of the Optional part), but this contrived example from GP used `0`, which is ambiguous as to what the inferred type should be.