> Andreas Kling, the same person behind SerenityOS
Important context - Andreas isn't involved with SerenityOS much anymore. He decided to prioritize Ladybird, which is arguably the more important project here.
Also, he used to contribute to WebKit. Even ended up working at Apple for a period of time. Quite definitely the right person in the right place.
Audiophile USB cables are pure placebo; there's no controversy here. Changing the medium doesn't magically make a digital signal higher quality, it only reduces data loss, if anything.
I would have taken the opportunity to "educate" them that your use case is a legitimate one, and why they should teach their systems not to block users like that.
I'm not sure we can leverage the neural cores for now, but they're already rather good for LLMs, depending on what metrics you value most.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
At this point, Google should accept new sign-ups for critical products ONLY from countries that have a functioning law enforcement system when it comes to this - and check based on ID card/passport.
Python's imports are the worst I've seen in any mainstream programming language by far.
Relative path imports are unnecessarily difficult, especially if you want to import something from a parent directory. There's no explicit way to define what you'd like to export either.
> Relative path imports are unnecessarily difficult, especially if you want to import something from a parent directory.
This is never explained well to beginners, but it's because Python imports deal with modules and packages; any relationship with directories is incidental. With namespace packages, foo.bar and foo.baz might live in totally separate parts of the filesystem. They might not be on the filesystem at all and instead live in a zip file.
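That last point is easy to demonstrate: the standard zip importer will load a module straight out of an archive placed on sys.path. A throwaway sketch (all names made up):

```python
import importlib
import os
import sys
import tempfile
import zipfile

# Build a zip containing a single module; it never exists as a loose .py
# file in any importable directory.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "bundle.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("zipped_mod.py", "VALUE = 'hello from a zip'\n")

sys.path.insert(0, zip_path)  # the zip itself becomes a search-path entry
zipped_mod = importlib.import_module("zipped_mod")
print(zipped_mod.VALUE)       # hello from a zip
print(zipped_mod.__file__)    # ...bundle.zip/zipped_mod.py
```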
    somedir/
        __init__.py
        otherdir/
            __init__.py
            foo.py
        scripts/
            __init__.py
            myscript.py

    # in myscript.py
    from ..otherdir import foo

If you `python somedir/scripts/myscript.py` then it won't work, because myscript.py is run with a __name__ of __main__, which doesn't have any parent package.

When you run `python -m somedir.scripts.myscript` (from the directory containing somedir), __package__ is set to somedir.scripts, so the relative import has a parent package to resolve against: `..` means somedir, and somedir itself is importable because '.' is added to the search path by default.
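A throwaway demo of the two invocations (all names illustrative; the tree is created in a temp directory):

```python
import os
import subprocess
import sys
import tempfile

# Recreate the tree in a temp dir; `..` in myscript.py refers to somedir.
tmp = tempfile.mkdtemp()
files = {
    "somedir/__init__.py": "",
    "somedir/otherdir/__init__.py": "",
    "somedir/otherdir/foo.py": "VALUE = 42\n",
    "somedir/scripts/__init__.py": "",
    "somedir/scripts/myscript.py": "from ..otherdir import foo\nprint(foo.VALUE)\n",
}
for rel, body in files.items():
    path = os.path.join(tmp, *rel.split("/"))
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write(body)

# As a plain script: __name__ is __main__, no parent package -> ImportError.
direct = subprocess.run(
    [sys.executable, os.path.join("somedir", "scripts", "myscript.py")],
    cwd=tmp, capture_output=True, text=True)

# With -m: __package__ is 'somedir.scripts', so `..` resolves to somedir.
as_module = subprocess.run(
    [sys.executable, "-m", "somedir.scripts.myscript"],
    cwd=tmp, capture_output=True, text=True)

print(direct.returncode != 0)    # True
print(as_module.stdout.strip())  # 42
```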
In that case it only means: execute the code from that module, I don't care about importing stuff. If they set a global variable or interact with other modules I might use it.
As a Python dev, reading `import * as bindingName from 'module-url'` is confusing, as it sounds like it'd try to assign multiple things to one name.
In Python, the following line pairs are roughly equivalent:

    import X
    X = __import__("X")

or

    import X as Y
    Y = __import__("X")
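One way to see that equivalence for yourself, using importlib.import_module (the documented wrapper around __import__):

```python
import importlib

# Roughly what `import json as j` does under the hood
j = importlib.import_module("json")

import json
print(j is json)  # True: both names are bound to the same module object
```

(For dotted names, `__import__("a.b")` returns the top-level package `a`, which is why `importlib.import_module` is the recommended spelling.)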
I understand and kinda appreciate the nice (apparent) simplicity of a dedicated "import", especially as you will use it a ton, but I also like the bluntness of Zig where it's not special (well, it has a macro), and you're explicitly doing the assignment:
Really not a fan of languages that allow importing (using, including...) things where magic happens and a bunch of names are just available without any clear idea where from. Yeah, an IDE will help you find the source of a C function from an #include, but if you're trying to debug some 3rd party library and don't want to download the whole thing, the ability to hand-trace it a bit is just gone. Why mix a bunch of names into the local namespace? If you want short names, just assign it to a single letter variable.
Circling back to Python, the only practical place I see `from module import *` being used is in libraries that bundle up names from a bunch of submodules to be accessible at the top level. Though, I was searching for an example of this, as I know NumPy did it until they undid it: https://github.com/numpy/numpy/pull/24357
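A sketch of that bundling mechanism (module name made up): the optional `__all__` list controls which names `from module import *` binds.

```python
import os
import sys
import tempfile

# Write a throwaway module that exports only one of its two public names.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "demo_star_mod.py"), "w") as f:
    f.write(
        "__all__ = ['exported_fn']\n"
        "def exported_fn():\n"
        "    return 'visible'\n"
        "def internal_fn():\n"
        "    return 'hidden'\n"
    )
sys.path.insert(0, tmp)

from demo_star_mod import *  # binds only the names listed in __all__

print(exported_fn())               # visible
print('internal_fn' in globals())  # False
```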
> I understand and kinda appreciate the nice (apparent) simplicity of a dedicated "import", especially as you will use it a ton, but I also like the bluntness of Zig where it's not special (well, it has a macro), and you're explicitly doing the assignment:
That is exactly how CommonJS worked for the longest time:
    const module = require('url')
The short end of the stick was the export syntax for CJS, which is basically just assigning a property to an object:
    module.exports.value = 'foo'
The problem solved by the import syntax is that it is statically parseable: you just need to parse the top level of a package to find its dependencies. The explicit export syntax is also a godsend for tooling, which can do a lot of introspection without executing any code.
Relative imports are fuzzy if you don't dig deep. That said, with a project scaffolded for you it's rarely a headache (like testing in the old days); you can rapidly try various numbers of dots.
I still haven't bothered to learn why relative imports require you to be in a package. It's a major headache if, like me, you do a lot of one-off work that doesn't warrant that project scaffolding.
Well, not a major headache. I can always revert to the py2 way: symlinks.
> It's a major headache if, like me, you do a lot of one-off work that doesn't warrant that project scaffolding.
Instead of creating "myproject.py", create a myproject folder with a __main__.py file. Run it as "python -m myproject".
Hardly any scaffolding and your code is in a package. It's one of the things I like about python, it's quite easy to move "up the ladder" scaffolding-wise if you need to.
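The suggestion above as a throwaway sketch (names illustrative); a folder with a __main__.py is runnable with `python -m <folder>` even without an __init__.py, thanks to namespace packages:

```python
import os
import subprocess
import sys
import tempfile

# Minimal layout: myproject/__main__.py, nothing else.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "myproject")
os.makedirs(pkg)
with open(os.path.join(pkg, "__main__.py"), "w") as f:
    f.write("print('running as a package')\n")

result = subprocess.run([sys.executable, "-m", "myproject"],
                        cwd=tmp, capture_output=True, text=True)
print(result.stdout.strip())  # running as a package
```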
The worst thing is that it can't handle circular imports, unlike every other language, AFAIK.
I've been told it's because of some deep part of Python interpretation that just can't be changed. The mutable default argument madness is also caused by that.
My guess is that imports in python are just executing python. When you import a module you are (optionally) running code in that module. So a circular import generates infinite recursion.
Python does a lot when it evaluates an import statement. That is where a lot of the python magic happens. As soon as you try to limit the import statement somehow, a lot of python code needs to be re-written (and maybe doesn't work at all anymore). That's arguably a bad decision but it's one at the center of the language and it's unlikely to change.
You're right. And same with the mutable default argument "trap". That's caused by `def foo` being a statement, not a declaration. When a module is imported, the code in it is executed. Many of the statements in it will be `def something`, which, when executed, defines a function. And because that's code that gets executed,
    def foo(bar=[]):
        bar.append('lol')
        return bar
the `bar=[]` gets executed at that time. That is, Python doesn't treat `def foo` as some magic thing that gets special cased and squirreled away for later use.
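Calling that function twice makes the trap visible; the usual workaround is a `None` sentinel so a fresh list is created per call (a minimal sketch):

```python
def foo(bar=[]):       # [] is created once, when the `def` statement runs
    bar.append('lol')
    return bar

print(foo())  # ['lol']
print(foo())  # ['lol', 'lol'] -- the same list, mutated again

def foo_fixed(bar=None):
    if bar is None:    # a fresh list on every call
        bar = []
    bar.append('lol')
    return bar

print(foo_fixed())  # ['lol']
print(foo_fixed())  # ['lol']
```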
Exactly, importing (like everything else in Python) is just a "syntax sugar" around loading the code on a file and running it instantly. If you do a circular reference it could blow the stack.
> importing (like everything else in Python) is just a "syntax sugar" around loading the code on a file and running it instantly.
Not quite - it also checks whether the file has already been imported, and does nothing if so.
And it also checks if the file has been "partially imported" which is what causes it to fail during circular imports.
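This is easy to observe in current CPython with two throwaway module pairs (names made up): a cycle of plain `import` statements works when attribute access is deferred into function bodies, while `from ... import name` fails against the partially initialized module.

```python
import os
import subprocess
import sys
import tempfile

tmp = tempfile.mkdtemp()

def write(name, body):
    with open(os.path.join(tmp, name), "w") as f:
        f.write(body)

# Cycle A: plain `import`, attribute access deferred into a function body.
write("mod_a.py", "import mod_b\ndef f():\n    return mod_b.g() + 1\n")
write("mod_b.py", "import mod_a\ndef g():\n    return 41\n")

# Cycle B: `from ... import name` needs the name to exist mid-cycle.
write("mod_c.py", "from mod_d import g\ndef f():\n    return g() + 1\n")
write("mod_d.py", "from mod_c import f\ndef g():\n    return 41\n")

ok = subprocess.run([sys.executable, "-c", "import mod_a; print(mod_a.f())"],
                    cwd=tmp, capture_output=True, text=True)
bad = subprocess.run([sys.executable, "-c", "import mod_c"],
                     cwd=tmp, capture_output=True, text=True)
print(ok.stdout.strip())            # 42
print("ImportError" in bad.stderr)  # True
```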
You could imagine a small change to Python which made the second case a no-op as well. This would allow you to use circular imports. You'd need to restrict yourself to imports like "import mymodule; mymodule.myfunction()" rather than "from mymodule import myfunction; myfunction()" but that's encouraged in popular style guides [0] anyway. By the time you run the function, mymodule.myfunction will be bound correctly and everything will work.
This would create hard-to-debug situations for people who rely on executing lots of side-effect code to set up state at import time, but 1) that would only happen in code that currently can't run at all, and 2) those people deserve it.
I think it would even be reasonable to make the "from mymodule" version work by binding those names later, but it wouldn't be a one-line change to the interpreter and it might break some legitimate use cases where a name got redefined.
IMO the way to think about it is that the Turing-complete nature of python imports is one of the central design decisions of the language. Lots of python oddities make more sense once you realize that the module structure is determined at runtime - including the enormous capacity for the language to do its own setup (or fail to do its own setup in amazingly complex ways).
So it's not a bug, it's one of the central choices that makes python "pythonic."
I was searching for a job from April to June 2024 as a senior/lead front-end (React) dev in the EU (Germany). In addition to the usual mid-year slowdown, the market itself was tougher. A lot of companies seemed to have weird, at times even contradictory, standards, even though the salaries they offered were much lower than before.
Luckily I landed a great job in late June, but it took 60+ applications and countless interviews.
Nitpick: the "H" in HMRC stands for "His", since it always changes to match the current monarch.