
Well, yes, it would be better if he didn't amplify propaganda for the country that is committing a genocide, and instead raised awareness of the victims.

Is this not self-evident?


The point is this comes down to a foreign policy disagreement that isn't germane to Ellison's comments on surveillance. (I can come up with a litany of policy disagreements with anyone of Ellison's stature, some of which I probably feel strongly about.)

Read in good faith, it’s overzealous advocacy. In bad faith, which I don’t assume here, it serves to get this discussion flagged off the front page.


These things are not happening in a vacuum:

1. Ellison's comments about surveillance

2. Conservative billionaires, including Ellison, consolidating ownership of social media, print media, TV media, etc.

3. NSPM-7 & the current admin's appetite to criminalize speech

4. The current administration kowtows to Netanyahu, who relishes conservative ownership of TikTok

The dots are all there: if you express something that doesn't follow an accepted US stance, like maybe its stance on Israel, or on TikTok, it gives Trump the ability to easily find, label, and punish you as a terrorist, maybe even at Netanyahu's request. Trump's desire to do things like this has been explicitly stated since the death of Charlie Kirk. He's always talked about his desire to throw his political enemies in jail.

Even before this, the admin has been targeting people like Mahmoud Khalil, Mario Guevara, etc. for speech.


You don't think that the fact that Ellison is a staunch defender of regimes that disregard the international order in favour of military might is relevant to the fact that he is also advocating for building a surveillance state?

In case you don't, to me it's painfully clear that these are just different aspects of the move towards more authoritarian forms of government. You CANNOT support a genocide and expect that this will not have an effect on democracy.

EDIT: Also note that I am trying to take your comments in good faith, but characterising support for genocide as "a foreign policy disagreement" feels a bit like an understatement.


> it's painfully clear that these are just different aspects of the move towards more authoritarian forms of government

Sure. But, like, the evidence for that is the advocacy for a surveillance state. Not his support for a foreign policy project that yes, involves supporting an autocratic government in Israel (fighting, let’s be fair, an autocratic force in Gaza backed by an autocratic state in Iran), but also a whole bunch of other irrelevant things.


I don't think I understand your point, beyond downplaying the severity of current events.


> beyond downplaying the severity of current events.

He is definitely calling it "polarizing" and minimizing it. I infer that he is supportive of it then.


I’m not downplaying the severity of anything. Just its relevance. Someone can be severe and irrelevant, and I think that’s the case here.


Your language suggests a sort of "explaining away" that is pernicious in certain cultures abroad.


> I was active in the Python community in the 200x timeframe, and I daresay the common consensus is that language didn't matter and a sufficiently smart compiler/JIT/whatever would eventually make dynamic scripting languages as fast as C, so there was no reason to learn static languages rather than just waiting for this to happen.

To be very pedantic, the problem is not that these are dynamic languages _per se_, but that they were designed with semantics unconcerned with performance. As such, retrofitting performance can be extremely challenging.

As a counterexample of fast and dynamic: https://julialang.org/ (of course, you pay the price in other places)

I agree with your comment overall, though.


I'm sort of surprised I'm not seeing any modernized dynamic scripting languages coming out lately, despite the general trend towards static languages. A fast dynamic language, with a day-one concurrency story, and some other key feature that pushes it ahead seems possible to me. (I dunno, maybe a nice story for binding to Rust instead of binding to C at this point could be enough to lift a language off?) I don't see any reason why dynamic scripting languages as a category couldn't do that. The ones we have now don't, not because the category makes it impossible, but because by the time that was desirable they just had too much baggage, and are all still struggling with it even a decade after they started.


> A fast dynamic language, with a day-one concurrency story

That's pretty much what Erlang claims to be. Of course people will always quibble with "fast".


I do quibble with fast, and I also quibble with "dynamic language". Its variables are untyped, but that about ends the "dynamicness" of the language. It's not what people mean.

Elixir is closer, though BEAM in general still leaves a lot of performance on the table. I'm somewhat surprised that tracing JITs and stuff have not managed to make more progress than they have, but it could be lack of developer time. Unfortunately JITs eat that voraciously.

I'd expect this to be a mutable language, though part of that "day one concurrency story" could be an inability to transmit references between something like Erlang "processes".


Welcome to Elixir. A concurrency story baked into its architecture, Rustler for trivial Rust bindings, a full set of project tooling (package management, deployment handling, etc). And a killer combo of database bindings in the form of Ecto and a best-in-class web framework from Phoenix.

And it's got an unusually broad base of existing libraries for how young it is, both because writing bindings is really easy, and because you can import Erlang libraries, which have been around since the '80s.


> fast dynamic language, with a day-one concurrency story

Simple dynamic scripting languages target people without a serious CS background.

Concurrency is a footgun best hidden from such people; they are not ready for it.


What are examples of those semantics? I'm guessing rebindable functions (and a single function/variable namespace), eval(), and object members available as a dict.


Some examples that come to mind: you can inspect the call stack and get a list of local variables of your callers. You can assign to object.__class__ to dynamically change an existing object's class at runtime. You can overwrite every operator, including obj.field access, dynamically at runtime (including on an existing class).
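A minimal sketch of the first two behaviors, just to make them concrete (all names here are illustrative, not from any real codebase):

```python
import inspect

def caller_locals():
    # Walk one frame up the call stack and snapshot the caller's locals.
    return dict(inspect.currentframe().f_back.f_locals)

def demo():
    secret = 42
    # A callee can read this local without it ever being passed in.
    return caller_locals()["secret"]

class A:
    def speak(self):
        return "A"

class B:
    def speak(self):
        return "B"

obj = A()
obj.__class__ = B  # retroactively change the class of a live object
```

Because any callee may inspect frames and any object may change class at any time, a compiler can assume very little about what a name refers to, which is exactly what makes these semantics hard to optimize.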


A long time ago, there was Parrot, and a bet with Guido: https://justatheory.com/2004/08/oscon-notes/


Even funnier: https://github.com/tensorflow/swift/blob/main/docs/WhySwiftF...

> Julia: Julia is another great language with an open and active community. They are currently investing in machine learning techniques, and even have good interoperability with Python APIs.


Yeah, that was a missed opportunity for sure; joining forces 8 years ago would probably have yielded better results than the current situation.


For a language that announced itself as (and raised a lot of money on the premise of being) "a Python superset", this does not sound like a huge achievement.

In all fairness, their website now reads: "Mojo is a pythonic language for blazing-fast CPU+GPU execution without CUDA. Optionally use it with MAX for insanely fast AI inference."

So I suppose it's now just a compiled language with syntax superficially similar to Python and completely different semantics?


I think it was pretty clear immediately that running python code was a far away goal. There was a lot more talk about lifetimes and ownership semantics than details about Python interop. Mojo is more like: Can we take the learnings of Swift and Rust and solve the usability and compile time issues, while building on MLIR to target arbitrary architectures efficiently (and call it a Python superset to raise VC money).

That said, the upside is huge. If they can get to a point where Python programmers that need to add speed learn Mojo, because it feels more familiar and interops more easily, rather than C/CPP that would be huge. And it's a much lower bar than superset of python.


It marketed itself explicitly as a "Python superset", which could allow Python programmers to avoid learning a second language and write performant code.

I'd argue that I am not sure what kind of Python programmer is capable of learning things like comptime, borrow checking, generics but would struggle with different looking syntax. So to me this seemed like a deliberate misrepresentation of the actual challenges to generate hype and marketing.

Which fair enough, I suppose this is how things work. But it should be _fair_ to point out the obvious too.


Absolutely. The public sales pitch did not match the reality. This is what I meant by the "claim to be Python to get VC money" point.

To first order, today every programmer starts out as a Python programmer. Python is _the_ teaching language now. The jump from Python to C/C++ is pretty drastic; I don't think it's absurd that learning Mojo concepts step by step, coming from Python, is simpler than learning C. Not syntactically, but conceptually.


Maybe young generations have some issue learning polyglot programming, I guess.

While I agree using Mojo is much preferable to writing C or C++ native extensions, back in my day people learned to program in K&R C or C++ ARM in high school, kids around 12 years old; hardly something pretty drastic.


No wonder there's so much terrible unsafe C floating around...


Many famous Speccy and C64 titles, written in Assembly, were written by bedroom coders between 14 and 16 years old, earning some pocket money writing them on the UK scene.

Get hold of Retro Gamer magazine for some of their stories.


I've tried learning C a couple times and given up because the curve is too steep to be worth the climb. It's not even the language itself, it's the inherited weight of half a century's worth of cruft. I can't spend weeks fighting with compiler nonsense, header files and #include. Screw it, I'll just use Go instead.

I'm learning Rust and Zig in the hope that I'll never have to write a line of C in my career.


Geez, what a comment. C is much, much simpler than Rust. You're not supposed to be spending weeks fighting includes or compiler errors; if you are, you have some very basic misconceptions about the language.

Just read the K&R book, "The C Programming Language". It's fairly small and it's a very good introduction to C.


C is syntactically straightforward, but conceptually it may be harder than Rust. You're exposed to the bare computer (memory management, etc.) far more than with a GC language, or arguably even Rust, at least for simple programs.

Getting to deployment is even harder: you can very easily end up writing exploitable, unsafe code in C.

If I were a Python programmer with little knowledge about how a computer works, I’d much prefer Go or Rust (in that order) to C.


Rust's memory model is very complicated. C's memory model is very straightforward.


This is true, but when you get something wrong related to the memory model in C, it just says "segfault". Whereas in Rust it will give you a whole explanation for what went wrong and helpful suggestions on how to fix it. Or at the very least it will tell you where the problem is. This is the difference between "simple" and "easy".


C before C11 has no memory model. Rust doesn't have one but effectively it inherits the C++/C memory model, so there is actually no difference.


That applies only if you take "memory model" to mean modeling the effects of concurrent accesses in multithreaded programs.

But the term could also be used more generally to include stuff like pointer provenance, Rust's "stacked borrows" etc. In that case, Rust is more complicated than C-as-specified. But C-in-reality is much more complicated, e.g. see https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2263.htm


The model you're referring to, a Memory Ordering Model, is literally the same model as Rust's. The "exception" is an ordering nobody knows how to implement which Rust just doesn't pretend to offer - a distinction which makes no difference.


I do sympathize with the parent: The language itself might not be that difficult but you also have to factor in the entire ecosystem. What's the modern way to a build a GUI application in C? What's the recommended way to build a CLI, short of writing your own arg parser? How do you handle Unicode? How do you manage dependencies, short of vendoring them? Etc.


Errors too. When, inevitably, you make mistakes the C might just compile despite being nonsense, or you might get incomprehensible diagnostics. Rust went out of its way to deliver great results here.


I am not arguing about how good or easy it is to use C in production. I'm merely stating that the parent's complaints about weeks of unsolvable errors and issues with includes scream that they need to read a good resource, like a book, because they are definitely misunderstanding something important.


Even more than that: "How do you do a string?" has like 100 answers in C depending on what libraries are available, what your deploy target is...


The thing is, if one is an expert, it is incredibly difficult to understand the beginner perspective. Here is one attempt:

C is simpler than Rust, but C is also _much_ simpler than Python. If I solve a problem in Python, I have a good standard library of data types, and I use concepts like classes, iterators, generators, closures, etc. constantly. So if I move to Rust, I have access to similar high-level tools; I just have to learn a few additional concepts for resource management.

In comparison, C looks a lot more alien from that perspective, even starting with how you include library code from elsewhere.


Writing hello world in C is easy. Writing complex software without memory issues and vulnerability is pretty hard.


Agreed, I do bash C a lot, and it has plenty of issues, but hardly a monster that a mythological hero has to face.

And as a tip for pointers, regardless of the programming language: pen and paper, drawing boxes and arrows, are great learning tools.


I think one of the "Python superset" promises was that any particular dev wouldn't need to learn all of that at once. There could exist a ramp between Python and "fast python" that is more gradual than the old ways of dropping into C, and more seamless than importing and learning the various numpy/numba/polars libraries.


FWIW generics are already a thing in pure Python as soon as you add type annotations, which is fast becoming the default (perhaps not the least because LLMs also seem to prefer it).
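For instance, generics already work in plain annotated Python today; `Box` and `first` below are hypothetical names used only for illustration:

```python
from typing import Generic, TypeVar

T = TypeVar("T")

class Box(Generic[T]):
    """A generic container: Box[int] and Box[str] type-check distinctly."""
    def __init__(self, item: T) -> None:
        self.item = item

    def get(self) -> T:
        return self.item

def first(xs: list[T]) -> T:
    # A type checker infers T from the argument and propagates it to the result.
    return xs[0]
```

At runtime these annotations are essentially free; the generic machinery only bites when you run a checker like mypy or pyright over the code.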


They've backed off a little from the Python superset claims and leaned more into "Python family".

> I'd argue that I am not sure what kind of Python programmer is capable of learning things like comptime, borrow checking

One who previously wrote compiled languages ;-). It's not like you forget everything you know once you touch Python.


The second part of the sentence is very important ;)

"... but would struggle with different looking syntax"


I think the point was that Python syntax is simpler than e.g. borrow checking.

Although Python has some seriously Perlesque YOLO moments, like "#"*3 == "###". This is admittedly useful, but funny nonetheless.


I suppose if you accept the innocent-looking "#"+"#"=="##" then your example kind of algebraically follows. Next it's time to define what exp("#") is :)


* does different things depending on the types of the operands, which is Python's strong typing at work, not Perlesque weak typing. Repeating a string is a useful thing to be able to do, and this is a natural choice of syntax for it. The same thing works for lists: [1]*3 == [1, 1, 1].
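A quick illustration of that type-based dispatch, including the loud TypeError you get when the operand types don't support `*`:

```python
# Repetition dispatches on operand types: str, list, and tuple all support it.
assert "#" * 3 == "###"
assert [1] * 3 == [1, 1, 1]
assert (0,) * 2 == (0, 0)

# Strong typing: an unsupported pairing fails immediately rather than coercing.
try:
    "#" * "3"
    raised = False
except TypeError:
    raised = True
assert raised
```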


It does unfortunately mean that sometimes `*` will work (and produce an incorrect result) rather than immediately failing loudly with a clear error message in the context in which it's actually intended to be numerical.

More broadly this is the same argument as whether overloading `+` for strings is a bad idea or not, and the associated points, e.g. the fact that this makes it non-commutative - the same all applies to `*` as well, and to lists as much as strings. At least Python is consistent here.

Although there is one particular aspect that is IMO just bad design: the way `x += y` and `x *= y` work. To remind, for lists these are not equivalent to `x = x + y` and `x = x * y` - instead of creating a new list, they mutate the existing one in place, so all the references observe the change. This is very surprising and inconsistent with the same operators for numbers, or indeed for strings and tuples.
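The aliasing surprise with augmented assignment can be shown in a few lines (variable names are of course illustrative):

```python
a = [1, 2]
b = a                    # b is a second reference to the same list object
b += [3]                 # list.__iadd__ mutates that object in place
assert a == [1, 2, 3]    # the change is visible through every reference

s = "ab"
t = s
t += "c"                 # str defines no __iadd__, so t is rebound to a new object
assert s == "ab" and t == "abc"
```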


I was referring to the "creative syntax" and it wasn't meant to be an attack on Python.

We cannot deny that Python has some interesting solutions, such as the std lib namedtuple implementation. It's basically a code template & exec().

I don't think these are necessarily bad, but they're definitely funny.
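A toy sketch of the template-and-exec() idea; this is not the real stdlib code, and `tiny_namedtuple` is a made-up name, but it shows the technique:

```python
def tiny_namedtuple(name, fields):
    # Build class source code from a template, then exec() it into a namespace.
    args = ", ".join(fields)
    source = (
        f"class {name}(tuple):\n"
        f"    def __new__(cls, {args}):\n"
        f"        return tuple.__new__(cls, ({args},))\n"
    )
    namespace = {}
    exec(source, namespace)
    return namespace[name]

Point = tiny_namedtuple("Point", ["x", "y"])
p = Point(1, 2)
```

The CPython details have shifted over time (newer versions exec() only `__new__` and build the rest of the class programmatically), but the template-plus-exec core remains.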


> and call it a Python superset to raise VC money

What else was proclaimed just to raise VC money?


The real unique selling point of Mojo is "CPU+GPU execution without CUDA", specifically, you write code that looks like code without worrying about distinctions like kernels and device functions and different ways of writing code that runs on GPU vs. code that runs on CPU, and mojo compiles it to those things.


Not unique though. Julia has had that since before Mojo was even started.


What does the performance of Julia's GPU kernels look like in comparison to kernels written by Nvidia or AMD?


Benchmarks of an early version on NVIDIA's blog show that it is about the same (the average across the benchmark shows Julia as actually a little faster here, but it's basically a wash) https://developer.nvidia.com/blog/gpu-computing-julia-progra....

While much has changed since then, the architecture is effectively the same. Julia's native CUDA support simply boils down to compiling via the LLVM .ptx backend (Julia always generates LLVM IR, and the CUDA infrastructure "simply" retargets LLVM to .ptx, generates the binary, and then wraps that binary into a function which Julia calls), so it's really just a matter of the performance difference between the code generated by the LLVM .ptx backend vs the NVCC compiler.


> For a language that announced itself (and raised a lot of money on the premise of) claiming to be "a Python superset", this does not sound like a huge achievement

I feel like that depends quite a lot on what exactly is in the non-subset part of the language. Being able to use a library from the superset in the subset requires being able to translate the features into something that can run in the subset, so if the superset is doing a lot of interesting things at runtime, that isn't necessarily going to be trivial.

(I have no idea exactly what features Mojo provides beyond what's already in Python, so maybe it's not much of an achievement in this case, but my point is that this has less to do with just being a superset but about what exactly the extra stuff is, so I'm not sure I buy the argument that the marketing you mention of enough to conclude that this isn't much of an achievement.)


I've written this somewhere else before, Modular did not raise $130m to build a programming language, nobody does that. They raised that much money to revolutionize AI infrastructure, of which a language is just a subset. You should definitely check some of the things they've put together, they're amazing


Yes. They are revolutionizing AI infrastructure, but I guess a lot of the world is just babbling about AI, and not every developer needs to worry about AI.

And so the improvements in Mojo, and now calling Mojo code from Python, are a much bigger net positive to the community than being just some other AI infrastructure company.

So I do wish Mojo a lot of good luck. I have heard that Mojo isn't open source but has plans to be. I'd like to try it once, if it's as fast as (or even a little slower than) Rust and comparable to Python in understandability.


Agreed, Modular is walking a very fine line, and they're doing so by trading on the reputation of Chris Lattner.

On the one hand, as the other poster noted, no one raises $100M+ for a programming language; programming languages have no ROI that would justify that kind of money. So to get it, they had to tell VCs a story about how they're going to revolutionize AI. It can't just be "Python superset with MLIR". That's not a $100M story.

On the other hand, they need to appeal to the dev community. Devs want open source, they want integration with their tools, and they don't want an IP-encumbered ecosystem that tries to lock them in.

That's where the tension is. To raise money you need to pretend you're the next Oracle, but to get dev buy-in you have to promise you're not the next Oracle.

So the line they've decided to walk is: "We will be closed for now while we figure out the tech. Then later, once we have money coming in to make the VCs happy, we can try to make good on our promise to be open."

That last part is the thing people are having trouble believing. Because the story always goes: "While we had the best intentions to be open and free, that ultimately came secondary to our investors' goal of making money. Because our continued existence depends on more money, we have decided to abandon our goal of being open and free."

And that's what makes these VC-funded language plays so fraught for devs. Spend the time to learn this thing which may never even live up to its promises? Most people won't, and I think the Darklang group found that out pretty decisively.


I don't think investors look at what makes a net positive to the community when making large investments like the one in Modular. I was calling out the part of the post that said Modular raised a lot of money to develop Mojo; that isn't entirely true, as just creating a language isn't enough reason to invest $130m into a company, no matter how much net positivity the language would bring.


It was never going to have Python semantics and be fast. Python isn't slow because of a lack of effort or money, it's slow because of all the things happening in the interpreter.


Fwiw the website still claims this:

> Further, we decided that the right long-term goal for Mojo is to adopt the syntax of Python (that is, to make Mojo compatible with existing Python programs) and to embrace the CPython implementation for long-tail ecosystem support

Which I don't think has changed.


I believe they're still working towards making the syntax and semantics more python-like.


Apparently this spans more countries? Very strange. Possibly a cyberattack or sabotage?

Growing up in Spain I've never experienced anything like this (not there at the moment, but friends have told me over WhatsApp).


Possibly some kind of rare cascade failure, where the grid might have been on the edge for a while and some small event tripped things, similar to the American Northeast blackout in 2003: high demand plus a power station going offline meant more demand on some interconnects, which shorted on trees and were cut off, putting more load on other lines until the entire system collapsed.


Continental Europe Synchronous Area [1]

The whole European power grid is somewhat interconnected; I won't be surprised if this knock-on effect starts knocking out other surrounding countries.

[1] https://en.wikipedia.org/wiki/Continental_Europe_Synchronous...


Is that a risk with the small amount of interconnect capacity?


Grids are widely interconnected. Problems on one grid can and do cascade to another.


[flagged]


Ah, yes. The dubious and evil Perro Sánchez.


Woof!


Big companies tend to develop cult dynamics. This is not an exaggeration, but a consequence of how humans tend to operate in large numbers. And I'd wager that in the case of Silicon Valley tech companies, this is also something that they embrace and nurture. I don't think this is a controversial take at all; rather, it's obvious.

She was probably not "afraid of being let go" (fired), but had convinced herself that it was of the utmost importance to have this level of commitment. The book probably reads like those books by someone who leaves their church or cult.


They tend to have cult dynamics because the people who subscribe to the cult dynamics are the ones who get promoted. If you’re happy to just make a living as a software engineer instead of trying to propel your way up the ladder of the world’s richest companies then you can live very happily and comfortably.


Yes, but these are not the people they'll hire for this kind of job. They're looking for the batshit crazy who will do this kind of stuff. This is the reason for the psychological profile they do in lieu of an interview when hiring managers.


That's mostly because Julia questions get answered on its Discourse or Slack. The sharp decline is due to an automatic cross-post bot that stopped working.

No one bothered fixing it, in great part due to Discourse being the main place of discussion, as far as I know.


I think it ultimately is a sign of the need for better languages. Of course, there are always engineering compromises. But I think a better world is possible, in which we don't have massive software projects written in JavaScript or Python.


What do you mean, exactly?


Might depend on where you are in Spain. Having lived in both, I'd take Spain over Germany any day of the week.

