Haven't had the time to watch that talk yet... but what's your motivation to create a new language? Any drawbacks in D that you're trying to fix (besides the GC being replaced with RC - which many consider to also be GC by the way)?
- more normal lambdas (seriously, D's lambdas are wild)
- faster compiler with more caching and (maybe) eventually live reload?
- proper packages instead of an include path.
And a bunch of small fry like format strings and named parameters.
And yeah, um, GC can be good but the D GC kind of isn't. We use D at work, and we run into GC issues frequently. Neat's RC is a much thinner wrapper around C memory allocation, which is just better imo, much as GC and RC are logically equivalent in some ways.
Please elaborate on this. It's something I've been trying to get right in my own language since day one. I see in your language's manual that you have chosen to implement modules as files and packages as directories. I came up with something very similar, but special-cased the main module file so that the entire module could be contained in its directory.
How do you represent modules in your implementation? How do you handle module loading? What paths do you search? Do you support Linux distribution packaging? This last feature is something I'm interested in supporting, I added system directories to my search path for this reason but I wonder if there's anything else that I need to do.
Yes-ish. Basically, it's sort of like D in the sense that a `module a.b.c;` always corresponds to a file `a/b/c.nt` somewhere. But the places where it looks follow a recursive hierarchy, not an include path: you can define paths to search, but you have to explicitly define import relations between those paths. And that's because it's not really a search path, it's a full-on package graph, where each package is only allowed to see imports from packages that it explicitly lists as a dependency. In D, if you're importing a Dub package, because it's just using a search path, the files from that Dub package can actually import modules from your main project. In Neat, the lookup order is "current package", then "dependency packages", then stop.
So Neat's actual hierarchy is "file [> folder]* > package", where `package` itself is optional in the import declaration: you can write `import package(gtk).gtk`, but you don't have to. This is occasionally useful during bootstrap builds when you want to clarify which version of the compiler an import is coming from: the current running compiler is always `package(compiler)`.
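Spelled out as a toy example (with gtk standing in for any dependency package; the comments are just the lookup rules above restated):

// file a/b/c.nt somewhere under a package root
module a.b.c;

import gtk;               // resolved against the current package, then its listed dependency packages
import package(gtk).gtk;  // the same module, pinned explicitly to the gtk package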
This is all because I've been writing it with something like a package manager in mind from essentially day one.
edit: I'm not looking at distribution packaging right now, because I'm not looking at non-source libs at all. That's something that can come later if it's needed at all.
Okay so if you're calling a D std.algorithm function, for instance:

import std.algorithm : map; import std.array : array;

int factor = 2;
assert([2, 3].map!(a => a * factor).array == [4, 6]);
Then the `!` indicates that you're actually passing the lambda as a compile-time parameter to `map`. But the lambda can access the surrounding context! How does it pass a runtime value at compile time?
So what you're passing is actually purely a function symbol. The way that it gets the stack reference to the surrounding function is that it's actually a nested function. And the way that `map` gets the stack reference to pass to the lambda is that, effectively, that instance of `map` is also a nested function of the calling function.
That's also why you cannot pass a lambda as a template parameter to a class method in D: it already has a context parameter, i.e. the class reference.
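A sketch of what that restriction looks like in practice (hypothetical class and method names, and the error is paraphrased):

class Mapper
{
    // A method template that takes the callable as a compile-time alias parameter.
    int[] apply(alias fn)(int[] xs)
    {
        int[] result;
        foreach (x; xs) result ~= fn(x);
        return result;
    }
}

void main()
{
    int factor = 2;
    auto m = new Mapper;
    // Rejected along the lines of "cannot use local 'a => a * factor' as parameter
    // to non-global template": the instance would need two contexts at once,
    // the class reference and the caller's stack frame.
    // auto ys = m.apply!(a => a * factor)([2, 3]);
}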
In Neat, the value of the lambda is the stackframe reference, and it's just passed as a regular parameter:
int factor = 2;
assert([2, 3].map(a => a * factor).array == [4, 6]);
Which avoids this whole issue at the cost of requiring some cleverness with refcounting.
This "Just pass it as a name" pattern has a been a complete disaster for D IMO. It was before my time but I think the explanation for why seems to be annoyingly along the lines of "dmd optimizer likes it".
It also encourages people not to think about the structure of their templates, so you can end up with truly massive amounts of duplication.
Well, D lambdas and Neat lambdas cash out the same at the backend level. There shouldn't be a performance difference. If you're passing the context as an explicit parameter, that should turn out exactly the same as passing it as an implicit stackframe parameter. The difference is that instead of instantiating the template with the lambda, we're instantiating it with a type that uniquely corresponds to the lambda - it's pretty similar in the end.
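Here's roughly how I'd model that difference in plain D - my own sketch, not what either compiler actually emits: the lambda value is a tiny struct whose only field is the captured context, and the function is instantiated on that struct's type, with the context travelling as an ordinary runtime argument.

// Stand-in for a context-carrying lambda value: the only field is the capture.
struct TimesFactor
{
    int* factor;
    int opCall(int a) { return a * *factor; }
}

// A map that takes the callable as a normal runtime parameter and is
// instantiated on its type rather than on a function symbol.
T[] mapRt(F, T)(T[] xs, F fn)
{
    T[] result;
    foreach (x; xs)
        result ~= fn(x);
    return result;
}

void main()
{
    int factor = 2;
    assert(mapRt([2, 3], TimesFactor(&factor)) == [4, 6]);
}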
Did you attempt to submit PRs to D for the features you have listed?
It would indeed be quite neat (pun intended) to see D adopt them as well, especially sumtypes. Rust went ahead and made enums better; it's quite an expected feature for a language at this point.
The problem the OP was talking about is that in D, you cannot implicitly use an `int`, say, where `SumType!(int, string)` is expected.
You need something like this:
import std.stdio : writeln;
import std.sumtype : SumType;

alias StrOrInt = SumType!(int, string);

void takeStrOrInt(StrOrInt s)
{
    writeln(s);
}

void main()
{
    StrOrInt value;
    value = 10; // ok
    value = "foo"; // ok
    takeStrOrInt(value); // ok
    //takeStrOrInt(10); // not ok: no implicit conversion into the sum type
    //takeStrOrInt("foo"); // not ok
    takeStrOrInt(StrOrInt(10)); // ok
    takeStrOrInt(StrOrInt("foo")); // ok
}
Even though this is not perfect, it works quite well (I believe it's zero cost to do `StrOrInt(10)` for example, but I'm a D newbie).
It's a bit crazy for me to see people creating new languages instead of helping improve existing ones because of minor stuff like this. The effort to create a language and a stdlib and a package manager etc. is ridiculously high compared to improving existing languages.
I think you're underestimating the effort cost of D sumtypes at scale. Every individual instance of StrOrInt(10) is cheap, but once you're actually constructing structs on the same line, you end up writing a lot of terrible wrapping code.
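Something along these lines, with made-up types purely to sketch the shape:

import std.sumtype : SumType;

struct Point { int x, y; }
struct Circle { Point center; int radius; }
struct Square { Point corner; int side; }
alias Shape = SumType!(Circle, Square);
struct Entry { Shape shape; string label; }

void main()
{
    Entry[] entries;
    int x = 1, y = 2;
    // every nested value needs its own explicit wrapper on the same line
    entries ~= Entry(Shape(Circle(Point(x, y), 5)), "c1");
}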
All that wrapping just adds visual noise. And the fact that you cannot return out of sumtype apply expressions makes so many neat idioms impossible.
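For instance, you can't early-return from the enclosing function inside a handler, because `return` only leaves the lambda. A sketch using std.sumtype's `match` (made-up function):

import std.sumtype : SumType, match;

alias IntOrErr = SumType!(int, string);

int twice(IntOrErr v)
{
    // The idiom you'd want - returning from `twice` inside a handler - isn't
    // expressible; `return` in a handler only leaves that lambda:
    //
    //     v.match!(
    //         (string err) { return -1; },  // does NOT return from twice
    //         (int i) {});
    //
    // so everything has to be funnelled through the match's result instead:
    return v.match!(
        (int i) => i * 2,
        (string err) => -1);
}

void main()
{
    assert(twice(IntOrErr(3)) == 6);
    assert(twice(IntOrErr("oops")) == -1);
}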
So it's not one thing, it's a lot of things coming together. :) Mostly I just realized one day that D was never going to be the perfect language for me, because it wasn't even interested in being that language.
> The effort to create a language and a stdlib and a package manager etc. is ridiculously high compared to improving existing languages.
Have you seen the DMD source code? Genuinely, writing my own compiler was easier than improving DMD.
We're in a DIP freeze right now. But Walter doesn't like macros and implicit conversion. Half the point of Neat is showing D devs there's no reason to be afraid of those powerful features.
I'm getting quite interested in D... can you explain what's exactly going on with DIP1000 and whatever else you may be referring to?
Are the changes to the language going to make it much better? Break existing code? How long until they finish that, is it nearing completion or just beginning now?? So many questions as someone new to D.
I have no idea honestly, it's all sort of in a state of confusion right now. The big thing coming up is editions, but that's not even slightly codified or implemented yet. I'm curious where it's going as well; I'm very hyped for that approach. It could end up really good for the language. Or it could all sort of come to nothing. We'll see over the next few years.