
As a fallback. I like LISPs as much as the next guy/gal, but editing the AST by default is too low-level. The importance of paredit is a language smell in my opinion, not something to aspire to.


Can you explain in which direction something better should go? The AST is very much at a higher level than how the vast majority of programming is done, which is by manipulating the text that later gets parsed into an AST. I'm having trouble envisioning what editing a higher level than the AST would look like, without jumping straight to visual programming, which I find very inefficient in every incarnation I've seen. (And we could debate how far above an AST such systems actually are.)


It depends on the AST, but most languages' ASTs are quite unwieldy to work with directly. They get marked up with all kinds of semantic information useful for a compiler or interpreter quite early in the parsing process, and are often more explicit and regularized than the syntax is. For example, you can dump a Python AST with the 'ast' module, but it's not pretty. The statement "x=5" balloons into:

    Assign(targets=[Name(id='x', ctx=Store())], value=Num(n=5))
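
For what it's worth, here's a minimal sketch of how a dump like that is produced with the standard 'ast' module (on recent Python versions the literal shows up as Constant(value=5) rather than Num(n=5)):

    import ast

    # Parse a one-line module and dump its single Assign statement.
    tree = ast.parse("x = 5")
    print(ast.dump(tree.body[0]))
    # roughly: Assign(targets=[Name(id='x', ctx=Store())], value=Constant(value=5), ...)
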
I think a comment up this thread (https://news.ycombinator.com/item?id=16386702) is right that Lisp manages to make this work because the s-expression structure of Lisp source code isn't really an AST, though it has some relationship to one.


That Python doesn't look like abstract syntax; it's an intermediate form full of additional semantic objects geared toward further translation or interpretation.

A Lisp compiler might build up something like that, from a fully macro-expanded body of code. That's not usually accessible to programs.

For instance, the fact that x is converted to a Name() node, where it has a ctx property indicating Store(), has nothing to do with syntax; it is the result of an analysis arising from how x is being used semantically. The user didn't specify any such attributes on the x, just that x is assigned, and that is already recorded by something like Assign(x, 5).


That seems backwards to me - AST-aware editing is basically embedding more of the compiler into the editor. It seems more valuable to me to be writing in a language that can express higher level concepts with lesser boilerplate, which would eliminate the need for AST-aware editing (and make it far more difficult).


>that can express higher level concepts with lesser boilerplate,

The whole premise around Lisp is precisely to express higher level concepts with lesser boilerplate.

This is achieved through a syntax that is really close to the AST, and macros.


With regard to Lisp, a large number of Lisp editors are written in Lisp, so they already do exactly that.


You might be surprised how little a Lisp dialect brings to the table specifically for developing the text editor features geared toward editing specifically Lisp.

The data structure representing Lisp data (i.e. code) is stripped of attributes relevant to text editing, like how it is divided into lines, what indentation it uses, and, oh, those semicolon-delimited comments.

Text editors also have to let the user deal with bad syntax.

So it's not just a matter of, "oh, we're written in Lisp, so just hand the buffer to the Lisp reader, do something and spit it out again".


Why not both?


I'm not sure what that is even supposed to mean. Having high quality tooling is completely independent from having a programming language with less boilerplate code. You can have both at the same time. So why are you advocating against it?

Take Java for instance.

In Eclipse, a lot of the AST-aware editing features are hidden in the "quickfixes", which you can trigger with Ctrl+1.

For example, you want to use an ArrayList but you haven't imported it yet. Use a quickfix: bang, the import is created with just two keypresses.

In a raw text editor you have to remember the full package string or google it, but when you want to look up what an "XYZ" is, it's rather easy to find it.

The java import system is verbose but it makes it obvious what you're importing.

One class = one file. Want to know what X is? Just open X.java.

Compare it to something like C/C++ where you include a header that contains multiple definitions. I honestly cannot use C++ without an IDE because of this. At the bare minimum my editor needs to be able to jump from a function call to the function definition or variable declaration to the struct definition.

In this case a feature that reduces boilerplate actually made the language harder to use without AST-aware navigation.

Then there are quality-of-life things like extracting parts of a big function into separate smaller functions. Just select the code you want and the IDE will create a function definition based on it and replace your selection with a call to that function.
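
To make that concrete, here's a rough Python sketch of the before/after (the order-handling names are invented for illustration):

    # Before: the lines you'd select live inline in a bigger function.
    def handle_order(order):
        if not order["items"]:
            raise ValueError("empty order")
        if order["total"] < 0:
            raise ValueError("negative total")
        print("shipping", order["items"])

    # After "extract function": the IDE moves the selection into a helper
    # and replaces it with a call; behaviour stays the same.
    def validate(order):
        if not order["items"]:
            raise ValueError("empty order")
        if order["total"] < 0:
            raise ValueError("negative total")

    def handle_order(order):
        validate(order)
        print("shipping", order["items"])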

I don't see how you could avoid this by making a more expressive language.

Does your fancy programming language automatically organise your code into functions or data types so that all you need to do is crank out code without a care? No, you still need to think about that yourself, but the IDE can automatically move the code around for you.

Also, let's talk about my favourite feature, one I would never use without the assistance of an IDE: type inference.

    value blah = foo.bar([1,2,3])

What the hell is a blah? With an IDE I can just hover over blah and instantly know what the function returns. Without that feature I have to go to the definition of bar, which is still easier with an IDE. After all, if foo is also defined via type inference, I have to look up several function definitions, and that's going to take some time. On top of that, a tool like grep is imprecise and will give me hundreds of call sites but only one definition.
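
A hypothetical Python equivalent of that situation (all names invented) shows why the call site alone is unreadable without tooling:

    # Nothing on the 'blah = ...' line says what type comes back; an IDE
    # hover answers instantly, while grep mostly turns up other call sites.
    class Summary:
        def __init__(self, total):
            self.total = total

    class Foo:
        def bar(self, xs):
            return Summary(sum(xs))   # the answer is buried down here

    foo = Foo()
    blah = foo.bar([1, 2, 3])
    print(type(blah).__name__)        # -> Summary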

You can't fix everything by making a programming language more expressive and in my experience the dumber a programming language is, the easier it is to use with a dumb text editor.


>The importance of paredit is a language smell in my opinion, not something to aspire to.

Paredit is just a nice-to-have. Many lispers don't use it.


This is like saying running for exercise is a waste of time because you could be driving instead. It's not about knowing a particular sort algorithm. It's about the discipline of solving performance problems in the small. Knowing how quicksort works isn't that helpful. Being accustomed to the thought processes that led to the development of quicksort is important in any non-trivial programming activity. I'm not writing Google-scale services, but I regularly encounter algorithm design problems on the job, and they're never the exact algorithm you studied for some white board exam. I think the author is approaching algorithm study with the wrong attitude.


I took it to mean that most developers in the current market do not need to be able to produce picture-perfect implementations of algorithms and data structures like RB trees, depth-first search, A*, quicksort, etc. Rather, your time is better spent learning the advantages and disadvantages of those structures and algorithms, and you get more benefit from understanding the implementation than from being able to reproduce it verbatim. For example, merge sort divides a list up into sublists until you hit single-element lists, then recombines them into a single sorted list. That, in my opinion, is more important than a picture-perfect implementation on a whiteboard.
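
For illustration, a minimal (not whiteboard-polished) Python sketch of that divide-and-recombine idea:

    def merge_sort(xs):
        # Split until we reach single-element (or empty) lists...
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
        # ...then recombine the sorted halves into one sorted list.
        merged = []
        while left and right:
            merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
        return merged + left + right

    print(merge_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]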


I think it's more akin to applying for a driving job where they test you on your running rather than your driving. You should be practicing your running anyway to maintain your ability to be an alert and healthy driver, but it's not likely necessary for every driver to be in tip-top physical shape to be a good driver.

Some of the interviews at lower-paid shops I've either participated in or rejected have had more stringent requirements, like weird whiteboard questions, even though the work involved simpler implementations (including shops where I found you won't even be issued your own machine; you partner solely and share), than places where the work is significantly more difficult, the pay significantly better, the environment more adult, and the perks better.

I know FANG et al employ those tests for various reasons. I think a lot of other places just blindly emulate it.


They destroyed the XPI/XUL platform and thousands of hours of work that went into those extensions. Now Firefox is a much worse version of Chrome, no reason to use it except for dubious ideology preferring Mozilla. Sacrificing the one network effect that kept Firefox relevant was a brilliant move.


Smoke signals here, can't trust the postman either.


There's a good chance somebody on the other end's going to run your program without really vetting it. Especially if you include a convenient "test runner." Not a bad attack vector, just get your alias past the first screen and you might get arbitrary execution inside their firewall on a host with access keys and the like.


Any, really.


That's a sad way to think about life. Back of the envelope math says I can be pretty happy with way less than that, different standards and all I suppose.


Do you live in America? Do you plan to live 15-20 years past retirement? You are already at 50k/year with a million dollars, which is an amount of money that is OK in the present if you are young, but with inflation it will be a lot less.

Are you aware of how much elderly healthcare and care in general costs?

The average cost of a private room in an elderly care facility at present is about 90k/year.

My standards are about not dying because I can't afford to pay for my prescriptions.


I hope you have kids, and they (continue to) like you.

I don't know what the inflation rate is between now and 65 years old... and I hope to live for 20 years after that. Assuming you're 50, that's 35 years of inflation which is right about a halving in value at 2%. So you're going to end your life with $25k/yr current equivalent and hope that SS and Medicare carry you through. Good luck.
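
Back-of-the-envelope check in Python (assuming a flat 2% rate, which is of course a guess):

    # 35 years of 2% inflation roughly halves purchasing power.
    years, rate = 35, 0.02
    print(round(50_000 / (1 + rate) ** years))  # ~25000 in today's dollars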


Let us know how a passport.js hiring filter works out long-term.


I think it would probably work out pretty well... for the candidate.


Few if any of the 300 applicants were overqualified for that garbage opportunity. Can you describe how they were overqualified? List a bunch of certs and techs? Good programmers aren't fighting over scraps for underpaid government positions working with JBoss; they get snapped up immediately when they hit the market. Five years to saturation? Yeah, right.

Readers who just deleted Emacs in despair should keep in mind that the parent is hiring for a Danish government position, not exactly a hotbed of software development or skills demand.


While I can't speak to the GP's experiences, I'm guessing you're in SV (most likely), NYC, Chicago, RTP, or Austin based on your remarks on how quickly developers get snatched up.

However, those of us living outside that handful of cities have a totally different experience, regardless of how good we are or how much experience we have. There just are far, far fewer jobs outside that handful of American cities.

And many of us are unable to just up and move due to a personal situation (sick relative, illness, spouse's job, etc.). It's just worth bearing in mind that your experience as a developer is somewhat the exception in the US, not the rule. At best, it's a 50/50 split between devs in the cities I mentioned above and the rest of us.


I've had a similar experience finding on-site work in a healthy market as I have as a remote worker. I don't think the parent's advice is sound, claiming that software eng. is a dead-end path for most, with a five-year fuse. Clearly if you're not excellent or willing to move to a city with jobs in your industry, you're going to be in for a rough time. That's not specific to software eng.


Protein is good, but not too much from meat:

https://www.ncbi.nlm.nih.gov/pubmed/27479196

> CONCLUSIONS AND RELEVANCE:

> High animal protein intake was positively associated with cardiovascular mortality and high plant protein intake was inversely associated with all-cause and cardiovascular mortality, especially among individuals with at least 1 lifestyle risk factor. Substitution of plant protein for animal protein, especially that from processed red meat, was associated with lower mortality, suggesting the importance of protein source.


Thanks. Do you know of any naturally-high-in-protein plant-based source that isn't soy (e.g. tofu) or hemp?

Asking cos I've had a hard time finding protein dense plant based alternatives, and as a vegetarian, it's really hard to get a lot of proteins from plant-based sources without processed stuff like Vega etc.

I do eat quinoa but 1) it's very expensive and 2) you have to eat 2 or 3 cups of cooked quinoa to get like 25 gms of protein which is a lot of quinoa per day.


I think there have even been studies that correlate all protein intake (regardless of source) inversely with longevity, so I try not to get too worked up about not having a meal with a lot of it. Beans, lentils, peanuts, etc. have a decent amount of protein, but even bread has protein and nowadays it's not hard to find high fiber, high protein bread. I love meat so this is all difficult, and I hope to eat less over time. I think it's a balance between the short-term benefits of protein (energy, muscle) and the long term effects of high protein consumption (tough on your organs).


There is protein powder available from plant-only sources, like pea protein. It is obviously processed (not sure how strictly you are against processing), but 'pea protein (extracted from yellow peas)' is the only ingredient in my Whole Foods version. 20g of it has 15g protein, <1g carbs, 1.5g fat.

edit: Just looked up vega and it looks like it is just plant protein like I recommended - sorry about that. What's wrong with extracting proteins out of plant matter? You don't end up with any noticeable amount of reagents or any intermediaries in the final product, thanks to chemistry.


Plant sources tend to have a higher ratio of calories to protein, so it's tough if you're limiting calories. Chickpeas, eggplant, and tahini all have protein; Trader Joe's Eggplant Hummus has all three, but somehow manages to claim a lower calorie-to-protein ratio than any of its ingredients (not sure how that can work). Mushrooms are good too. Also seitan if you don't mind something a little more "processed".


> somehow manages to claim a lower calorie-to-protein ratio than any of its ingredients

Hummus usually has more oil added on top of the tahini. And chickpeas are the main ingredient and they have a lot of starch on top of the protein.


Right, that's why I find their nutritional claim hard to believe: 2g protein per 35 KCal is better than tahini, eggplant, or chickpea individually.


Nuts.

To a lesser but still significant extent, legumes.


Soy is safe for all but a small few who are allergic. (true of any food)

The dairy industry did a stupendous job convincing people it will interfere with their hormones though. Master class in PR.


pea protein is a very cheap vegan source


Does animal protein include whey protein?

