mechanicker's comments

Wonder if this is more due to bhyve being developed on FreeBSD, and illumos deriving from a common BSD ancestor?

I know NetApp (whose stack is based on FreeBSD) contributed significantly to bhyve when they were exploring options to virtualize Data ONTAP (C mode).

https://forums.freebsd.org/threads/bhyve-the-freebsd-hypervi...


While we have a common ancestor in the original UNIX, so much of illumos is really more from our SVR4 heritage -- but then also so much of that has been substantially reworked since then anyway.


Would love UC Berkeley to revive innovation and collaboration as seen during the BSD days.


So RISC-V isn't enough for you? ;-)

More seriously, it does seem like there were a number of interesting systems research and development collaborations in the 1980s: BSD at Berkeley, Athena at MIT, Andrew at CMU, etc.

Currently it seems like the interest, funding, opportunities, and incentives for academic researchers are largely for short-term projects and AI/ML rather than long-term, ongoing systems projects. The modern funding and publishing landscape seems to emphasize speed and quantity over quality and impact.

Moreover, it seems that companies with deep pockets (Microsoft, Apple, Nvidia) may be less likely to collaborate with and/or fund academic projects as IBM and DEC did in the 1980s. It could be that those partnerships weren't hugely beneficial for AT&T, IBM and DEC's businesses.


eXperimental Computing Facility isn't enough for you?


I hope this and SCIP become standards, and that more programming languages emit symbols in SCIP format.


I thought SCIP got promoted into https://lsif.dev/ but chasing the https://github.com/sourcegraph/lsif-java link resolves to https://github.com/sourcegraph/scip-java so maybe I had the evolution relationship backward. Anyway, I'm thankful at least that the code is still Apache 2.

https://github.com/topics/lsif may interest this audience, too, since the scip topic tag seems to clash with something else

Also, I learned last night that GitLab embraces LSIF, too https://docs.gitlab.com/ee/topics/autodevops/stages.html#aut...


SCIP is an evolution of LSIF, basically: https://sourcegraph.com/blog/announcing-scip


My friends' and my experiences with the SCIP indexers built by Sourcegraph have been less than stellar. They are buggy and sparsely maintained.


I work at Sourcegraph and would love to learn more.

1. Which SCIP indexers did you have issues with?

2. What issues did you hit? (Can you share details or link to GitHub issues you filed?)

Thanks!


Hello! I am Head of Engineering at Sourcegraph. I'd love to get feedback on which SCIP indexers you've had issues with, and, if you have the time, feedback on what sort of problems you've had with them. Thank you so much!


Hey guys, it's been over two months since I was in the weeds with SCIP, so I won't be able to write very detailed issues. Most of my experience was with scip-python, and some with scip-typescript.

1. roles incorrectly assigned to symbol occurrences

2. symbols missing - this is a big one. I've seen many instances of symbols appearing in the "relationships" array that were not included in the "symbols" array for the document, and vice versa. "Definition" occurrences have also been inconsistent and confusing: only some symbols have them, they don't always match where the thing is actually defined (file/position), and sometimes a definition occurrence has no counterpart in the symbols array

3. the treatment of external packages has been inconsistent: they sometimes get picked up as internal definitions and sometimes not

I think SCIP is a great idea and I'd explore using it again if it got better. But I see issues sitting in the backlog for 6+ months, which makes it seem from the outside like Sourcegraph is not prioritizing further development of SCIP.
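For anyone who wants to reproduce this class of inconsistency, a rough cross-check is easy to sketch. A minimal Python sketch, assuming the SCIP index has been dumped to JSON as plain dicts (the snake_case field names below follow the scip.proto schema; adjust them to match your dump format), which flags relationship targets and definition occurrences with no counterpart in the document's "symbols" array:

```python
def check_document(doc):
    """Cross-check one SCIP Document dict for the mismatches described above.

    Note: this is an approximation. Relationship targets may legitimately
    resolve to another document or to index-level external_symbols, so a
    real checker would also consult those before reporting a problem.
    """
    declared = {s["symbol"] for s in doc.get("symbols", [])}
    problems = []

    # 1. symbols referenced in "relationships" but never declared in "symbols"
    for sym in doc.get("symbols", []):
        for rel in sym.get("relationships", []):
            target = rel["symbol"]
            # "local " is the prefix for file-local symbols in the SCIP grammar
            if not target.startswith("local ") and target not in declared:
                problems.append(f"relationship target not in symbols: {target}")

    # 2. definition occurrences with no counterpart in "symbols"
    DEFINITION = 0x1  # SymbolRole.Definition bit in the SCIP schema
    for occ in doc.get("occurrences", []):
        if occ.get("symbol_roles", 0) & DEFINITION and occ["symbol"] not in declared:
            problems.append(f"definition occurrence not in symbols: {occ['symbol']}")

    return problems
```

Running this over every document in an index and diffing the output between indexer versions would at least make the regressions visible.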


Thanks for the details. I appreciate you taking the time to give the feedback. We will use this to help to improve SCIP and the indexers.


Awesome tool I purchased when I stumbled upon it.

I draw flowcharts for complex implementations in ASCII using Monodraw and embed them in source code.


This is my exact concern when a new crop of developers tries to learn operating systems using languages further away from the building blocks.

Unpopular opinion: It is still good to learn enough C + system programming & their gotchas before starting with a more fancy higher level language.


I don't think this is a fair comparison. If you want to teach that you can write(2) to raw FDs in Rust, you can, just like you can use write(2) or fprintf(3) in C.

C has a standard library which students should understand even though it's making system calls deep down. Rust has a standard library which students should understand even though it's making system calls deep down (in fact, sometimes through the host C library).

I certainly see the value in knowing C and Unix and that was my education over two decades ago as well. But I also watched many people quit computer science altogether because they struggled with heisenbugs with C pointers. If they could have been kept on track by Rust compiler errors and higher level abstractions, maybe they would still be in the industry today, learning whatever else they needed instead of quitting in their first semester.
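The point about standard libraries bottoming out in system calls can be demonstrated in any language with POSIX bindings; a small Python sketch (Python only to keep it self-contained, not because the parent mentions it), doing a raw write(2)/read(2) round trip on file descriptors beneath the buffered standard-library objects:

```python
import os

# os.pipe/os.write/os.read are thin wrappers over the pipe(2), write(2),
# and read(2) syscalls. Buffered file objects (like C's stdio or Rust's
# std::io::BufWriter) sit on top of FDs exactly like these.
r, w = os.pipe()
written = os.write(w, b"raw fd write\n")  # bypasses user-space buffering
os.close(w)
data = os.read(r, 1024)
os.close(r)
```

The same drop-to-the-FD move is available in Rust via `std::os::unix::io` or the `libc` crate, which is the parent's point: teaching syscalls doesn't require teaching C's footguns.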


Is going from high level to low level somehow worse?

I went from very high level (C# web and even WebAssembly) to C, and while I believe I learned a lot and my understanding of computers improved, I think the biggest lesson is that one of the most important programming ecosystems (C) is very messy and painful.

Not because it must be painful, but because of decisions made decades ago - maybe some inertia, maybe backward compatibility, maybe culture, who knows?

Low-quality compiler messages, ecosystem fragmentation, a terrible standard library (where are my basic data structures?), memory management being a minefield, etc.


C gets a bad rap because there are now alternatives built by finding solutions to problems we only know about because of C's existence. Compiler messages, the standard library, and memory management are all things we can agree are terrible nowadays, but when C came out it was a huge improvement over what came before. It's also important to remember that even "big" things like Unix were at one point just a few thousand lines of code.


> Unpopular opinion: It is still good to learn enough C + system programming & their gotchas before starting with a more fancy higher level language.

And easier


Rust full-timer with a background in C/C++, lately using neither at all. That opinion isn't as unpopular as you'd think.


After being a web developer for 10+ years, I'm getting into C for the first time. I'd had a bit of experience with Objective-C years ago when I did some iOS work, but that was the "lowest" I'd gone down the stack.

There's a lot of unfamiliar territory, but I'm really enjoying it. When it's complex, it feels like it's just inherently complex. Which is a breath of fresh air for a web developer. I'd gotten so sick of the bullshit complexity that comes along with the high-level work; programming feels fun again.


What happens when the rest of the team does not rise to the occasion? You now have a happy but very mediocre team for the task at hand. Decision making is very democratic but seldom happens in time.

I feel you need pace setters but not excessively reward individual heroics.


A very mediocre team that does not rise to the occasion (aka fails at delivering) will probably look bad to management and will be sidestepped (aka outsourced) if not fully replaced.


Well, those OS differences taught us to write (or strive to write) portable code. They also taught me to better appreciate the strengths of different operating systems over the years.


Absolutely! But in a learning format, the student probably needs to be focusing on the task at hand.

Learning to paper over OS errata isn't as generally useful as, say, grokking multithreaded programming models.

Yes, there are environment quirks. Yes, you'll have to deal with them. Yes, you can look up documentation when you run into those situations.


Samba for file sharing? I led development efforts for the first Samba release on VMS (IA64) while working at HP. It was not performant enough compared to ASV, but we were on the right path.


Could it be more about preventing their competitor Amazon from becoming the biggest investor?

OpenAI gave Microsoft a competitive edge, and this could just be a hedge to prevent Amazon from getting even further ahead.


I feel it is best to leave git out of this - separation of concerns. Let external tools evolve independently instead.

