Hacker News | cplat's comments

I don't understand. Deterministic and stochastic have very specific meanings. The statement: "To continue my reply I could say this word, more than the others, or maybe that one, a bit less, ..." sounds very much like a probability distribution.


If you really want to think of it as a probability, think of it as "the probability of correctly expressing the sentence/idea that was modeled in the activations of the model for that token". That is totally different from "the probability that this sentence continues in a given way": the latter is "how this sentence generally continues", whereas the model picks tokens based on what it is modeling in the latent space.


That's not quite how auto-regressive models are trained (the "expression of ideas" bit). There is no notion of "ideas". Words are not defined the way we humans define them; they're only related to one another.

And the latent-space bit is also true of classical models; it's the basic idea behind any pattern recognition or dimensionality reduction. That doesn't mean the model is necessarily "getting the right idea."

Again, I don't want to "think of it as a probability." I'm saying that what you're describing is a probability distribution. Do you have a citation for the "probability to express correctly the sentence/idea" bit? Merely having a latent space does not imply representing an idea.
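To make the point above concrete: the mechanism being described is sampling from a probability distribution over next tokens. A minimal sketch with toy logits and only the standard library (the function name and scores are hypothetical, not from any real model):

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Turn raw scores into a probability distribution (softmax),
    then draw one token index proportionally to it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # random.choices draws one index, weighted by the probabilities
    token = random.choices(range(len(probs)), weights=probs, k=1)[0]
    return token, probs

# Toy scores for three candidate next tokens after some prefix.
logits = [2.0, 1.0, 0.1]
token, probs = sample_next_token(logits)
```

"This word, more than the others, or maybe that one, a bit less" is exactly what `probs` encodes: higher-scored tokens are drawn more often, lower-scored ones less, which is a probability distribution regardless of what the activations are taken to "mean".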


Georgia Tech's online Master's in Computer Science is cheaper: about $8k in total. I did it alongside a full-time job, since I only took one subject per semester.


You need a bachelor's to get into it, which they don't have.


I have a coworker who did this and also recommended it, but unfortunately I need the bachelor's first. Do you know of any cheaper CS bachelor's programs?


Which aspects? Foundational textbooks focus on principles, not necessarily implementations, and don't go "outdated" the way a snippet does.


To be fair, he never said "every single role at your company requires a secret clearance." He specifically mentioned that you can get by even without one. You've misquoted him.


I think there's merit to both design docs and prototypes.

At the same time, your argument that "you'll be replaced by GPT in no time" is also an opinion you haven't supported with any data; the very thing you're accusing the OP of.

I mean, if I stopped reading opinions, 99% of HN comments would disappear.


I agree with you. I've written very big web applications in Rails, Django, and Java/Spring.

I understand the argument for things being "explicit" in Python, but then I just prefer Java if I'm going to be very verbose about what I want (I'm only talking about backend APIs here). It's just my opinion that whether explicit is good or not depends on the level of abstraction we're interested in, and I don't believe that explicit is always good. (I like the concept of meta-algorithms, for example)

But if there's a one-person project that needs to scale quickly, I prefer Rails. (The article mentions how Django makes model fields explicit in the models file, but doesn't mention schema.rb in Rails, which spares you from reading each migration to know what the database looks like.)

Yes, big projects in any language can get messy, but that's a software engineering problem, not a framework problem.

I recently wrote a FastAPI project that was db-driven, with all the necessary test cases, etc. The number of lines it took to express the controller, the schemas and models separately, the dependencies for auth, and especially the elaborate test cases was substantial. Yes, the code was all explicit, but it was not enjoyable.


> Yes, big projects in any language can get messy, but that's a software engineering problem, not a framework problem.

Strongly disagree. IME the biggest factor determining how big your project can get before it turns into a mess is your choice of language/framework.


Good for you.


I'm familiar with PureBasic (although I didn't use it a lot). I was introduced to it in the 2000s (2000-2007 or so), along with DarkBasic, GameMaker, and the like.

In today's era, however, I have not yet found a need to use a proprietary language.


PureBasic for Windows and Linux was released in 2000 (the Mac version is probably a little more recent, the Amiga one a bit older, and the latest, for Raspberry Pi, is a few years old); a community became active in 2001.

> In today's era

In today's era it's still priceless to have a tool that:

* compiles to a native machine-code executable;

* is lean and fast - you get results quickly, from a light, efficient interface;

* does pretty much anything (and if you're missing something in the native libraries, you can interface with external C libraries)...

And (subjectively) can be a real pleasure to use.


That's true, but still almost no one wants to pay for that pleasure. People would rather suffer day after day to save 50 bucks. I'm not saying PureBasic is solid, as I've never used it, but there are definitely times when I long for the days when I could pay a few hundred dollars and get something batteries-included and vetted, without the breaking churn of people just dumping libraries online and hoping for the best.


Please note that there are free versions, with code size as the limitation. One can test whether it's worth it, and upgrade if the need arises for bigger projects:

https://www.purebasic.com/download.php

https://www.spiderbasic.com/download.php


Yes, which is why the rest of my sentence is important. I said, “I haven’t found a need to use a proprietary language.” I didn’t claim that one will never be needed by anyone. :-)


You could use Godot (it has a nice set of GUI widgets in addition to the game-specific stuff) or Lazarus (the Free Pascal IDE), for instance. I haven't used Lazarus much, but Free Pascal as a compiler and a language is nice and stable, supports a long list of platforms, and generates native code (including cross-compiling between many pairs of platforms).

https://www.lazarus-ide.org/


It looks like SpiderBasic is from the same company as PureBasic.

There is a link at the bottom of the PureBasic website to the SpiderBasic website:

https://www.purebasic.com

mdp2021's comment also indicates that:

https://news.ycombinator.com/item?id=42349737


Yes, I’m aware, which is why it reminded me of PureBasic with which I’m more familiar. :-)


This is a very good comment and reflects my experience. As engineers, we're the only people who can estimate something closely enough, and it becomes our job to do that while taking the risks into account.

Our bad assumption is thinking that only the final output matters, regardless of when and where it is delivered. It's like saying it only matters that the train arrives at the station, regardless of when it does.

The problem is that anyone depending on us downstream gets impacted. And yes, estimation is tough, requires foresight, and maybe a lot more, but that's what being a professional means.


Thank you! And yeah, I think a lot of software engineers miss the forest for the trees: in assuming that estimates are "only estimates", in what others (who may not have as much context) implicitly assume when, e.g., communicating dates, and in the assumptions they make about constraints that may not actually be constrained (e.g. the ability to add more people to the project, or to partially release/launch it with reduced scope, with the remainder added later). People forget that a few weeks of one engineer's time costs their employer tens of thousands of dollars, and that multi-quarter, multi-person projects are literally $1mm-$10mm+ endeavors.

A lot of less experienced engineers are also just bad at estimating, and don't do a good job clarifying blockers/risks/etc. when participating in planning poker with a scrum master/manager who may also not be very good at their job. Obviously a lot of what I wrote is overkill for "you said this was only one story point but it actually took you two days!", but I think this environment being most SWEs' first/only exposure to estimation teaches them the opposite of the lesson they should learn: that estimation is awful/bullshit, that it doesn't matter if you blow past it because everybody always does, that you'll be argued with if you estimate something as too high, and that you're incentivized to overestimate to keep your workload low and prevent people from getting angry. But that's only a productive mindset when the stakes are low.

I do think leaders, from executives to managers/tech leads/"scrum masters", should be more in the habit of providing estimates/deadlines with risks and ranges than they are. A lot of the time these things become games of telephone where, e.g., an engineer comes up with a plan with all the risks identified and detailed completion dates, and then their manager converts that into a single date given to the CEO as a reasonable deadline. That said, if you actively communicate the current ETA and risks throughout a project, you also end up with a convenient paper trail and give people multiple opportunities to correct miscommunication.


This is it. I'm a hardcore engineer at heart who has a lot of these sales, marketing, and product folks as friends, and I can attest that they have constraints too.

The whole world runs on deadlines and timelines. Even a president is elected for a specific duration. If you're in a B2B setting, the customer demands (sometimes even as a contractual obligation) at least the quarter in which something will be delivered.

Time is the only common denominator by which different activities can be coordinated. Without some heed to time, there will be no coherence.


Presidents are technically timeboxed, at least in the US.


If your knowledge of Python comes from JavaScript, I would not blame Python for it. It's the person's failure to "read the instructions" instead of assuming. Maybe conduct interviews in languages you're familiar with?


If you've been programming long enough, you've easily had contact with dozens of languages. How do you pick up a new one? You can't really treat it as your first, especially if you need it "for yesterday". You won't read about "if" or "for"; you scan for what's different from what you already know. If you're lucky you'll find "Python for JavaScript programmers", but that probably won't cover non-core details like this.

In practice, you learn the basics first and start coding. Then you find a piece of code somewhere that you don't understand. That's a learning opportunity! It's easy if it's a function, since functions are easily googlable. Operators are harder to search for; for instance, newer versions of C# have operators like "?." (just an example). Since googling is hard, you go to some "operators" documentation and try to find it there, and hope it covers the new version. For cases like this story it's even harder, because it involves a concept (chaining) and you may not even know the name used for it.
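Assuming the construct in question is Python's comparison chaining (the thread doesn't name it explicitly), here is a minimal illustration of why it trips up people coming from C-like languages:

```python
# Python's comparison operators chain: "a < b < c" means
# "(a < b) and (b < c)", with b evaluated only once.
a, b, c = 1, 5, 10
assert (a < b < c) == ((a < b) and (b < c))

# The surprise: "==" and "in" chain too, so grouping matters.
chained = False == False in [False]    # (False == False) and (False in [False])
grouped = (False == False) in [False]  # True in [False]
print(chained, grouped)  # True False
```

In a C-like language the first expression would be read as the second one, a boolean compared against a list membership test, which is exactly the kind of non-core detail a "Python for X programmers" guide tends to skip.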

At least ChatGPT recognized and explained it correctly, so picking up new features is easier than it used to be. I'm making a mental note to ask an LLM whenever I encounter an unknown construct.


If it's a language I don't know, I'd still read a book or check the docs for a tour of the syntax. I can scan one in a couple of hours and get an overview I can refer back to for more specific questions. Even if I needed it for yesterday.

