Hacker News | apwheele's comments

Yeah, ditto. I don't know when it happened, but the Coursera courses I tried at first (around 2012, I think?) were very high quality -- I thought it was clearly a competitor to traditional brick-and-mortar schools.

Then a few years later, checked it out and there were thousands of courses, many clearly without as much thought or effort.

I am not as familiar with the other online schools that focus on quality (like WGU). I am surprised they have not eaten traditional schools' lunch, since the actual quality of instruction varies widely (I am a former professor; for the most part, profs have little oversight in how they run classes). Market for lemons, maybe?

Another aspect I am surprised at is that the big companies have not just started their own schools. UT-Dallas, where I was at for a few years, was basically started to help train up folks for Texas Instruments. (The Pardee RAND Graduate School is kind of an exemplar, although it is not focused on software engineering.)

I sometimes debate whether I should bother hiring seniors at all and just train everyone up. If you have 10k software engineers, does it not make sense to have that level of training internally?


> Then a few years later, checked it out and there were thousands of courses, many clearly without as much thought or effort.

Thousands, and no decent way of separating the wheat from the chaff. Their filtering options suck. I'm also a bit disappointed that (most? all?) of their courses don't feature interactive exercises the way Khan Academy does. I mean, I get they started out as basically a repository of recorded lectures, but e.g. a Linear Algebra course is pointless without practice problems. A few overly simplistic multiple-choice questions are the "best" I've seen on Coursera.

Meanwhile, their prices seem to go up every year.


Unfortunately, the get-rich-quick/grifter community realised that online courses were a way to make money.

Do folks have opinions on the quality of the recent books? Last one I picked up was not good (good chance it is partially AI slop).

I read it online (and not in the app), but the copy-editing did not do it any favors, and the way code snippets were formatted broke simple copy/paste (they used icons for line breaks, which could have been avoided).


One question to ask is whether you want to run stuff client side instead of on your server.

For R specifically, it is focused on stats/graphing. So if you wanted an app where someone could upload data and fit a complicated regression model, this would be a good use case. (There are probably JavaScript libraries for regression, but if you are willing to live with a bit of start-up lag, it is worth it for anything mildly complicated -- factors in R, for example; I would not want to worry about writing my own JavaScript code to build the design matrix.)
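To make "building the design matrix" concrete, here is a minimal sketch in Python with pandas of the dummy-coding that R's factor handling does for you automatically (the toy data and column names are made up for illustration):

```python
import pandas as pd

# Toy data: one categorical predictor plus one numeric predictor.
df = pd.DataFrame({"color": ["red", "blue", "red", "green"],
                   "x": [1.0, 2.0, 3.0, 4.0]})

# One-hot encode the categorical column, dropping the first level
# to mimic R's default treatment contrasts (a reference category).
X = pd.get_dummies(df, columns=["color"], drop_first=True)
print(X.columns.tolist())  # ['x', 'color_green', 'color_red']
```

In R, `lm(y ~ x + color)` does this expansion behind the scenes; in JavaScript you would be hand-rolling it, plus handling unseen levels, interactions, and so on.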

In the case where you run the server, the data has to travel to your server, your server estimates the model, and it sends the results back. With WASM apps this all happens client side.

It is a good use case for dynamic graphs/dashboards as well. If the data is small enough to fit entirely in memory, you can basically have a local interactive session with the data, and everything will be quite snappy (no need to continually go back and forth to your server to query data).


I used Quarto for my book as well and have a write-up: https://andrewpwheeler.com/2024/07/02/some-notes-on-self-pub...

One nice thing about Quarto is that I could have different fonts and formatting niceties for the epub vs the PDF version (the PDF is the one I use to sell a paperback copy).

I additionally wrote a little script to auto-translate the contents from the markdown, so the book is currently available in Spanish and French as well.


I don't know what NIST says, but for the tests that use chi-square, you can look at the left tail. Basically, tests that have very small chi-square values are "too close" to the expected distribution.
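A minimal sketch of the left-tail idea in Python with scipy (the counts here are made up to be suspiciously close to a uniform expectation):

```python
from scipy import stats

# Observed counts that match the expected distribution almost
# exactly -- agreement this good is itself suspicious.
observed = [100, 99, 101, 100, 100, 100]
expected = [100] * 6

# The usual chi-square goodness-of-fit statistic (right-tail test).
stat, _ = stats.chisquare(observed, expected)

# Left-tail p-value: probability of a chi-square statistic this
# SMALL or smaller under the null. A tiny value means "too perfect".
left_p = stats.chi2.cdf(stat, df=len(observed) - 1)
print(stat, left_p)
```

Here the statistic is 0.02 on 5 degrees of freedom, and the left-tail p-value is far below 0.01: results this close to expectation would almost never happen by chance.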

This is how Fisher critiqued Mendel's experiments -- they were too perfect!


On the more specific claim that the PD does not physically save the data locally: I do work with city police departments, and Flock will integrate with the local records management system.

I suspect they probably do have the data locally integrated (at least for the time period the state allows them to retain the records). But even if they do not, for many police departments that would not be an excuse (although you need to request quickly -- many states now only retain records for 30 days or less).


This is very obnoxious when transferring documents from Gemini to Google Docs (and not just math -- tables and code sections are often not transferred correctly either).

I have a JavaScript hack run from the dev console where I can at least print the chat to PDF: https://andrewpwheeler.com/2025/08/28/deep-research-and-open... (that at least worked 2 months ago).


For a demo of this (although I am not sure DuckDB WASM works with Iceberg): https://andrewpwheeler.com/2025/06/29/using-duckdb-wasm-clou...


I do remember one example of an emoji in tech docs before all of this -- learning GitHub Actions (which, based on my blog, happened in 2021 for me, before the ChatGPT release), at one point they had an apple emoji at the final stage saying "done". (I am sure there are others; I just do not remember them.)

But I agree excessive emojis, tables of things, and just being overly verbose are tells for me now.


I do recall emoji use getting more popular in docs and – brrh – in the outputs of CLI programs already before LLMs. I'm pretty sure that the trend originated from the JS ecosystem.


It absolutely was a trend right before LLM training started — but no way this was already the style of the majority of all tech docs and PRs ever.

The "average" style, from the Unix man pages of the 1960s through the Linux Documentation Project all the way to the latest super-hip JavaScript isEven emoji-vomit README, must still have been relatively tame, I assume.


Really hate this trend/style. Sucks that it's ossified into many AIs. Always makes me think of young preteens who just started texting/DMing. Grow up!


Can folks comment on what applications they use k-means for? It was a basic technique I learned in school, but honestly I am not really familiar with a single use case that is very clearly motivated besides "pretty pictures".

So I do a bit of work in geospatial analysis, and hotspots are better represented by DBSCAN (you do not need to assign every point to a cluster). I just do not use clustering very often in my gig (supervised ML and anomaly detection are much more prevalent in the rest of my work).


It's used for vector quantization, which can be used for color quantization.
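For concreteness, here is a toy sketch of color quantization with plain k-means (Lloyd's algorithm) in numpy -- the "image" is simulated pixel data, and the palette size k is made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": 200 RGB pixels drawn around two true colors
# (red-ish and blue-ish) with a little noise.
base = np.array([[220.0, 40.0, 40.0], [30.0, 30.0, 200.0]])
pixels = base[rng.integers(0, 2, 200)] + rng.normal(0, 5, (200, 3))

# Plain Lloyd's algorithm: the learned centroids are the palette,
# and each pixel gets replaced by its nearest palette color.
k = 2
centroids = pixels[rng.choice(len(pixels), k, replace=False)]
for _ in range(20):
    dists = np.linalg.norm(pixels[:, None] - centroids[None], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([pixels[labels == j].mean(axis=0)
                          if np.any(labels == j) else centroids[j]
                          for j in range(k)])

quantized = centroids[labels]  # every pixel mapped to the k-color palette
```

The same idea scales to real images (flatten H×W×3 to N×3, cluster, map back), where k-means picks a palette that minimizes total squared color error.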

