> The salt in the wounds is that universities are flush with cash, yet it's spent on anything and everything except for the welfare of the students.
Maybe the elites. State schools and small colleges are not flush with cash and many have been shuttered or severely downsized recently. Though they could still spend their limited funds better.
Recent events alone do not fully represent the affairs of the past 2+ decades. Community colleges, state schools, Ivies: institutions at every level were gorging themselves on federal funding and endowments. I have no comment on the current admin, but "blatantly inefficient use of funds" is an understatement.
Charitably, they may mean "the proceeds from their endowments" (or maybe "engorging their endowments", if that's even a proper use of the word), but I think that's a weak point. Proportionally very, very few institutions have significant endowments.
It sounds like you can turn the feature off entirely, which seems semi-reasonable: the ads support the updated scenic content, and you can opt out. But I'd bet it shows ads in other places too.
Gotta have regular updates to my videos of forests, waterfalls, lapping waves and crackling log fires. You never know when they're going to launch a new version of trees.
> Q: My TV started playing a video in full screen by itself. What happened? A: Your TV launched Scenic Mode, a FREE, new feature that displays relaxing, ambient content when your TV is idle for a period of time. Scenic Mode delivers an experience that adds to the environment of your home or office.
It's relaxing, so you need to RELAX rather than get in a huff over blaring ads. What, you're not relaxed, and now you're going to pull the plug?
And no evidence, literally nothing, is presented to support the idea of looming AGI except ‘some people in the business/in the government said so, and I believe them.’
I think of hallucinating as a phenomenon where the model makes up something that appears correct but isn’t. Citations to papers that don’t exist, for example. Regurgitating training data (which may or may not be correct) is a different issue.
PyTorch is just much more flexible. Implementing a custom loss function, for example, is straightforward in PyTorch and a hassle in Keras (or was last time I used it, which was several years ago).
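For anyone who hasn't used PyTorch: a custom loss is just a callable that returns a scalar tensor, and autograd handles the gradient for you. A minimal sketch (the WeightedHuberLoss class and its parameters are made up for illustration, not taken from any library):

    import torch
    import torch.nn as nn

    # Hypothetical example: a weighted Huber-style loss written as a plain
    # nn.Module. Any function that returns a scalar tensor works as a loss,
    # so there is no special API to learn.
    class WeightedHuberLoss(nn.Module):
        def __init__(self, delta: float = 1.0, weight: float = 1.0):
            super().__init__()
            self.delta = delta
            self.weight = weight

        def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
            err = pred - target
            abs_err = err.abs()
            quadratic = 0.5 * err.pow(2)                         # quadratic near zero
            linear = self.delta * (abs_err - 0.5 * self.delta)   # linear beyond delta
            loss = torch.where(abs_err <= self.delta, quadratic, linear)
            return self.weight * loss.mean()

    # Usage: drop it into an ordinary training step; backward() works as usual.
    loss_fn = WeightedHuberLoss(delta=0.5, weight=2.0)
    pred = torch.randn(8, 3, requires_grad=True)
    target = torch.randn(8, 3)
    loss = loss_fn(pred, target)
    loss.backward()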