Hacker News | c-linkage's comments

It's spelled like it sounds! :D

Local to south Jersey it's "ruckers".

I’m from south Jersey and have never heard of this “ruckers”. Is it near Ouaisné?

I was about to call fake on this -- Americans from south Jersey are largely unfamiliar with the present perfect and would not say "[I] have never heard of" but "[I] never heard of" instead.

But it turns out this grammatical cue is an effective way to discover that the comment is not about an American south Jersey but a British one.


It’s an Albany expression.

The most insidious version of this I experienced was when a library changed the FPU settings.

Fortunately, it was sufficient to reset the FPU settings after initializing the library. But it sure took a long time to figure out what happened!


There's a great Random ASCII blog post about an obscure FPU issue like this [1].

  - The crash was in a FPU that Chrome barely uses  
  - The instruction that crashed Chrome was thousands of instructions away from the one that triggered the exception  
  - The instruction that triggered the exception was not at fault  
  - The crash only happened because of third-party code running inside of Chrome  
  - The crash was ultimately found to be caused by a code-gen bug in Visual Studio 2015  
I've run into this kind of thing once myself (sharing a process with other companies is fun!). It was real confusing to get a stack trace showing our code suddenly crashing on a line of code that was doing the exact same thing with the exact same values as it had always done before.

[1]: https://randomascii.wordpress.com/2016/09/16/everything-old-...


This also happened in the Python ecosystem, where gevent was messing with numpy: gevent was compiled with -ffast-math, which disables subnormal numbers.

Blog post: https://moyix.blogspot.com/2022/09/someones-been-messing-wit... HN discussion: https://news.ycombinator.com/item?id=41212072


I've always disliked economics because it never seems to make much sense. The first equation in the article -- the basis on which the entire premise rests -- just feels wrong.

> Spending is either on the consumption of goods and services or investment spending on equipment, structures, and intellectual property products. Income is allocated to either consumption or to saving by households, businesses, and government. In a closed economy, spending equals income—that is, the sum of consumption and saving equals the sum of consumption and investment spending.

> Spending (Consumption + Investment Spending) = Income (Consumption + Saving)

> Because consumption drops out on both sides of the equation, investment spending equals domestic saving in the economy. This makes sense: the funds available to invest in productive projects have to come from domestic savers.

It _doesn't_ make sense. How is consumption on the income side of the equation? And even if that somehow did make sense, who is to say the consumption on the income side is the same as the consumption on the spending side such that they balance out?

Saving is deferred spending, meaning money is set aside temporarily. One might think of this like a "cash queue" where the velocity of money slows down for a while. Is the assumption that all saving takes place in banks where banks can lend it out? If I stuff cash in a mattress (saving), how can that cash be used for investing?

A more realistic version might look like this:

                (fast)
  Income --+--------------+-> Spending --+--> Consumption
     ^     |              |              |
     |     +--> Saving >--+              |
     |          (slow)                   |
     |                                   v
     +---------------------- Investment and Production

This model involves time, but apparently economists only like models that incorporate addition and subtraction.

EDIT: If I'm asking questions, saying I don't understand, and offering a counter-model, doesn't that count as adding to the discussion? If I'm operating under some misunderstanding, there are certainly others who have the same misunderstanding but didn't speak up.


> It _doesn't_ make sense. How is consumption on the income side of the equation? And even if that somehow did make sense, who is to say the consumption on the income side is the same as the consumption on the spending side such that they balance out?

The model they are using is a simplified macroeconomic model. In their model, they are simply saying that when you account for Income--the total amount of money earned across the entire economy--it can only fall into two mutually exclusive buckets. Either the income is related to Consumption (purchasing goods and services), or Saving (as you mention, deferred spending--money in banks, or surpluses in the budget for states etc.--anything that is not in the consumption bucket).

> who is to say the consumption on the income side is the same as the consumption on the spending side such that they balance out?

By definition, it has to be. The way national income accounting works is that you can look at things from the perspective of expenditures or income. Since GDP is total output and total income, the total amount of consumption is the same, which is why it drops out in the equation from their model.


So in this simplified model, consumption is like a Möbius strip: it appears on two sides of the equation but is really one and the same quantity, since income can only come from someone else's consumption spending.

The fact that they included it on both sides of the equation seems pointless, then, and only serves to confuse.


Klitgaard is talking about accounting, i.e., instantaneous currency exchanges where, by definition, the gain on one side of the exchange has to equal the outlay on the other. Your model doesn't look like an accounting model; I suspect you want to talk about something different from what he is talking about - and that you want to talk about policy implications rather than truisms (which is a good instinct; just not what economists care about when they sit down to talk accounting identities).

> How is consumption on the income side of the equation?

Isn't it just that one person's spending on consumption is another person's income from that consumption?


So what's negative savings in this model? Money you get temporarily and use to increase income?

> I've always disliked economics because it never seems to make much sense. The first equation in the article -- the basis on which the entire premise rests -- just feels wrong.

That's because you're reading an econ 101 equation, similar to how basic physics blogs use spherical cows to simplify their equations. Most internet (and even media) discourse about economics never grows out of this level - it's like having people debate the physics of nuclear reactors while their knowledge is stuck at the Newtonian level.

Or, as some academic once put it: in your first year of economics college you learn econ 101... and for the rest of the years you're taught all the ways that model doesn't apply to real life.


Economics is the formalisation of confusing stocks for flows.

In the real world approximately zero saving occurs by stuffing cash in the mattress, so “all saving is investment” is a correct simplification.

Other people's consumption is your income, not your production; producing does not generate income by itself.

It isn't called "the dismal science" for nothing.

Change occurs one death at a time.

The chart is good because it shows actual measurements, but I'm concerned about the baseline being 1970. It is possible that measurements prior to 1970 are higher and make the projected measurements look worse than they are, especially since many charts for mid-Atlantic sites were clearly on a downward trend shortly after 1970.

I'm not saying there is an intentional attempt to bamboozle readers. It's just hard to judge the report on such a short time scale.

EDIT: I'm not a climate denialist, but I have many people in my life who are. I was really looking forward to showing them this website but refrained knowing that they would come back with the argument I mentioned.


I shouldn't be surprised that we still see this standard climate denial logic, but it's worth pointing out that your logic here is not particularly strong. The problem with this line of reasoning is that it assumes all we have is measurements, without any real understanding of the process causing those measurements.

Sure, if we knew nothing about Earth's climate (and also didn't have plenty of other ways to measure historic sea level), only having 50 years of measurements might be misleading. But we know that sea level is rising and we know why it is rising. We have a very strong hypothesis as to what's happening and we see this hypothesis confirmed again and again across a very wide range of subjects that otherwise have no relation to one another.

On top of this, we do have plenty of other measurements of historical sea levels that all indicate that, yes, sea level is rising; they just aren't perfectly apples-to-apples "actual measurements," so it wouldn't be perfectly honest to include them in this chart.


This is an interesting podcast: https://corecursive.com/briffa-sep98-e/

"In this episode we explore the “Climategate” scandal that erupted from leaked emails and code snippets, fueling doubts about climate science. What starts as an investigation into accusations of fraud leads to an unexpected journey through the messy reality of data science, legacy code struggles, and the complex pressures scientists face every day."

He goes into the code/data that is seemingly the root-cause of a lot of "it's all a hoax." I found it pretty informative, as to how climate data is gathered and processed (by the scientists). And the limitations therein. He's simply trying to explain the cause of climategate, rather than advocate any view.

It's also a great example of a tech/dev investigation into root-cause analysis of someone else's code. So it's interesting from that point of view, even if you're less interested in the climate side of it.


> just are perfectly apples-to-apples

aren’t?


Fixed, thanks!


> standard climate denial logic

This stinks of circular reasoning. It's bad logic because climate deniers use it, and climate deniers are wrong because they only use bad logic.

The reason the other measurements you mentioned can't be included is often that they are spaced at equal or greater intervals than this entire set spans. Including this data in one of those sets would demonstrate that there are plenty of times in history when the sea level changed as much as it has in the last 50 years.

If we know so much about why it's rising, what's with all the measurements? We don't "know" nearly as much as you're implying. The reason we don't go around measuring healthy humans' body temperatures is because we know what they are. The entire purpose of the measurements is to increase understanding.

Current forecasts that Y temperature rise would lead to X sea level rise rely on a static model of all other variables. It should be obvious that the climate is anything but static, considering the entire argument is about climate change.

It's perfectly reasonable to criticize this kind of extrapolatory thinking without denying the fundamentals of climate change.


> Including this data in one of those sets would demonstrate that there are plenty of times in history where the sea level changed the amount it has in the last 50 years

In history? No. Sea levels have never been higher in the written record.

In geologic history? Of course. No serious scientist argues otherwise. The point is returning to those levels means abandoning Baltimore, Houston, much of Los Angeles and most of Miami and multi-trillion dollar projects to protect San Francisco, New York and Boston.


> The point is returning to those levels means abandoning Baltimore, Houston, much of Los Angeles and most of Miami and multi-trillion dollar projects to protect San Francisco, New York and Boston.

Here’s my problem with all this stuff. All the science says LA, NYC, etc. are going to be underwater. Not maybe, not in the worst case, no. All the reporting says this is pretty much a foregone conclusion, and has for many years.

So why have these cities not started working on erecting (say) 50ft tall “future-proof” sea walls? Even if they end up not being needed, it _seems_ like this is the type of climate change mitigation step that would be a prudent thing to do. Certainly more so than the whole lot of nothing currently being done. Surely LA and NYC politicians and voters, being so much more educated than all those dumb red state hicks would be in favor of that, wouldn’t they?


> why have these cities not started working on erecting (say) 50ft tall “future-proof” sea walls?

Because we don’t need to yet? Also, a sea wall doesn’t block, it deflects. Protecting Manhattan means deflecting those surges to e.g. Long Island and New Jersey. That’s a difficult conversation much easier had after a hurricane washes away some of the opposition (and/or generates urgency in the core).

> LA and NYC politicians and voters, being so much more educated than all those dumb red state hicks would be in favor of that, wouldn’t they?

Yes, but they’ll do what those states do with their own climate risks: wait for a catastrophic failure that ultimately costs more but unlocks federal funding and so costs less locally.


In short there's no actual will and people think short term.

A bit longer:

Good luck sourcing that from taxes. People vote, and those projects would (A) fall to graft and (B) piss off many in your voter base, both as a consequence of the graft and the general disagreement over their value.

The answer is you would see the people who greenlit the projects voted out and the projects would be scuttled.

People can say they know this is a problem, but because it's in the abstract, most of your voter base just won't go for it, and it's squarely in a "people don't actually vote in their best interest" type of problem.

It's a riot trying to get a few new MTA tunnels approved and needed repair and modernization for the NYC subways is always basically just out of the question.

So 50 ft sea walls? Yeah people would actually be under water and still doubting the need for them.


I'm not talking about height, I'm talking about rate of change.

The height is concerning regardless, but the rate of change is the link to anthropogenic climate change. If it's shown that this rate of change is not unprecedented, the link to human causes is less solid.

I'm not here to say CO2 isn't a greenhouse gas and that humans aren't likely responsible for current and future warming, I'm pointing out that there are plenty of people who believe the same as me but to a degree that is not supported scientifically.

The data fits the CO2 hypothesis great, but Bayesian reasoning also must account for other models that fit the data as well, and must even include the prospect that there are other unknown causes that could produce the effect, as there clearly are given the thoroughly precedented nature of our current situation.


> This stinks of circular reasoning. It's bad logic because climate deniers use it and climate deniers are wrong because they only ise bad logic.

That would be a good point, if that were what I was arguing, but it's clearly not. I am pointing out that this is a common form of argument used by climate deniers, and then, independent of that fact, demonstrating why it's poor logic. My argument regarding why the logic is poor has nothing to do with the fact that it's a commonly used line of reasoning in climate denial. However, the classification of the logic as such is useful to help people quickly identify the common set of erroneous methods that show up very frequently in online discussions (and sadly, very commonly on HN).

Climate denial arguments do tend to use faulty logic, in a similar vein to the way creationists tend to use faulty logic: because the evidence in favor of the alternative hypothesis is so much greater, the easiest way to "attack" that hypothesis is through poor use of logic. But clearly that does not imply that all logic employed by people in these camps is inherently faulty.


> It is possible that measurements prior to 1970 are higher

Except for all the evidence from trees and Antarctic ice cores, sure. We couldn’t forecast hurricanes at all until the 1950s, and even then barely until the age of satellites, so 1970 makes sense as a starting point for high-frequency data. But let’s not pretend the lower-frequency data don’t exist. To the extent there is bamboozling afoot, it’s from the climate deniers.


I was curious about this, so I looked for data from prior to 1970 [1] and it's all clearly trending in the same direction. So 1970 isn't special or misleading in this case, there does not appear to be any bamboozlement going on.

[1] https://www.climate.gov/news-features/understanding-climate/...


Thank you for adding information in response to my comment. I've been to [Mer de Glace](https://montenversmerdeglace.montblancnaturalresort.com/en) and seen the effects of glacial melt, so no doubt it contributes to sea level rise.


If that is a risk, it is just as likely to cause an error the other way and then we are underestimating sea-level rise.


check out the Eocene geological period. invest in swimming gear...


> deploying native applications is simply too costly.

I do not understand why people hold this impression, especially in corporate environments.

Windows supports both system and per-user deployments; the latter so you don't even need administrator rights. And with Intune, deployments can be pulled or pushed.

Many desktop applications are written in .Net so you don't even need to install the runtime because it's preinstalled on the operating system.

Even ClickOnce deployments -- which you can deploy on the web or on a file share -- pretty much make deployments painless.

EDIT: For the naysayers: please then explain to me why Steam is so successful at deploying large games on multiple platforms?


We have 2 main products: a SaaS and a desktop app (1mln+ users combined). It's a pain in the ass to support the desktop app:

- many people refuse to upgrade for various reasons so we have to support ancient versions (especially for important clients), for stuff like license activations etc.

- various misconfigurations (or OS updates) on Windows can make the app suddenly crash and burn - and you waste time investigating the problem. My favorite recent bug: the app works OK everywhere except on some Japanese systems where it just crashes with access violation (see the next bullet point)

- debugging is hard, because you don't have immediate access to the machine where the bug triggered

- originally it was built 100% for Windows but now we have people asking for a macOS port and it's a lot of work

- people crack our protection 1 day after release and can use it without paying

SaaS has none of those problems:

- people are used to the fact that SaaS applications are regularly updated and can't refuse to upgrade

- modern browsers are already cross-platform

- browser incompatibilities are easier to resolve

- you debug your own, well-known environment

- if someone doesn't play by the rules, you just restrict access with 1 button


You could of course also make your desktop app auto update without offering a way to refuse. Or you could make it display an error, if the user accesses it without being at the newest version, and block all further interaction. Those are considered impolite, but it seems you are doing it anyway in your web app already.


Even the most friction-free ClickOnce deployment is going to be more of a deployment hassle than "hey, users, you know how you go to https://subdomain.local-intranet/place to add or subtract items from the inventory database? Well, continue doing that".

The webapp doesn't care if someone's machine was down overnight or if the paranoid lady in design managed to install some local "antivirus" which blocked the updated rollout or if the manager of sales has some unique setting on his machine which for some inscrutable reason does silly things to the new version. If their web browser works, the inventory database works for them, and they're on the latest version. If their web browser doesn't work, well, your support teams would have had to eventually take care of that anyway.


Some people should probably only be given thin clients, because they are too inept to be allowed to handle anything else.

Not sure how to solve this problem on the Internet yet, though. How can we prevent uninformed masses from creating incentives for businesses that turn the web into a dystopia?


The web browser is not some magic tool that is always guaranteed to work. Group policy alone can wreak total havoc on web apps on all the major browsers.


This would be noticed immediately when all of the workers under the group policy try to access their email in the morning


That sounds nice and it probably works fine if you're targeting a single business running Windows, but targeting Mac, Windows and Linux remains more difficult, no?

And is there even a guarantee that your deploy will be rolled out in X minutes?

Version skew remains one of the biggest sources of catastrophic bugs at the company I work for, and that's not even taking into account client app versions, just skew between the several services we have. Once you add client app versions, we have to support things for 3 years.

At my one-person company I just do a Kubernetes deploy, it goes out in a couple minutes and everyone has the latest ver whether they like it or not. I don't have the resources to maintain a dozen versions simultaneously.


There’s a hidden cost in web applications as well: you lose the operating system’s native UI language. Every website, and thus every Electron app, suffers from this. There’s no standard place for settings, and buttons and windows don’t behave normally. There’s a lot of value in native apps’ usability just from sharing the same UX language as the operating system. Sadly this is mostly dead for new applications.


Building a game for multiple platforms for Steam is way more work than building a single web app. Building the game often requires unique settings for each platform, machines for each target platform, testing on each target platform, and dealing with bugs on each target platform. Getting notarized/certified on each platform is more work still.

> EDIT: For the naysayers: please then explain to me why Steam is so successful at deploying large games on multiple platforms?

How many games are multi-platform on steam? Checking the top 20 current most played games, 14 of them are windows only. If it's so easy to be multi-platform, why would 14 of the top 20 games not be multi-platform? Oh, it's because it's NOT EASY. Conversely, web apps are cross platform by default.

Only 2 of the top 20 games supported 3 platforms (windows, mac, steam-deck). 1 was windows+steam-deck, 3 were windows+mac


> For the naysayers: please then explain to me why Steam is so successful at deploying large games on multiple platforms?

Because Valve puts lots of time and money into making it work for their customers (https://github.com/ValveSoftware/Proton/graphs/contributors), time and money that the average small business can't afford.


Here's a good example: I used Weylus which turns any touch device (phone, tablet etc) into a drawing tablet + screen mirror. It can be used to turn your iPad into a second monitor or as a touch device for your laptop.

Weylus gives you a URL that you can visit on the device and instantly use it. Try doing that with native apps. They'd need native apps for Windows, Linux, Mac, iOS, Android... get them on the app stores too, support all the different Linux distros... or just a single URL that works instantly anywhere.

Steam works for the same reason the App Store works: it targets mostly a single platform (Windows) and all dependencies are bundled in. The Steam client itself is a web app running on the Chromium engine, though.


In large corporate environments I agree but small companies still mostly don’t have central management of end user computers. They order a computer from Dell and use it as delivered. Much easier to just access everything via a browser that way.


> Many desktop applications are written in .Net so you don't even need to install the runtime because it's preinstalled on the operating system.

The last .NET version to be deployed this way has a 10 year old feature set. Nowadays you bundle the parts of .NET you need with the application.


You can still do system wide deployments with .NET Core, or .NET 5+, as you prefer to call it.

https://learn.microsoft.com/en-us/dotnet/core/install/window...


Of course. I was strictly referring to .NET preinstalled in Windows as per the comment I replied to, which I believe only applies to Framework 4.8.

Although, on re-read, maybe they meant there's a good chance another application already installed it? This I wouldn't agree with, as applications often insist on installing different versions of the system-wide runtime, even for the same major version.


It doesn't usually happen (on Windows but realistically elsewhere too) unless you publish an application as self-contained.

To be specific, .NET install is version-aware and would manage those side by side, unless the destination folder is overridden.


> please then explain to me why Steam is so successful at deploying large games on multiple platforms?

If you look into the support forums on Steam for any random game, you'll find lots of complaints about stability and crashes, many of which are likely to be esoteric system-specific problems.


That’s a very ironic example given that the Steam Client is a web-app hosted in Chromium.


I get the impression this is said by people who observe large enterprises or don't have good device management. I work for a big company that deploys proper MDM to our Macs and PCs, and as long as the application is packaged correctly, they sure can push stuff out fast and without user intervention, including killing a running process if they so choose. Making an app that packages well is also hard, but that's not on those who push them to environments.


I love ClickOnce and am amazed that it never got that popular. Installs and updates are seamless.

I wrote an inventory management program for my father's hazardous waste company. Used .net mvc as the backend, WPF as the frontend. WPF sucks in many ways, but I got it working and have had zero complaints.

I built that program almost 20 years ago. Until last year I had spent maybe 5-6 hours total on maintenance, mostly SSL cert updates. Let that sink in.


> it's preinstalled on the operating system.

Not on my computers, at home or at work.


If someone had made that statement in a blog without referencing a research paper, the first comment on the post would have been "where's your research? I want a double-blind study!"


Like the seagulls in Finding Nemo, crying “source? Source? Soooource?”


My suspicion is that Amazon -- in addition to having paid product placement -- probably has engagement metrics. If you have to dig into the details on every product in the search to find exactly what you're looking for, that increases engagement.


Just as black people have claimed the "n" word, white racists have now claimed the "w" word.

Still not sure it was a fair trade though.


>Still not sure it was a fair trade though.

It's never a fair trade. But at least one is a singular word you never have to use in a discussion. The other was a term that de-humanized people.


People of all backgrounds who hate Asians, Jewish people and white Americans use the term. Including some members of those groups.


It's the same kind of people who would design a city with no toilets or waste water treatment. Everything is easy when you ignore half of the requirements.

