According to this paper https://www.lingref.com/cpp/decemb/5/paper1617.pdf the natural linguistic evolution towards compounds in Chinese was well under way by the time of Middle Chinese (~800 CE). And most of the cultural exchange with Japan happened after that.
That’s true, but many Japanese terms for describing the modern world were mass-imported into Chinese during the late Qing and Republican eras (think 1900-1930).
The Chinese republicans looked towards Japan as a model for how to develop, and part of that was importing new lexicon wholesale, since Japanese scholars had already helpfully translated Western concepts into new compounds during the Meiji era.
And let's not forget the enormous number of English (and other) words imported into Japanese that look crazy to a non-Japanese speaker, but once you sound out the katakana you realize it's just English (e.g. ホテル / hoteru = hotel, シングルルーム / shingururūmu = single room).
One interesting thing about gikun is the widely different forms it can take according to the stylistic purposes of the text.
- Most of the time it's simply a pragmatic way to introduce a clarification without breaking the flow of the text, essentially a more concise form of parenthetical or footnote.
- In classical poetry it is used for a variety of effects, for example novel synecdoches. One side of the gikun might refer to a season, and the other side might refer to a key detail the poet idiosyncratically associates with that season.
- But the contemporary Japanese learner usually notices them the most in fantasy/sci-fi manga and novels. In this genre it's used to introduce in-universe jargon while showing its meaning in parallel. At the extreme, it can allow writers to go over-the-top with how much special jargon the universe includes, without slowing down the pace of storytelling. (This can pose quite a challenge for translators!)
Dan's point about being aware of the different levels of inequality in the world is something I strongly agree with, but that should also include the middle-income countries, especially in Latin America and Southeast Asia. For example, a user with a data plan with a monthly limit in the single-digit GBs, and a RAM/CPU profile resembling a decade-old US flagship. That's good enough to use Discourse at all, but the experience will probably be on the unpleasantly slow side. I believe it's primarily this category of user that accounts for Dan's observation that incremental improvements in CPU/RAM/disk measurably improve engagement.
As for users with the lowest-end devices like the Itel P32, Dan's chart seems to prove that no amount of incremental optimization would benefit them. The only thing that might is a wholesale different client architecture that sacrifices features and polish to provide the slimmest code possible. That is, an alternate "lite/basic" mode. Unfortunately, this style of approach has rarely proved successful: the empathy problem returns in a different guise, as US-based developers often make the wrong decisions about which features and polish are essential to keep and which can be discarded for performance.
> That's good enough to use Discourse at all, but the experience will probably be on the unpleasantly slow side. ... an alternate "lite/basic" mode
Why does this need to be the "alternate" choice though? What does current Discourse provide that e.g. phpBB or the DLang forum do not? (Other than mobile-friendly design, which in a sane world shouldn't involve more than a few tweaks to a "responsive" CSS stylesheet.)
I like the scroll view in Discourse. Makes it super easy to follow a thread. The subthreads and replies are also easier to use. The search is better, the ability to upvote makes it better for some use cases, and in general phpBB is a mess in terms of actually being able to see what's useful and which threads are relevant.
I think flipping the question makes more sense: why do you think some forums switched to or started using Discourse instead of just using phpBB? I can guarantee you that it's not just to follow a fad or whatever; most niche or support forums don't care about that.
I was thinking about this when I saw this post earlier today.
Why shouldn't the default be: does this website work in Lynx? I think that's a damn good baseline.
And in response to the other parent post, on an (almost) new iPhone, both news sites & Twitter continuously crash and reload for me. I'm not sure what the state of these other popular sites is because I don't use them.
What do you mean by voice and video?
Why would I want to have voice in a forum? I think that would be akin to receiving voice messages in messengers.
Or do you mean, that for these kinds of things a widget can be displayed? That certainly is possible in old style forums. It is just HTML, an embed code away.
> For example, a user with a data plan with a monthly limit in the single-digit GBs
I live in a poor Southeast Asian country.
People with small data plans don't use data from efficient websites, they use wifi which is omnipresent.
30GB of data on a monthly plan is $3.64. Which is about 4-6 hours of minimum wage (minimum wage is lower in agricultural areas).
But more to the point, people don't use data profligately like in the West. Every single cafe, restaurant, supermarket, and mall has free wifi. Most people ask for the wifi password before they ask for the menu.
I've never seen or heard anyone talk about a website using up their data too fast.
It honestly sounds like a made up concern from people who've never actually lived in a developing country.
People here run out of data from watching videos on TikTok, Instagram, and Facebook. Not from website bloat.
> Every single cafe, restaurant, supermarket, and mall has free wifi.
I live in a major city in the Philippines, and free WiFi is becoming more of a rarity nowadays. Not even Starbucks and other big chain restaurants, malls, and cafes offer WiFi anymore because of how widely available data is. They expect you to bring your own data and tether if you want to browse or do some work.
In more rural areas, WiFi is definitely not widely available. On the rare chance it’s even offered, it’s usually “piso WiFi” paid by the minute.
I think one way for first-world citizens to empathise with this is to recall how people behave on roaming data plans during overseas trips: one keeps to public WiFi as much as possible and keeps mobile data usage to a minimum, or for emergency purposes.
I mean, not using a data plan here in Northern Europe was me 11 years ago… and using it sparingly because video or songs would blow through it instantly was me eight years ago.
eh, idk. This is your anecdotal experience, there are others (like me) who have different ones
>It honestly sounds like a made up concern from people who've never actually lived in a developing country.
I once loaded a site that pulled in an approx 324 MB "super resolution" image (I knew it was high res, but I thought it was 30-40 MB at best). That took care of 1/3rd of my monthly data in a single page load.
A useful feature of uBlock Origin is being able to block all media elements larger than a given size, such as 50 KB. I wish I could set it to only do this on mobile networks and let wifi stay unlimited.
"It honestly sounds like a made up concern from people who've never actually lived in a developing country."
You mean, the one developing country you live in.
You are also missing the full spectrum of users. People don't just browse the web for fun. They look for important information, like health or financial details; they might not want to do that in a public place, or they might not be able to put it off until they next have wifi.
If you are building an e-commerce website it might not matter, but you could be building a news site, or any number of other things.
If all the sites got more efficient, it might also increase the longevity of laptops and PCs whose unsavvy owners would otherwise decide they "need a new computer, it's getting slow".
Also applies to bloatware shipped with computers. To the point where I was offered a $50 “tune up” to a new laptop I purchased recently. Imagine a new car dealer offered you that!
I worked at a now-defunct electronics store (not Fry's in this instance) in the early 2000s that offered this "tune-up": it was to remove the stuff that HP and Dell got paid to pre-install, and to fully update Windows and whatever else.
We'd remove the McAfee nuisance popups and any browser "addons" that were badged/branded. And IIRC we charged more than $50 for that service back then.
For the performance boost it could offer the unsavvy user stuck on an HDD, it was probably worth it to many. Gross to be the middleman, but it is what it is.
Another computer shop I worked in charged $90 for virus removal, but we also eventually made it policy to just reformat/reimage the drive and remove all the crap and fully update the OS. Prior to that the policy was "remove viruses, remove crapware, update OS", but we had a few customers whose machines had 30,000 viruses. I forget what the record was, but it was way up there. Trying to clean those machines had a marginal failure rate, enough that it was costing the owner money to have us repeatedly clean them without payment.
No one wants to tell a customer that they need to find better adult content sites, and that we won't be cleaning their machines without payment anymore!
"just reformat/reimage the drive and remove all the crap"
And that is not more work?
It was usually the way I did it, too. But this requires checking with the owner which apps are important, what preferences to save, where the important files are stored (they never know), etc.
What’s the financial incentive in that? Manufacturers ideally want you to buy a whole new device every year, they don’t want you repairing or extending the life.
Some of these sites are un-fucking-bearable on my gen old iPhone.
And if I'm in a place with a shitty signal, forget about it, this problem is 10 times worse.
I’m not even talking about the cluttered UI where only a third of the page is visible because of frozen headers and ads, I’m talking about the size of the websites themselves that are built by people who throw shit against the wall until it looks like whatever design document they were given. A website that would have already been bloated had it been built correctly that then becomes unusable on a slow internet connection, forget slow hardware.
All that is to say, I can’t imagine what it must be like to use the internet under the circumstances in which you described.
I can only hope these people use localized sites built for their bandwidth and devices and don’t have to interact with the bloated crap we deal with.
I really wish all software developers had to have 10 year old phones and computers and a slow 3G connection as their daily drivers. It might at the very least give them some empathy about how hard it is to use their software on an underspec machine.
I'm in Canada and have a triple-digit plan, in MBs. It's for emergency use only. It would be nice if something as simple as checking on power outages didn't chew up a good portion of the data plan.
Yeah, different people need different things out of their phones. Yet the point remains that stingy data plans still exist in developed countries. Even though people may have better devices than those mentioned in the article (it is easier to justify a one-time expense than a recurring one), there are people who are stuck with them for various reasons. Affordability is definitely one of the reasons.
Even so, we should avoid pigeonholing those who have limited access to data as poor people. There are other reasons.
In the mid-'00s, I had ADSL with IIRC ≈300 MB included in the monthly payment, and an extremely predatory rate over the limit. I used to stretch it for 3 weeks out of a month by browsing with images disabled (with the bulk of my bandwidth spent on Warcraft 3).
That would last for a few hours of lightweight (not YouTube/images/etc.) browsing now.
In another world this mode dominated UI/UX design and development and the result was beautiful and efficient. Where design more resembles a haiku than an unedited novel.
We don't get to live in that world, but it's not hard to imagine.
It's not even just the middle-income countries—I have an iPhone 13, so only three years old, on a US wifi connection with high speed broadband, and it can't handle the glitzy bloat of the prospectus for one of my ETFs. I don't understand why a prospectus shouldn't just be a PDF anyway, but it baffles me that someone would put so much bloated design into a prospectus that a recent phone can't handle it.
> The only thing that might is a wholesale different client architecture that sacrifices features and polish to provide the slimmest code possible. That is, an alternate "lite/basic" mode. Unfortunately, this style of approach has rarely proved successful
But it is gaining popularity with the unexpected rise of htmx and its 'simpler is better even if it's slightly worse' philosophy.
My point exactly. By making your website fast and light, you make it easier and more pleasant to use. HTMX has a limited set of actions that it supports, so it can't do everything that people typically want. It can do more than enough though. (remember websites that actually used the `<form>` element?)
These phones aren't just in the Developing World, though. This is a USA problem too.
I work with parolees and they get the free "Lifeline" phones the federal govt pays for. You can get one for free on any street corner in the poor 'hoods of the USA. They are the cheapest lowest spec Android phones with 15GB data/month. That data is burned up by day 3 due to huge Web payloads and then the user can't fill out forms he needs for jobs/welfare and can't navigate anywhere as he can't load Maps.
I’m curious how quickly the data would be used up if only using it for the intended forms/jobs/welfare. I wouldn’t be surprised if the data lasted barely any longer due to bloat.
When I had one of these the data only lasted me 4 days. I didn't even purposely watch any videos, but I did read a lot of news articles and many "magazine" style sites have huge video payloads that load (and sometimes play) in the background if you're not running something like uBlock. I found some sites with 250MB home pages :(
For what it's worth I've gone the opposite direction (one language, 70k Anki reps). For me, carefully adding context has largely felt like time wasted at card creation time (which can be a surprisingly large proportion of study time per card, given how brisk reviewing usually is) and I've been bothering with it less and less. The default simple cards my dictionary plugin creates are usually good enough for me. I now mostly go out of my way to add context on the front of the card when it's a specialized word almost always seen within that context (so there's zero added value in learning it independently).
I do agree with the general idea that laziness and going easy on yourself is good though. I give myself quite a lot of slack when grading my answers, applying a "my understanding of this word is close enough to avoid confusion in practice" threshold rather than some impractical ideal of native-level mastery.
> For me, carefully adding context has largely felt like time wasted at card creation time
I actually had several custom tools that heavily automated card creation—I could grab a sentence from a web page, or bulk import highlighted phrases from an ebook. Then I had a UI which allowed me to easily highlight an interesting word, and either cloze it, or add a Wiktionary definition on the back. Then I had an Anki plugin to bulk import the cards. This could all obviously be combined into a single tool, and occasionally someone tries.
For my most heavily automated experiment, I used a tool similar to subs2srs to import sound, bilingual subtitles, and tiny screen captures from 4 episodes of Avatar: The Last Airbender. That was a fascinating experience, and I'm still earwormed with the dialogue of those episodes a decade later, after only a couple of months of Anki reviews. (See elsewhere in this thread for a link.)
Unfortunately, I'm not convinced that there's a good startup market for language-learning tools. Language learning is normally aspirational, much like a gym membership. And customers don't have any serious plans on how to reach their stated goals. (Again, like a gym membership.) Duolingo isn't terrible, but I suspect—based on lots of Anki experiments—that it should be possible to build much more effective tools than Duolingo. I'm just not convinced that anyone but serious ESL students would pay for them. Too many genuinely good tools in this space have sunk quietly, despite user-friendly UIs and good landing pages.
I guess I am one of the few who would pay for a good tool aimed at serious learners, but as I could not find any (in the language space almost everything is a Duolingo clone), I am building one myself.
So far I think I have proved my main hypothesis: it works really well for me. It's not magic, but it saves me many hours of learning.
Whether it could work for someone else is an entirely different story... but on the other hand, I have no big ambitions.
Duolingo and spaced repetition flashcards serve different purposes. Flashcards can’t teach you grammar or pronunciation, while Duolingo does a bad job at actually teaching you a large set of vocabulary. Using both of them is probably a good plan, though even better is flashcards + a real class.
Flashcards can teach you grammar. Explicitly, by writing explanations of various grammatical phenomena on flashcards. Implicitly, with usage examples that demonstrate how the grammar works.
Flashcards can teach you pronunciation.
Explicitly, by writing explanations of how to produce various sounds on flashcards. Implicitly, with recordings that demonstrate how things are pronounced. (Ok, paper flashcards can't do that, but we're not talking about those, right?)
I got an HSK 1 deck on Anki and it has audio and the pinyin has tone marks so I am learning pronunciation. It is also giving me grammar structures in the first week that Duo didn't in months. I truly feel Anki is better and that Duo is the one that only teaches vocab.
If I need an explanation I just paste it into ChatGPT and ask for a breakdown.
If you have better tooling, you can add cards way faster. My project (https://github.com/FreeLanguageTools/vocabsieve/) is a tool to help you make sentence cards nearly effortlessly, or even convert ereader highlights, which probably averages out to a few seconds per card created.
Anki FSRS moves closer to being a "just-in-time" algorithm based only on user-provided inputs. And although its data structures aren't strictly modular, they come as close as practical to that ideal while still remaining compatible with legacy Anki decks and extensions.
In practice, that's illustrated by the fact that there's now a button to fully recompute all intervals and difficulties based only on your history and the current algorithm tunings. And if you've already been using FSRS and the tunings haven't changed, the recomputation won't have any effect because it's equivalent to the incremental computations after each review.
So in principle it could be thought of as a just-in-time pure function, which involves a cache of generated data only for legacy & performance reasons.
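To make that concrete, here's a toy sketch in Python (not the real FSRS math; `step`, `replay`, and the parameter names are all hypothetical): if the per-review update is a pure function of the previous state and the grade, then a full recompute is just a fold of that same function over the review log, so with unchanged tunings it necessarily reproduces the incrementally maintained state.

```python
from dataclasses import dataclass
from functools import reduce

@dataclass(frozen=True)
class CardState:
    interval_days: float
    difficulty: float

def step(state: CardState, grade: int, params: dict) -> CardState:
    """One incremental update after a single review (grade 1=again .. 4=easy).

    Toy update rule for illustration only -- not the actual FSRS formulas.
    """
    ease = params["ease_step"] * (grade - 2.5)
    return CardState(
        interval_days=max(1.0, state.interval_days * (1.0 + ease)),
        difficulty=min(10.0, max(1.0, state.difficulty - ease)),
    )

def replay(history: list, params: dict) -> CardState:
    """The 'recompute everything' button: fold step over the entire log."""
    initial = CardState(1.0, params["initial_difficulty"])
    return reduce(lambda s, g: step(s, g, params), history, initial)

params = {"ease_step": 0.4, "initial_difficulty": 5.0}
history = [3, 4, 2, 3]

# Incremental path: update the cached state after each review as it happens.
state = CardState(1.0, params["initial_difficulty"])
for grade in history:
    state = step(state, grade, params)

# With the same tunings, the full replay is a no-op relative to the cache.
assert replay(history, params) == state
```

Change `params` and the replay yields a different, consistently recomputed state, which is exactly what retuning plus the recompute button does.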
I unfortunately can't find it right now, but I remember seeing a semi-famous quote from the 1950s/60s? calling out variable-speed windshield wipers as an absurd consumerist luxury emblematic of what's wrong with America.
The refrigerator camera sounds like the same kind of thing. Modestly useful feature that may well become standard-issue someday because the underlying components can be made very cheaply at scale.
10x over 75 years is a ~3% risk-free annual return. It looks like ordinary government long-term bonds (like 30-year treasuries) were yielding around 4% in 2008, making this look at first glance like a good deal for Chicago.
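As a quick sanity check on that ~3% figure (back-of-the-envelope, assuming simple annual compounding):

```python
# A 10x total return over 75 years implies an annualized rate r such that
# (1 + r) ** 75 == 10, i.e. r = 10 ** (1/75) - 1.
r = 10 ** (1 / 75) - 1
print(f"annualized: {r:.2%}")  # about 3.1% per year
```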
However, municipalities with a good credit rating can usually issue bonds that pay out about a third less yield than treasuries, thanks to favorable treatment of municipal bonds in the US tax code. I don't know how healthy Chicago's rating was in 2008 but even if somewhat mediocre, it's likely they could've gotten 3% or less on the bond market.
So at a minimum there was no advantage for the city in signing up for a sweetheart deal with strings attached, instead of covering the shortfall by issuing a bond.
This deal was secured by something other than Chicago's taxing power, so it should have less effect on Chicago's creditworthiness and ability to issue other bonds.
In any case, if it was 3% it's economically very comparable to issuing a regular bond and not the "fleecing" that's talked about. If the city regrets the deal, it should be able to issue a regular bond and use it to buy out the investors, or issue regular bonds annually and use them to pay the penalties, all at roughly a wash economically.
I'm not sure 3% is correct, though. I'd like to see another source on that. It reads like it has some protection against inflation. If it's 3% + inflation, that's a really enormous return.
Brooks' definition of a silver bullet was "an order of magnitude improvement" and certainly none of those innovations individually come anywhere close. But I agree it's plausible that combined they might amount to one single order-of-magnitude improvement (as compared to average early-1980s development), for the subset of organizations that apply all of them together effectively.
In my personal experience at my workplace I've certainly observed many times that the quote "Adding manpower to a late software project makes it later." is not universally true. When things go well, onboarding an experienced engineer and assigning them a parallelizable task has often proved a relatively lightweight process on teams I've been on, and I've seen the new member's productivity rapidly exceed their communication burden on the rest of the team.
That might be explainable by how several of the innovations you listed reduce the need for communication between humans. For example, we now need less communication about regressions since CI/CD alerts the responsible party immediately, and we need to ask fewer questions when individuals can examine the entire history of a codebase in version control themselves.
(That doesn't mean Fred Brooks's work is now obsolete; this story is consistent with his underlying analysis of the problem.)
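For what it's worth, the quadratic growth underlying Brooks's argument (n(n-1)/2 potential communication channels in a fully connected team of n people) is easy to see in numbers; tooling like CI/CD and version control reduces the traffic on many of those channels rather than removing the people. A quick illustration (`comm_channels` is just a name I made up):

```python
def comm_channels(n: int) -> int:
    """Potential pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in (3, 5, 10, 20):
    print(n, comm_channels(n))  # 3 -> 3, 5 -> 10, 10 -> 45, 20 -> 190
```

Going from 10 to 20 people more than quadruples the channels, which is why each new member's productivity has to outrun their added coordination cost.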
An often under-appreciated part of the original paper is effectively libraries - code you don’t have to write yourself.
This is absolutely the biggest contributor to productivity improvements over that time period, and I think we don't always make good use of them (NIH syndrome, "it's not written in my favorite language", etc.). Seriously, if there is a good-quality library for the task already, just use it! That's why Python is where it is today in the data science community, and it's also the reason why I will start any new project on the JVM unless there is a good reason not to. (The third contender for the biggest ecosystem is JS, and while it has a similarly vast number of packages, I've found their average quality lags behind the JVM's.)
What fraction of late projects have all those ingredients in place and readily identified, parallelizable tasks waiting to be started? I feel like part of the Brooks quote's premise is that late projects got that way for some reason, and those reasons are rarely amenable to more bodies. (That is to suggest: "if you have a choice to add programmers to a late project or an on-schedule one, you might be better off doing the latter if overall delivery is your goal.")
Well, "need" is subjective. But subpixel rendering has diminishing returns as DPI goes up and (as the article describes) it causes several performance and interoperability problems, so the appropriate DPI threshold to turn it off might reasonably be somewhere lower than parity with print. Everyone agrees it's a good idea on cheap low-dpi displays, but whether it's still worth the costs on so-called "retina" displays is a judgment call.
PPP is for comparing consumer lifestyles. World-tradeable goods like military hardware are precisely the kind of situation where PPP adjustments should not be made.
Of course the sums are not directly comparable for other reasons, such as corruption in Russia and procurement dysfunction in the US. But those factors would need to be estimated explicitly.
(EDIT: Thanks for the replies, I hadn't heard that there also exist specific estimates of "defense-sector PPP", so I acknowledge what I wrote above was wrong.)
Is the “world-tradeable” aspect so relevant? Nuclear warheads are going to be built locally and would be sent on missiles that would also be built in Russia. The PPP argument is that it’s much cheaper to pay a bunch of rocket engineers and nuclear scientists in Russia, in rubles, than in Los Alamos, in dollars. Same for the whole logistics chain from raw materials to final assembly, upkeep, and maintenance. If your steel is dirt cheap because you’re full of ore and gas is plentiful, and your workers are paid peanuts, then your tanks are going to be cheap, at least for your own use, regardless of how much they cost in another country. So why would PPP be irrelevant in the case of military hardware?
A Russian soldier is paid less than an American one. A Russian factory worker making military equipment is paid less than their American counterpart, leading to a lower cost for that equipment. I don't see a problem describing that difference as PPP.
What a soldier or worker is paid only matters in a vacuum; you have to consider the whole pipeline from military budget to the actual good or service. That includes both commodities bought on the world market (e.g. steel) and corruption raising costs for the same good (which is a problem for both the US's military-industrial complex and Russia's oligarchy).
That being said, when the budgets are off by an order of magnitude, there's no way to make that back just by lowering wages a bit.
What's the cost of commodities in building a tank? Yeah, fine, both an Abrams and a T-90 have a few tonnes of steel in them. That accounts for only a fraction of the final cost, though.
Maintenance requires labour. Labour has different prices in different countries. You can compare that labour cost.