Hacker News | 3wayMerger's comments

Location: Orange County, CA

Remote: Yes

Willing to relocate: No

Technologies: C#, JavaScript, C++, CSS, HTML, Elixir, ASP.NET, WPF

Résumé/CV: https://drive.google.com/file/d/1s_6gtw4e6mhBnr2kARbc3JsiJyb...

LinkedIn: https://www.linkedin.com/in/scott-kim-66b39a117/

Email: kimsaehun@tuta.io

I'm a SWE with 3 years of experience working with C#, WPF, ASP.NET and a touch of C++ and MFC. My primary motivation for looking for a new position is to join a team that will help me grow as a SWE.


What if, instead of Bezos paying more taxes, he had to pay his workers more? I think it's wrong to just force Bezos to give away his wealth. However, I also think that some Amazon workers should be paid more.

I don't think Bezos would have had the same amount of success with Amazon if he had to do everything himself without hiring any other people. I believe most successful achievements were achieved through the work of many people. In the villagers-and-apples example, if the one man were able to produce 1000 apples by himself, I agree that he should be able to keep all 1000 apples to himself. However, if the one man was only able to produce 1000 apples with the work of others, I think that the workers should receive a "fair" share of the produce. I don't have a definition of what "fair" would be, but 266,666 to 1 apple doesn't seem very "fair".

I don't see Bezos as being bad or evil, but I see the fact that his workers don't get paid more as unfair. It's kind of like the "thank you, essential workers, for keeping society functioning during the pandemic" phase that America had last year. We call these workers "essential" but their pay doesn't reflect that title. Some of the work that these "essential workers" do might be simple unskilled labor, but it is necessary work. Even if the work itself might not be something that demands better pay, I believe the fact that the work is necessary is something that demands better pay. And if one person has 266,666 apples, I think there are enough apples to go around to pay the "essential workers" more. I just don't think it's "fair" that some "essential workers" who are earning close to minimum wage will never have the same opportunities that others in more lucrative fields (e.g. software engineering) will, because of financial limitations.


>What if, instead of Bezos paying more taxes, he had to pay his workers more? I think it's wrong to just force Bezos to give away his wealth. However, I also think that some Amazon workers should be paid more.

Amazon already pays the most for this type of work. So how much should they pay?

>We call these workers "essential" but their pay doesn't reflect that title.

The work is essential but the workers are easily replaceable, with the exception of medical staff whose pay is appropriate.


As mentioned before, I don't know what "fair" pay would be. Maybe it could be based on a percentage of the total revenue? But one man being paid 266,666 apples while others get paid 1 apple doesn't seem "fair", especially if the 266,666 apples were only possible with the combined work of others.

And yes, the work is essential and the workers are easily replaceable, because the work is something practically anybody could do. My point is that because the work is essential, the easily replaceable workers should be paid more. Regardless of who does the work, whether it is some random civilian off the street or Bezos, somebody has to do the work. That's why it's essential. The work itself is valuable, so I believe whoever does the work should be paid based on the value of the work.


Bezos isn't "paid" his net worth. His compensation was actually very low compared to his net worth, only $1.6 million in stock compensation in 2019 (and $82k cash). His net worth comes from his 10% stake in Amazon that he's pretty much had since he founded the company. So, he isn't paid 266,666 apples, he owns 10% of a giant apple farm that he founded.


Yes, I understand, but even if he isn't necessarily being paid 266,666 apples, he has access to that much wealth. I mean, Bezos is able to start his own space expedition, which I don't think many people can do. While that's happening, we have people who are doing essential work getting paid close to minimum wage. Their net worth is probably some insignificant amount. These essential workers probably won't even be able to own a home until well later in life, if ever.

Maybe I'm just too immature and ignorant but the fact that Bezos can start his own space expedition while Amazon workers have to pee into bottles and get paid $15 per hour just doesn't seem fair. And paying people the minimum amount simply because they are replaceable makes it seem like people are treated as if they were resources. Maybe in terms of logistics people can be seen as resources, but my opinion is that people and their lives are not resources. I don't think people go to work and spend a good chunk of their life working just to be a resource for someone else to take advantage of.


Maybe I'm missing the point but could you clarify a bit more on APL's notation vs J's notation?

Speaking as someone who is not very mathematically inclined and who was born in an Asian country, both APL's special characters for verbs and J's alphabetical characters for verbs are similar enough to me. Both languages use symbols for verbs; it's just that J's symbols happen to very closely resemble the characters of the English alphabet.

Although, due to the familiarity of the English alphabet, J's symbols might intuitively bring to mind the alphabet characters, is it not possible to just think of them as new mathematical symbols? For example, instead of seeing "o." as the alphabet character 'o' followed by a period, couldn't it be seen as a circle followed by a dot? Or if we lived in a world where the characters of the English alphabet were swapped with the special characters of APL, would J's notation still be broken? Does familiarity with the symbols used in a notation make it any less powerful?

Maybe the reason I don't understand is that I haven't tried APL and have only tried J. And I eventually gave up on learning J because it was starting to get too difficult for me. Would it be possible to explain the differences between APL's notation and J's notation in an easier or simpler fashion?


APL’s verbs are geometrically suggestive. They are little pictures that represent what they do, and how they are related to each other. For example, ⌽ reverses an array along its last axis; you can see the array flipping around the vertical line. And you will know what ⍉ does without looking it up, I bet. These symbols are so well designed that you don’t have to memorize much, because they document themselves.
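Not APL, but as a rough sketch of what those two verbs compute, here is the same idea on a plain Python list of lists (the APL glyphs in the comments are the ones described above):

```python
# Illustrative sketch (not APL): what ⌽ (reverse along the last axis)
# and ⍉ (transpose) do, shown on a nested Python list.
matrix = [[1, 2, 3],
          [4, 5, 6]]

# ⌽ matrix — reverse each row (the last axis)
reversed_rows = [row[::-1] for row in matrix]
print(reversed_rows)  # [[3, 2, 1], [6, 5, 4]]

# ⍉ matrix — transpose rows and columns
transposed = [list(col) for col in zip(*matrix)]
print(transposed)  # [[1, 4], [2, 5], [3, 6]]
```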


Couldn't the same be said of J? If APL's powerful notation comes from not having to memorize much and being a good visual representation of what the verb does, doesn't J's usage of alphabetical characters achieve something similar albeit a bit worse? For example "i." for index and related functions. Since the letter 'i' is usually used for indexing, one could assume that "i." is something related to indexing. Does the usage of alphabetical characters weaken the notation so much that it could be considered an abomination?
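For what it's worth, a rough Python analogy for an "index of" verb like J's i. (J's actual i. is richer and works on whole arrays; returning len(haystack) for "not found" mirrors the array-language convention):

```python
# Rough analogy for an "index of" verb such as J's i.
def index_of(haystack, needle):
    """Return the index of needle in haystack, or len(haystack) if absent
    (the 'not found' convention used by array languages)."""
    try:
        return haystack.index(needle)
    except ValueError:
        return len(haystack)

fruits = ['apple', 'banana', 'cherry']
print(index_of(fruits, 'banana'))  # 1
print(index_of(fruits, 'durian'))  # 3 (not found -> length of the list)
```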

If there was another language that was a copy of APL but with new non-alphabetical symbols that were less suggestive than the original APL symbols, would that language be considered to have a less powerful notation? If so, how much weaker would it be considered? What would the symbols of a language that is APL-like and uses non-alphabetical characters, but would still be considered an abomination look like? Would that language be considered to have a more powerful notation than J?

This might be a bit of a stretch, but I'd like to use the symbols on a media player as an analogy. The symbols on a media player (play, pause, resume, seek back, seek forward) could be compared to APL's symbols. Then, for the J version of the media player, rather than the symbols, there could be "Pl", "Pa", "Re", "SB", "SF" or something of the sort. I would say that APL's symbols do look nicer, but I don't think J's usage of alphabetical characters should be considered an abomination. If so, wouldn't all text GUIs (e.g. command-line file managers such as nnn or Midnight Commander) be considered an abomination compared to their regular GUI versions?

Maybe I'm not looking at the right thing here but APL's and J's notation seem to be similar. One does look better than the other, but both seem to serve the same purpose.


I’ve only glanced at J and never used it, so I don’t have any strong opinions about it. But APL just has that extra magic that J seems to lack. Notation does matter. It could be that I’m partially sentimental, as it’s the first programming language that I learned.


> Maybe I'm not looking at the right thing here but APL's and J's notation seem to be similar. One does look better than the other, but both seem to serve the same purpose.

Not sure if it is possible to understand this without the context of being well versed in another means of communication that uses specialized notation. Musical notation is an easy example of this. Mathematics could be another. And, of course, languages that don't use the Latin alphabet. Outside of APL, I happen to be fluent in musical notation and one non-ASCII spoken language, as well as having the mathematical background.

The closest I can come to explaining what happened with J is that they did their best to convert every APL symbol into an equivalent combination of ASCII characters. Here's the key:

They did NOT do this because Iverson thought this was a better path forward. He did not abandon thirty years of history creating and promoting notation because mashing ASCII characters together was a better idea. He did this because computers of the day made rendering non-ASCII characters a pain in the ass. This got in the way of both commercial and open adoption of the language. He likely genuinely thought the transliteration would bring array programming concepts to the masses. It did not.

In the grand context of computing, J is a failure and APL suffered greatly when its creator and primary evangelist abandoned it.

Imagine a world where people are writing perfectly legible code in C, Basic, Pascal, etc. Now imagine someone proposing the use of seemingly random arrangements of ASCII characters instead of those languages. It's like telling everyone: Stop programming in these languages! We are all going to program in something that looks like regex!

Well, the rest is history. The proof is in the fact that APL is but a curiosity and J isn't a commercially viable tool. Yes, they both exist in corner-case applications or legacy use. Nobody in their right mind would use either of them for anything other than trivial personal or academic applications. That's coming from someone who used APL professionally for ten years and even envisioned a future creating hardware-accelerated APL computing systems at some point. It's computer science history now.

I still think it should be taught (along with FORTH and LISP) as there's value in understanding a different way of thinking about solving problems computationally.

As an extension of this, part of me still thinks that the future of computing might require the development of specialized notation. For some reason I tend to think that working at a higher level (think real AI) almost requires us to be able to move away (or augment) text-based programming with something that allows us to express ideas and think at a different level.


Thanks for taking the time to reply. I think I'm beginning to understand but am not quite sure.

While I wouldn't consider myself fluent in any of the following, I do know how to read musical notation (from middle school/high school band) and I can read/write/speak a non-ASCII language (Korean). So I am somewhat familiar with non-ASCII notation.

> The closest I can come to explaining what happened with J is that they did their best to convert every APL symbol into an equivalent combination of ASCII characters.

This is the statement I keep getting stuck on. From what I have read, besides the symbols being converted to ASCII characters, APL and J are generally the same. Both work on arrays, both are parsed right to left, etc. It seems like the only major change is that the symbols got converted to ASCII sequences that are at most 2 characters long. If this is the case, what would you say about the J language's notation if the authors one day decided to change all the symbols to non-ASCII characters? Everything else would stay the same, such as what the symbols do and how much space the symbols take up (max 2 characters). If the J language were to change only its symbols and nothing else, would its notation be considered on par with APL's?

As you mentioned, my lack of proficiency in other specialized notation might be preventing me from understanding the issue. That said, your last set of comments strikes a chord with me and I do think I kind of understand. As you mentioned previously, notation is "a powerful tool of both expression and thought." The usage of specialized notations allows one to express their thoughts and ideas in a way that normal writing can't. But I guess this is where being well versed in the subject matter comes into play, since after all it is a "specialized" notation. It would be difficult for someone who doesn't have a strong background in the subject matter to take advantage of the specialized notation.

To me, with my limited knowledge and experience, J vs APL appears to be a symbol (graphical) design comparison rather than a notation design comparison. And as someone who doesn't have a strong mathematical background, both APL's and J's symbols conveyed nothing to me when I first saw them. Changing the symbols to non-ASCII or ASCII has no effect on me besides figuring out how I would input the non-ASCII characters. But I suppose that to you, a change in the symbols isn't something so superficial. The way I understand APL vs J now is that, for those who are experienced in APL, changing the non-ASCII symbols to ASCII characters, simply to avoid the trouble of inputting non-ASCII characters, "broke" the notation.


> what would you say about the J language's notation if the authors one day decided to change all the symbols to non-ASCII characters?

That's a very interesting question. I think the only possible answer has to be that this would return the language to what I am going to label as the right path. It would be wonderful.

APL is the only programming language in history to attempt to develop a notation for computing. Iverson actually invented it to help describe the inner workings of the IBM mainframe processors. Any hardware engineer who has ever read a databook for, say, an Intel (and other) processors has run into APL-inspired notation that made it into the language of explaining how processor instructions work. It's a remarkable piece of CS history.

> besides the symbols being converted to ASCII characters, APL and J are generally the same

Let's call it "notation" rather than "symbols". The distinction I make is the difference between a language and just a set of glyphs that are not entirely related to each other.

You might want to read Iverson's original paper on notation. It makes a very strong argument. Coming from the man who created APL, this is powerful. It also --at least to me-- tells me that his detour into J had to be motivated by business pressures. There is no way a man makes such an effort and defends a notation with such dedication for three decades only to throw it out for something that isn't objectively better.

I don't think we can find a paper from Ken Iverson that says something like "I abandoned three decades of promoting a new notation for computing and created J because this is better". You will find statements that hint at the issues with hardware of the era and the problems this created in popularizing APL.

Here's my lame attempt to further explore the difference. I don't know Korean at all. I just used Google translate and this is what I got for "hello from earth":

지구에서 안녕

I count seven characters, including the space.

Let's create J-korean because we are in the 80's and it is difficult to display Korean characters.

지 This looks like a "T" and an "L": So "TL".

구 This looks like a number "7" with a line across it: "7-"

에 This looks like an "o" with two "L"'s, one with a pre-dash: "O-LL"

서 This looks like an "A" with a dashed-"L": "A-L"

안 This looks like an "o" with a dashed-"L" and another "L": "OLL-"

녕 This looks like an "L" with two dashes and an "O": "L--O"

Space remains a space.

Here's that phrase in J-korean:

TL7-O-LLA-L OLL-L--O

It's a mess. You can't tell where something starts and ends.

OK, let's add a separator character then: "|"

TL|7-|O-LL|A-L| |OLL-|L--O|

Better? Well, just in case we can do better, let's make the space a separator. Two spaces in a row denote a single space:

TL 7- O-LL A-L OLL- L--O

We have now transliterated Korean characters into an ASCII printable and readable combination of characters.
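The whole transliteration as a minimal Python sketch (the table below is just the ad-hoc mapping invented in this comment, not any real romanization):

```python
# Hypothetical "J-korean" transliteration table from the example above
table = {
    '지': 'TL', '구': '7-', '에': 'O-LL',
    '서': 'A-L', '안': 'OLL-', '녕': 'L--O',
}

phrase = '지구에서 안녕'  # "hello from earth"

# Glyphs run together, space kept as-is: hard to tell where anything ends
mashed = ''.join(table.get(ch, ch) for ch in phrase)
print(mashed)  # TL7-O-LLA-L OLL-L--O

# With "|" as an explicit separator after every token
piped = ''.join(table.get(ch, ch) + '|' for ch in phrase)
print(piped)  # TL|7-|O-LL|A-L| |OLL-|L--O|
```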

Isn't this an abomination?

We destroyed the Korean language purely because computers in the 80's could not display the characters. We have now trifurcated the history of the language. Which fork will people adopt? Which will they abandon? Will all books be re-written in the new transliterated form?

Which of the above encodings (real Korean and the two transliterations) conveys, communicates and allows one to think in Korean with the least effort and the greatest degree of expressive freedom?

If I, not knowing one bit of Korean, expressed a strong opinion about J-korean being better because it doesn't use "funny symbols" I would not be treated kindly (and rightly so).

I don't know if this clarifies how I see the difference between APL and J. Had we stayed with APL's notation, evolved and enriched it over the last few decades we would have had an amazing tool for, well, thought and the expression of computational solutions to problems. No telling where it would have led. Instead Iverson took a path driven by the limitations of the hardware available at the time and managed to effectively kill both languages.

I happen to believe that the future of AI requires a specialized notation. I can't put my finger on what this means at this time. This might be a worthwhile pursuit at a future time, if I ever retire (I can't even think about what that means...I love what I do).

Here's Iverson's paper on notation. It is well worth reading. It really goes into the advantages of notation to a far greater level of detail than is possible on HN comments:

https://www.eecg.utoronto.ca/~jzhu/csc326/readings/iverson.p...


> I happen to believe that the future of AI requires a specialized notation. I can't put my finger on what this means at this time. This might be a worthwhile pursuit at a future time, if I ever retire (I can't even think about what that means...I love what I do).

I also share this opinion and I might know what you mean. A lot of breakthroughs in physics are due to new notation – e.g. Maxwell's equations, Einstein notation, etc. Or to be precise, it is easier to think new thoughts in a notation/language that is suited for them.

Current machine learning is a 90:10 blend of empirical work followed by delayed theory. The language for theory-based ML is math with paper and pencil. The languages for empirical experiments, however, are PyTorch, TensorFlow, JAX, DEX, Julia, Swift, R, etc. The former is a "dead" historical language; the latter are notations for differentiable computational graphs that can be run on modern HW accelerators. What those programming languages have in common is that they were all influenced by APL. And funny enough, machine learning is the best use case for APL, yet APL is practically non-existent there. APL should be reborn as a differentiable tensor-oriented notation. That would be wild – prototyping new ML architectures at the speed of thought.

Anyway, another angle for APL + ML is an alternative interface – write APL with pencil on paper, with a system that understands your handwriting and evaluates your APL scribbles. [0] I've committed myself to providing BQN [1] with such an interface and seeing where it leads.

Ideally, the best outcome would be the combination of both approaches.

[0] https://www.youtube.com/watch?v=0W7pPww6Z-4 [1] https://mlochbaum.github.io/BQN/


Thank you for the discussion. I am now convinced but unfortunately, I cannot confidently say that I deeply understand.

I've taken a shot at the linked paper, but it will take more readings to fully grasp what it's saying. However, between what I understood of the paper and the example you provided, it makes sense that J would be considered an abomination of APL.

Hopefully, after reading the paper a few more times and maybe even trying out APL, I'll have a better understanding. Thanks again for your time.

