Hacker News | frostwarrior's comments

As a web developer, I would really like to get involved in Open Source software and give back to the community a bit.

But the number of options is overwhelming tbh, and the skills needed for open source projects are far from the backend APIs the industry usually needs.


I had an interview a few days ago and I asked something similar. I asked about the possibility of traveling to the US to meet my coworkers and work alongside them.

They said they won't raise my salary and they expect me to stay in my region (Latin America). They also said that if I go to the US or Europe, it would be tough to live there on my remote salary.

It felt somewhat insulting tbh. As if they said "we want you as disposable cheap labor, not to become a part of the company" directly upfront.


Exactly. Alienation from the fruits of your labor. It's disgusting we expect people to tolerate it, and act like it's not a totally optional choice by management to exploit labor resources to get a better return on investment for the already wealthy. A modern take on colonization.


It's a choice by management, but it's an obvious choice.

We should not expect everyone to be a horrible person, only motivated by profit, but we also should not expect people to act against their own interests all the time.


Following profit motive doesn't make one a horrible person. People operate rationally within the rules they're constrained by (e.g. capitalism). I just think we are capable of making better rules.


On the other hand, the only reason the company hired someone overseas was to save money. They would not have even considered these candidates if they weren't hugely cheaper. Instead, the company would have hired from the local labor market.

What the submitter should be asking themselves is whether they can pull a higher wage working from their current location with any other company. If they can do that, they should. If they can't, and it's likely that they cannot because most companies are operating similarly, then they're getting a fair market rate for their situation. That's the free market.


Well, we agree that "cheaper labor" is desirable because it's more profitable to the company right? So let's apply transitivity to your first paragraph:

"The only reason the company hired someone overseas was to make more money. They would not have even considered these candidates if they weren't hugely more profitable. Instead, the company would have hired from the local labor market instead."

Yeah, what you are describing is practically the definition of exploitation of labor by capital, and alienation of the worker from the value they create.


> On the other hand, the only reason the company hired someone overseas was to save money. They would not have even considered these candidates if they weren't hugely cheaper.

Not the only reason: I have plenty of times hired people who had skills we needed and couldn't find locally. Sometimes brought those people over on H-1B, sometimes just hired them remote. And then if we had a few in the same area, opened an engineering office (though these days I am less likely to do that unless the folks really wanted it).


Yes, of course.

But it would be a great motivator if they actually bothered to relocate you in case you showed remarkable performance and added value to the company.

"We just want someone to work for cheap without prospect of progress" tells me that job will have a low ceiling for someone that wants to progress further their career


> On the other hand, the only reason the company hired someone overseas was to save money.

Ding ding ding - we have the winner here. In my company, all new headcount is from cheaper countries. We are no longer hiring in the US.


Maybe it's a joke, but a problem I see in every "open source self hosted alternative" is that people tend to underestimate how much work it is to self-host everything.

It's either paid hosting like AWS, some intermediate docker-compose solution or your own personal server machine. In every case someone has to do the gritty work. It's either a paid service, a volunteering open-source contributor, or you.
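Even the "intermediate" path carries real operational weight. A minimal docker-compose sketch for a hypothetical self-hosted app (the service layout and image names here are assumptions, not any particular project) already implies upgrades, backups, secrets, and networking that someone has to own:

```yaml
# Hypothetical self-hosted stack; image names and credentials are placeholders.
services:
  app:
    image: example/selfhosted-app:latest   # you now own tracking upgrades
    restart: unless-stopped
    ports:
      - "8080:8080"                        # you now own TLS/reverse proxying
    environment:
      - DATABASE_URL=postgres://app:secret@db:5432/app
    volumes:
      - app-data:/data                     # you now own backups
    depends_on:
      - db
  db:
    image: postgres:16
    restart: unless-stopped
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=secret           # you now own secret management
      - POSTGRES_DB=app
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  app-data:
  db-data:
```

Twenty lines of YAML, and none of the comments above go away on their own; that's the gritty work someone ends up doing.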


Imagine getting hit by a car.

Poor car.


That's the thought I had; at least the car would take some damage.


It's a similar weight to a small motorcycle.


My entire experience with MOBAs is that people will rip you to shreds for not doing or knowing every overly specific thing people didn't know either when they had less experience.


Or don't even know currently but just want to point fingers. Something about humans becoming anonymous really brings out a different side of us...


It's not really about being anonymous, as I've seen the same pattern when playing with people I know in real life, although playing with strangers removes a lot of restraint.

It's more that those games create a scenario where you are going to be frustrated for 20+ minutes because of mistakes of others, and you likely started playing because you're trying to get away from frustration IRL.


Unfortunately I can agree with this completely. I stopped playing Fortnite because of this. Although it can be fun without winning, when they introduced the victory crowns to reward multiple consecutive wins (along with an emote to rub it in everyone's face) it seemed like just enough to cause us all to start getting frustrated and screw up team morale.


Those checks and balances weren't in Tsarist Russia either.


I don't think that (2) is OP's fault.

Look the words up in a dictionary. It's ok not to know and to learn, but you shouldn't expect everyone to adapt to your lack of knowledge.


A dictionary probably won't give you a useful definition for most of those terms. They're jargon.

But then again I would add (3) they're just buzzwords that don't actually tell you anything...


Perhaps not HA, but 'fault tolerant' and 'extensible' are pretty apt.


....explain it like I'm 5 years old?


If everyone wrote software to be used by 5-year-olds without any interest in looking up things they don't understand (yet) themselves, we'd never move the field forward.


AutoHotKey is not normally aimed at developers but at highly skilled administrators. If you target your advertising at this market, you would likely get more traction.

Moving the field of software development forward is about solving more problems for people without putting a large learning curve in their way.

Yes, some aspects will always require expertise, but that is not an advantage.


A dictionary is good for learning words.


Also, up until 2015 Intel was on a great streak with their processors and didn't show signs of stagnation until around 2017/2018.

i5s/i7s were great at that time.


The 12” MacBook came out in 2015 and had terrible performance and a problem with overheating. Insider reports say that Intel had promised a lower power chip with better performance that Apple designed the MacBook for but then Intel killed that chip and Apple had to use another Intel mobile chip. Some people feel that was when the problem got real.


Apple would still have a lot of people who lived through the same situation during the IBM days.

If you're getting hints that your chip vendor is not aligned then you better have a backup plan.


Not entirely true, people watching and in the field knew. Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift. Similar with this move by Apple. There are people in those rooms, making the decisions, sharpening their ideas on HN — if we care to listen.


>Back in 2015 I made a massive bet on AMD because people on HN working in the field explained the arch shift.

I think you got really lucky. Zen1 didn't ship until 2017 and it lagged severely behind in single thread. You had no idea what AMD was going to have.

Even AMD would tell you that they were surprised that Intel fell so far behind. They've been quoted as saying this a few times.

Intel's 10nm node (equivalent to TSMC 7nm) was supposed to ship in 2016! Intel didn't ship anything on the desktop using 10nm until Alder Lake in 2021. A five-year delay.

Intel would have been well ahead of Zen 2 in node technology. Instead, it was around 1.5 nodes behind.

If you made your bet purely on what was said inside AMD in 2015, you just got lucky. No one knew that Intel would be stuck on 14nm for 7 years when they were planning for 2 years.


There were a ton of Intel engineers at the time complaining about management in a different thread.

I'm sure luck was involved (so many things could go wrong). But I tend to make money on bets based on what I hear on the fringe. AMD, Bitcoin, etc.


The problem is how do you separate the gold from the cruft? Seems impossible. Also, so much cruft makes you miss the gold as well. :/


I actually wrote software to do that lol

Built this: https://insideropinion.com/

But use it for investments.

You can't completely remove risk, but I invest in areas where insiders discuss their work publicly. It provides insight that fundamentals often lack, leaving massive potential upside.


Stalking as a Service? How could that possibly go wrong?

I’m equal parts horrified and amazed. And curious what The Algorithm thinks about my ramblings.


From my understanding, it is way more costly to miss the gold than to get some cruft.


They had already plateaued in 2014/2015 with Haswell/Broadwell. They've basically been releasing that same CPU with minor tweaks to power consumption and codec support for 8 years now.

At the time, it was hard to notice, but reviews at the time absolutely noticed the minor CPU update (https://www.theverge.com/2015/4/9/8375735/apple-macbook-pro-..., search "Broadwell"). Another funny aspect of that review: it mentions 10+ hour battery life for the MBP as a nice, but hardly astonishing spec. 9 hours 45 minutes with Chrome was the worst case. It's amazing to think how bad the 2016-2019 MBPs were in comparison, to the point where getting back to 10-hour battery life is an amazing Apple Silicon feature!


I don't think my 2019 MBP has ever lasted more than 3 hours on battery.

My M1 Max is amazing by comparison.


They were bad for thin and light laptops with good battery life (mostly Atom shit and underpowered Core CPUs like in the 12-inch MacBook).


Also it's baffling how high the bar is for those who want to do good.

Most mainstream vendors will shit on your privacy and sell information about your entire life as a product, and nothing happens to them.

But the privacy-respecting company is always a bad decision away from being crapped all over and cancelled as an option.


Today's KDE can look almost identical to a macOS desktop, if you customize it properly.

I've been a KDE user since 2010, and a week ago I purchased an M1 Pro (hopefully to install Asahi in the future). My desktop was basically a top bar with a global menu, a few widgets and Latte Dock.

If I didn't know that the Mac came up with the functionality first, I would think that it is the Mac that feels like a skinned KDE.

