Hacker News | irae's comments

I hope they end up removing HDR from videos with HDR text. Recording video in sunlight etc. is OK; it can be sort of "normalized brightness" or something. But HDR text on top is always terrible.


I believe you are speculating on digital mastering and not codec conversion.

From the creator's PoV, their intention and quality are defined in post-production and mastering: color grading and other stuff I am not an expert on. But I know a bit more about music mastering, and you might be thinking of a workflow similar to Apple's, which lets creators master for its codec with the "Mastered for iTunes" flow, where creators opt in to an extra step to increase the quality of the encoding and can hear in their studio the final quality after Apple encodes and DRMs the content on their servers.

In video I would assume that is much more complicated, since the video is encoded at many quality levels to allow for slower connections and buffering without interruptions. So I assume the best strategy is the one you mentioned yourself, where AV1 detects, per scene or keyframe interval, the grain level/type/characteristics and encodes so as to be accurate to the source material for that scene.

In other words: The artist/director preference for grain is already per scene and expressed in the high bitrate/low-compression format they provide to Netflix and competitors. I find it unlikely that any encoder flags would specifically benefit the encoding workflow in the way you suggested it might.


"I believe you are speculating on digital mastering and not codec conversion."

That's good, since that's what I said.

"The artist/director preference for grain is already per scene and expressed in the high bitrate/low-compression format they provide to Netflix and competitors. I find it unlikely that any encoder flags would specifically benefit the encoding workflow in the way you suggested it might."

I'm not sure you absorbed the process described in the article. Netflix is analyzing the "preference for grain" as expressed by the grain detected in the footage, and then they're preparing a "grain track," as a stream of metadata that controls a grain "generator" upon delivery to the viewer. So I don't know why you think this pipeline wouldn't benefit from having the creator provide perfectly accurate grain metadata to the delivery network along with already-clean footage up front; this would eliminate the steps of analyzing the footage and (potentially lossily) removing fake grain... only to re-add an approximation of it later.

All I'm proposing is a mastering tool that lets the DIRECTOR (not an automated process) do the "grain analysis" deliberately and provide the result to the distributor.
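
For concreteness, here is a rough sketch of the pipeline as I understand it, in Python. The function and field names are made up for illustration; they are not the real AV1 film grain syntax or any actual Netflix tooling. The only thing my proposal changes is whether the analysis step is an automated estimate or a value the director sets deliberately in a mastering tool.

    # Sketch of the grain-synthesis delivery pipeline under discussion.
    # All names here are illustrative, not the actual AV1 film grain
    # parameters or any real encoder/Netflix API.

    from dataclasses import dataclass

    @dataclass
    class GrainParams:
        """Per-scene grain metadata carried alongside the clean video."""
        strength: float      # overall grain intensity
        size: float          # grain particle size / spatial correlation
        chroma_ratio: float  # grain on the color planes relative to luma

    def analyze_grain(scene):
        """What the distributor does today: estimate grain from the footage.
        A mastering tool would let the director set these values instead."""
        return GrainParams(strength=0.4, size=1.5, chroma_ratio=0.25)

    def encode(scene, params):
        """Denoise, encode the clean frames, and attach the grain metadata."""
        clean_frames = list(scene)            # stand-in for denoise + compress
        return {"frames": clean_frames, "grain": params}

    def decode_and_render(bitstream):
        """On the viewer's device: decode the clean frames, then re-synthesize
        grain from the metadata instead of spending bitrate on it."""
        params = bitstream["grain"]
        return [(frame, params) for frame in bitstream["frames"]]

    if __name__ == "__main__":
        scene = ["frame0", "frame1", "frame2"]
        delivered = encode(scene, analyze_grain(scene))
        print(decode_and_render(delivered))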


RIP Electron apps and PWAs. Need to go native, as Chromium-based stuff is so memory-hungry. PWAs on Safari use way less memory, but PWA support in Safari is not great.


I, for one, would not miss a single one of the electron apps I'm forced to use.

Every single one of them makes me feel like the vendor is telling me "we can't be bothered employing half-decent developers or giving the developers we have enough time and resources to write decent software, so we're just going to use cheap and inexperienced web developers and burn another gigabyte or two of your memory to run what could easily be a sub-100MB native app."

At least now I'll have significant numbers to tell my boss: "Sure, we can continue to use Slack/VSCode/Teams/Figma/Postman - but each of those is going to require an additional GB or two of memory on every staff member's computer, which at today's pricing is over $500 in RAM per laptop, and the laptops are all on an 18-24 month replacement cycle. So that's maybe a million dollars a year in hardware budget to run those 5 applications across the whole team. We'll need to ensure we have signoff on that expenditure before we renew our subscriptions for those apps."
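
Back-of-the-envelope, with assumed numbers (headcount, per-app overhead, and RAM pricing below are illustrative guesses, not measurements), the math works out roughly like this:

    # Rough cost of Electron memory overhead across a team.
    # Every input here is an assumption for illustration, not a measured figure.

    apps = 5                    # Slack, VSCode, Teams, Figma, Postman
    overhead_gb_per_app = 1.5   # assumed extra RAM per Electron app vs. native
    staff = 4000                # hypothetical headcount
    ram_cost_per_gb = 70.0      # USD, assumed vendor upgrade pricing
    replacement_years = 1.75    # midpoint of an 18-24 month laptop cycle

    extra_gb = apps * overhead_gb_per_app            # 7.5 GB per laptop
    cost_per_laptop = extra_gb * ram_cost_per_gb     # ~$525 per laptop
    annual_cost = staff * cost_per_laptop / replacement_years

    print(f"{extra_gb:.1f} GB extra per laptop")
    print(f"${cost_per_laptop:,.0f} per laptop, ~${annual_cost:,.0f}/year")  # ~$1.2M/year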


Your laptops have an 18-24 month replacement cycle? What are you guys doing to the poor things?


Apps can't be 100MB on modern displays, because there are literally too many pixels involved.

Not that I know what's going on in an Electron app heap (because there are no inspection tools, AFAIK), but I'm guessing much of it is compiled code and the rest is images and text-layout-related data.


> Apps can't be 100MB on modern displays, because there are literally too many pixels involved.

What? Are you talking about assets? You'd need a considerable amount of very high-res, uncompressed or low-compressed assets to use up 100MB. Not to mention all the software that uses vector icons, which take up a near-zero amount of space in comparison to raster images.

Electron apps always take up a massive amount of space because every separate install is a fully self-contained version of Chromium. No matter how lightweight your app is, Electron will always force a pretty large space overhead.


No, I'm talking about window buffers. This is about memory not disk space.


I was talking about RAM - in that running Chromium on its own already has a preset RAM penalty due to how complicated it must be.

But window buffers are usually in VRAM, not regular RAM, right? And I assume their size would be relatively fixed for a given system and depend on your resolution (though I don't know precisely how they work). I would think that the total memory taken up by window buffers would be relatively constant no matter what you have open - everything else is overhead incurred by whatever program asked for it, which is what we're concerned about.


Well, you see, there's a popular brand of computers that don't have separate VRAM and have twice the display resolution of everyone else.

Luckily, windows aren't always fullscreen and so the memory usage is somewhat up to the user. Unluckily, you often need redundant buffers for parts of the UI tree, even if they're offscreen, eg because of blending or because we want scrolling to work without hitches.
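
To put rough numbers on it (the display size, pixel format, and buffer count below are assumptions for illustration; the exact scheme depends on the compositor):

    # Rough window-buffer math: bytes per pixel, times pixels, times however
    # many redundant buffers the compositor keeps around.

    bytes_per_pixel = 4          # 8-bit RGBA; wide-gamut/HDR formats are larger
    width, height = 5120, 2880   # a fullscreen window on a 5K Retina-class display

    buffer_mb = width * height * bytes_per_pixel / 1024 / 1024
    print(f"one fullscreen buffer: {buffer_mb:.0f} MB")        # ~56 MB

    # Double buffering plus a few offscreen layers (scroll views, blended
    # surfaces) multiplies that quickly:
    buffers = 4
    print(f"{buffers} buffers: {buffer_mb * buffers:.0f} MB")  # ~225 MB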


Are you 100% sure every single window needs to have 8k resolution?


The size of the window is up to the user.


What you just said can be applied to any side of any infrastructure policy discussion.

If we leave nuclear waste behind (which has been shown over and over not to be a problem), the next generations will come up with solutions we don't know yet. If we leave the climate worse by not doing what we should, they will also figure it out.

The discussion right now is about getting each country to a reduced carbon footprint fast (nuclear, in 10 or 20 years) or keeping the bet that renewables will take less than 50 years to make an impact in reducing carbon emissions.


Year over year we see the same discussion: one side says we can't get to environmental goals without a lot more nuclear. The other side says renewables will be enough if we just [insert trending unproven excuse].

Anyone watching it closely can tell the pro-nuclear argument is being proven right over and over, while renewables are always "a few years away". Why continue insisting on the same error over and over again, with the proof in front of their eyes?

The major mistake was made 10 years ago by not building better, more modern nuclear plants to replace Diablo and increase capacity. If this continues we'll likely see the US entering an energy crisis or going back to burning fossil fuels in a few years.


The Bush administration gave out major incentives to get more power plants built, yet only a handful of projects were started. These projects ended up (like most nuclear projects) massively delayed and over budget. In the end, Westinghouse filed for bankruptcy: https://www.reuters.com/article/us-toshiba-accounting-westin...

Pro-nuclear people keep acting like our lack of new nuclear power plants is due to fear or lack of vision. The reality is that corporations do not want to build them, because they are incredibly expensive and may take over a decade before any revenue is produced. Renewables are cheaper and provide a fast return on investment.


Why is my power bill in CA so high?


The best counter-argument is that people say they are going to build a nuclear plant, but a year passes and they add another year to the schedule. That problem, not what people imagine about safety, nor the fact that the headline costs aren't terribly attractive, is the big problem.


I have used Safari for years, and it is so easy to open Chrome just for Google Meet. It is way less annoying than one would imagine.


My guess as to why he banned it now is that before, he was more moderate toward woke or the left in general. Now that he is more openly and harshly targeting woke and Fauci, it becomes a bit more hazardous to have it up.

But this looks bad. He should have stated publicly why before banning it.


As time passes, I am less supportive of how companies use ideology to weaponize regulations in order to slow down competition. How fast the COVID vaccine was created uncovered a bit of how fast things can go when people are willing to compromise for the greater good (also talent and hard work).

It feels like the machine was created to slow down progress and now it cannot be stopped. As other comments said already, factory farming is worse than what Neuralink is doing, and vegan activists have been trying to save animals for ages. On the other hand, Musk is also known for pushing employees hard, and animals are just an extension of pushing hard for a breakthrough.

If not done in the US, maybe it will go elsewhere, and to what purpose? This feels to me like a very worthy cause. A huge number of other animals are used for other types of health research. I guess just because people dislike rodents they give it a pass. But what Neuralink is doing involves far fewer animals than rat use for drug research, etc., and far fewer than animal farming. It is just the engine of regulation, competition, lobbying and patents working as it is supposed to.


The COVID vaccine is perhaps a poor example, as mRNA vaccines had been in development for quite some time before COVID. It was a technology that was there at precisely the right time.


This is not the first time I have wanted to understand a bit better the performance difference today between the approaches of Rust (no garbage collector), ARC on Swift (reference counting), and garbage-collected languages such as JavaScript.

I know JavaScript has an unfair advantage here, since the competition between V8 and the other JavaScript engines has been huge over the years, and garbage collection in JS is not often a problem. At least I see more people struggling with the JVM GC and its spikes in resource usage.

I've also heard that the Erlang VM (whether the code is written in Elixir or Erlang itself) implements GC at a different level: not globally, but per process, in a more granular way.

Is there a good resource that compares the current state of performance between those languages or approaches?


If there is no provision on L1 for early termination, then the rules of this particular type of visa are entirely broken and need fixing. I understand the reason L1 exists - I actually lived in the US under an L1 myself - but there is no excuse for a country to accept a worker it deemed necessary and acceptable to migrate, and then impose such a harsh rule for edge cases. If the person needs to be let go somehow through their own fault, that is fine. But layoffs are an example where the company is at fault and the employee is the victim; in that case the minimum the government should do is offer a fair amount of time for the person to find another job.

