fattire's comments

Why not use the Storage Access Framework, which is agnostic to where files are saved, whether local or remote? By selecting a file to open, or naming one to save, you choose a destination, and permission is implicitly granted to that location for that file. It could be Google Drive, the local file system, or any cloud provider app that supports SAF. No storage permissions needed, and it's been around for years and years.

https://www.youtube.com/watch?v=C28pvd2plBA


It's always suspect when a big company asks for more regulation-- OpenAI, FTX, etc. It's usually framed as being for the public good or fairness, but they may actually be looking for legal recognition of otherwise dubious activities, for ways to block out new competition, or to preempt inevitable regulation by writing the law themselves in the most favorable way possible.


Similar with the Texas Two-Step bankruptcy: when you're facing massive lawsuits, create a subsidiary, offload everything related to those lawsuits onto it, promise to fund it well enough to pay out expected judgments, then don't (or massively underfund it), have the subsidiary declare bankruptcy, and walk away with no further liability while still making massive profits.

One law firm specializes in this, and that has been the pattern in each of the five or six times it has been done.

Yet they claim, with a straight face (and even have writers like Matt Levine carrying water for them), that really, truly, honestly, they're not doing this to avoid liability-- they're doing it "to make the process easier for the plaintiffs and streamline their legal efforts".

Like how utterly stupid do they think we are? Why on earth would a multinational for-profit company spend considerable effort to actively assist people who are suing them?


> Why on earth would a multinational for-profit company spend considerable effort to actively assist people who are suing them?

Because the work the organization is doing overall becomes more valuable with this change. As an analogy, it's akin to saying you'll deal with a specific problem during this part of the day-- and the rest will be focused on other productive things. For a whole class of problems, your overall day goes better.


You can opt out of it, thankfully.


I've played with langchain for a couple of weeks now (with some of the llama-derivative local models and Oobabooga's native & OpenAI APIs + TextGen https://python.langchain.com/docs/modules/model_io/models/ll... ) and find it not-too-insanely-hard for an idiot like myself to figure out, though I'm just experimenting at this point with different models, esp. using tools, etc. I've found that some of the recommended prompts in the demos, while perhaps working well with ChatGPT/GPT-4, need a lot of tweaking to work with, say, WizardLM. But then I can get them working, so that's kinda neat.
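
The kind of thing I've been doing is roughly this (just a sketch, and langchain's API changes fast, so treat it as illustrative-- the endpoint URL, port, key, and prompt are placeholders for whatever your local text-generation-webui/TextGen setup exposes):

    from langchain.llms import OpenAI
    from langchain.prompts import PromptTemplate
    from langchain.chains import LLMChain

    # Point langchain's OpenAI wrapper at a local OpenAI-compatible server
    # (e.g. text-generation-webui's openai extension). URL and key are placeholders.
    llm = OpenAI(
        openai_api_base="http://localhost:5000/v1",
        openai_api_key="not-needed-locally",
        temperature=0.7,
    )

    # Local models like WizardLM often want more explicit, instruction-style
    # prompts than the ChatGPT-tuned ones in the demos.
    prompt = PromptTemplate(
        input_variables=["question"],
        template="You are a helpful assistant. Answer concisely.\n\n"
                 "Question: {question}\nAnswer:",
    )

    chain = LLMChain(llm=llm, prompt=prompt)
    print(chain.run("Why is the sky blue?"))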

I also played with huggingface's transformer agent (https://huggingface.co/docs/transformers/transformers_agents ) and thought it was a lot easier to use as far as the tools go, though it's perhaps less capable for other things. I may go back to playing with that, actually.
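
For comparison, the transformer agent boils down to something like this (again just a sketch based on the docs-- the StarCoder endpoint is the one the docs use, and you may need to be logged in with a Hugging Face token):

    from transformers import HfAgent

    # Agent backed by a hosted code model; it writes and runs the tool calls itself.
    agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

    # The agent picks the tools (image generation, captioning, TTS, ...) on its own.
    picture = agent.run("Draw me a picture of rivers and lakes.")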


"Tyranny"? Really?


If they can make up new meanings for words like "Piracy" and "Theft" I think we can make up "Tyranny".


Strange that you use a still-popular, decentralized, open-standard global service with a myriad of paid and free providers, for which there exist dozens of open-source clients and servers across the planet, all of which integrate with each other... as a model for why Mastodon won't work.

Why do you say "Gmail won"? Did Gmail switch to a proprietary, closed protocol? Did they lose compatibility with even the smallest, independent email servers? (That was rhetorical-- provided those smaller servers adopt appropriate open, non-proprietary anti-spam measures (DMARC, SPF, DKIM, etc.), they work fine.)

Meanwhile, no one uses AOL's old proprietary mail system, if it exists any more.


The year is 2029. Farmers everywhere are confounded by new weed strains which look more and more like the crops they intermingle with, hiding more perfectly among the plants every year.

Then someone has the idea not to vaporize a small percentage of the most conspicuous, ugly weeds, so they'll survive into the next year, crossbreeding with the stealth strains and keeping the weeds from getting too stealthy.

The End.


Might not be so bad even if that solution doesn't work. We got oats and rye out of it last time around: https://en.wikipedia.org/wiki/Vavilovian_mimicry


Thank you for sharing that. What a surprising thing to learn!


Alternative ending:

Instead of weeding based on the looks of the plants, a new company, UpRound, has developed a chemical solution that can target the weeds without relying on sight; instead, it chemically targets all plants other than the crops themselves.


Here is an account of the whole laptop decision on Kara Swisher's podcast from the perspective of Twitter's former safety chief Yoel Roth.

https://podcasts.apple.com/us/podcast/why-twitters-former-sa...


Really good podcast, and Kara has always been a treasure. Her takes on Musk are hilarious now, too!


As you mentioned, the nodes are used in Fusion and the Color page, where they belong, not in either timeline editor (Cut Page & Edit Page).


The integration with Fusion is pretty clumsy at the moment. It's not clear which timeline event you're working on when you're on the Fusion page, especially when you have several video tracks stacked at the playhead's location. It disregards the user-selected event (if one has been selected) and (IIRC) shows the topmost one instead.


Of course it's a non-linear editor roughly on par with Avid, Final Cut, or Premiere. With all due respect, this person has no idea what they're talking about.

The Cut page (as opposed to the Edit page) is what's apparently to be included on iPad, but it's 100% as legit an NLE as any other. I wouldn't call After Effects an NLE per se, but the node-based counterpart to AE in Resolve is called Fusion and doesn't appear to be (initially) included on iPad.


Moved this response as an edit to the main comment on https://news.ycombinator.com/item?id=33277308 instead, but keeping this comment rather than deleting it, because the replies to it have value to the discussion.


(You edited your comment to oblivion, but because it was so embarrassingly wrong, I don't blame you.)

My original response:

"close enough to a NLE"?? What the--

DaVinci Resolve has included a standalone non-linear editor since 2014. Today it has every major feature one would expect from an NLE-- with the noted exception of some codecs missing in the Linux version due to licensing issues, AAC being conspicuously one of them. So ffmpeg is often needed for transcoding media. I have a fair amount of familiarity with Avid, Final Cut Pro (7, not X), and to a lesser degree Adobe Premiere, and some experience with other NLEs from OpenShot to iMovie to Lightworks to Blender's NLE. Not only is DVR a feature-packed NLE-- it ALSO includes a Pro Tools-like audio editing component called Fairlight, a node-based 2D/3D After Effects-like component called Fusion (which integrates neatly with Blender), and a best-in-class color grading tool-- the thing Resolve is probably best known for-- with roots going back to the DaVinci color correction systems of the 1980s.
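
On the AAC point, the usual workaround on Linux is a quick rewrap before import-- something along these lines (filenames are just examples; you could run ffmpeg directly instead of via Python):

    import subprocess

    # Copy the video stream untouched and transcode the AAC audio to PCM,
    # which the free Linux build of Resolve can read.
    subprocess.run([
        "ffmpeg",
        "-i", "clip_with_aac_audio.mp4",
        "-c:v", "copy",
        "-c:a", "pcm_s16le",
        "clip_for_resolve.mov",
    ], check=True)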

DaVinci Resolve has not one NLE interface but TWO-- the traditional Avid-like Edit page, and a newer "Cut Page" (the one that appears in the iPad demo videos), which I think first showed up in DaVinci Resolve 17 (18 is current) and is meant as a faster UI for doing a rough assembly; it integrates heavily with the specialized "Speed Editor" hardware. For a while, there were deals where the Speed Editor came free with a ($300) Studio license. Now I think it's maybe $400 (?). The paid Studio version includes extra features like more plugins (many of which use neural networks to, say, infer depth or separate objects from backgrounds), headless Python scripting, 3D audio, 8K export, and pro stuff like that.
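
For anyone curious what the scripting side looks like, it's roughly this (a bare-bones sketch of the external scripting API, which is Studio-only and needs the RESOLVE_SCRIPT_API / RESOLVE_SCRIPT_LIB / PYTHONPATH environment variables described in Blackmagic's scripting README):

    import DaVinciResolveScript as dvr

    # Connect to a running Resolve instance and grab the current project.
    resolve = dvr.scriptapp("Resolve")
    project = resolve.GetProjectManager().GetCurrentProject()

    # List the timelines in the project.
    for i in range(1, project.GetTimelineCount() + 1):
        print(i, project.GetTimelineByIndex(i).GetName())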

I'm presuming the iPad version will also work with the Speed Editor (it can connect via bluetooth or USB).

And since you mentioned it, the Fusion-style node system is considered by many pros to be superior to the older layers-based system used by After Effects, which is why it has been adopted by newer software from Unreal to Blender to Nuke, etc. Also, you can drop effects "on" clips and layers-- this can be done in the Edit page as per tradition, and works as expected.

Since DaVinci Resolve is meant to run in CentOS, I've helped collaborate on a method for running it in a Linux container as well for anyone who might be interested:

https://github.com/fat-tire/resolve


The "Cut Page" has been a godsend for me. I'm going through about 1500 hours of footage right now and that editing mode is the only thing that has made my task even remotely possible for me to pull off.


You need a Speed Editor.

I don't know if you ever played FPSes in the early days of Quake, Doom, etc. but remember how you used to just play with a keyboard and it was okay, but then you switched to mouselook and it was just a whole 'nother world?

That's what the Speed Editor does.


Played FPSes in those early days, so familiar with that. Also use Resolve a lot and have the Speed Editor. But I never got used to the Cut page and just kept using the Edit page with keyboard shortcuts. The hardware feels really nice but just didn't surpass the keyboard for my use-- I even watched some Cut videos to see what I was missing. I think the way I assemble videos is just too different from the norm?


That's what I use -- the Speed Editor with the "Cut Page."


> Since DaVinci Resolve is meant to run in CentOS, I've helped collaborate on a method for running it in a Linux container as well for anyone who might be interested:

Soooooooooo interested, but sadly it seems NVidia-only.

If Blackmagic would put some of their port-to-iOS muscle on a make-it-work-with-AMD team, it'd be useful to me. Alas, I have a knack for picking losers.


Only for lack of hardware to test it on. There is an open issue if you want to try your hand at getting it to work on non-NVidia, though it will run best on some kind of dedicated GPU due to the heavy graphics operations it does.

See https://github.com/fat-tire/resolve/issues/8


> in any normal NLE, you put your effects on your clips, or your layers. You don't do that in Resolve, because it's grading software

You can add effects onto clips in the Resolve NLE. If you just want to use Resolve as an NLE (i.e. you don't want to colour-grade your work or do complex VFX in Fusion), you don't need to go near a node graph.

Perhaps there is some pedantry to be done about whether it is an NLE or has an NLE. No dispute that it used to be colour grading software and the NLE was bolted on later (like a decade ago).

