Hacker News | bvm's comments

Nah, it just came out like that (I printed it :D)

We experimented with ironing, but you can't see it in those views. The A1 is a pretty good printer OOTB. There were a few iterations on the case design though to optimise the edges.


>- Hosting OSS models was a pain [solved]

What's the solution here? vLLM?


I'm more intrigued by the image of the rescue in the article, where someone appears to have been riding a quadbike on a frozen lake (?!) and fallen in.

https://www.firetruckmall.com/AvailableTruck/10923/2018-Neot... this would seem to be the source. Is it a test? But why would you use a real quadbike in freezing water for a test? Looks terrifying.


Oddly relevant. A man and his daughter drove a quad over ice and went through it in the Netherlands last weekend, and the girl passed away earlier today. :(


800GB


Where are you getting that figure?


That is the size of 'The Pile', which contains books3 among many other things.


I also found this tedious, so I made a tiny VS Code extension to make it less tedious:

https://marketplace.visualstudio.com/items?itemName=TomJenni...


Sure. I worked at a company that produced tens of thousands of human-written summaries of news data a year. This was costly and slow, but our clients really valued them. Back in 2019 we fine-tuned an LLM to help. We put a lot of effort into creating a human-in-the-loop experience: highlighting parts of speech that were commonly hallucinated, and making sure humans could focus on the things humans are good at.

We also released some of the data as a free dataset with a commercial option for all of it. This was more successful than I thought it would be and was hoovered up by the kind of people that buy these datasets.

It will have been surpassed by recent developments by now, but it was an incredibly enjoyable project.


What kind of clients value news summaries that much?


Large corporates and financial services. Use cases were needle-in-a-haystack searching, internal comms, following research topics over time, external newsletters, that kinda stuff. It wasn't particularly high-margin, but it was a fun business.


I imagine it's a company similar to Bulletin Intelligence. Would you be open to discussing your experiences in this industry?


Yeah, sure! How'd you like to do so?


Awesome! To protect your privacy on HN, please email nparker2050@gmail.com and let me know whether you'd prefer getting on a call or keeping things in writing. Looking forward to hearing from you!


Did this come about because Shopify was the first page you visited in your May update video? Nice bit of serendipity!


I first spoke with Tobi about this in mid-May, so it's been in the works for a while longer. :)


READ COMMITTED SNAPSHOT


If hypergols are allowable, then some existing RCS thrusters would fit on a model rocket. Or Rocket Lab's Curie is pretty small, IIRC.

But... liquid propellants are either pretty dangerous (hydrazine et al.) or practically impossible for an amateur to work with (cryo requirements, high pressures, complex start sequences, etc.).

The plumbing, extra parts, materials, and tolerances required mean that a "classic" liquid design ends up pretty heavy at small scale, so you have to build a bigger engine to cope with the extra weight. You'd end up with an engine not really analogous to what you'd consider "classic" liquid designs (gas generators, tap-offs, staged combustion).

However... Frank Malina and Jack Parsons essentially did what you're proposing in the 1930s, without any newfangled 3D printing technology, so it's possible! Just don't end up like Jack Parsons.


Woah. I didn't know who Jack Parsons was and wanted to know how not to end up like him - his Wikipedia page is a wild ride! What a guy.

https://en.m.wikipedia.org/wiki/Jack_Parsons


Wow that paragraph went into a vertical climb and somehow kept going higher


Just checking out the docs now: how do Loaders fit into the vision alongside AI functions? I can't quite piece it together in my head. Would a function grab extra context from a loader prior to execution? Is this supported now?


Good question, and actually a great illustration of how unexpectedly the "flagship" feature of a library can change!

At its core, Marvin isn't just for AI functions; it's a high-level library that makes it easy to interact with LLMs programmatically. In fact, the first version was written to make it easier for us to upload public and private knowledge into our customer service Slackbot. This happens through the `Bot` class, which is designed to help users / programs explore more complex problems. In particular, bots can use plugins to access proprietary knowledge.

Our loader classes are designed to get the knowledge into the bots. Why build "yet another LLM loader library?" We've had enough real-world use cases to know that just taking a document, chunking it, and throwing it in a vector store gives pretty bad results over large enough datasets (especially if the documents are relatively homogeneous). You have to preprocess documents in a particular way, and we wanted to take our learnings and codify them for future use.
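As a generic illustration of that point (this is not Marvin's loader code; the `preprocess` helper and its parameters are invented here), one common preprocessing step is to normalize text and attach source metadata to each chunk, so that chunks from homogeneous documents stay distinguishable at retrieval time:

```python
import re


def preprocess(doc, title, size=200, overlap=40):
    """Hypothetical sketch: normalize a document, then chunk with overlap,
    prepending the title to each chunk so similar-looking chunks from
    different documents remain distinguishable in a vector store."""
    text = re.sub(r"\s+", " ", doc).strip()  # collapse whitespace
    step = size - overlap
    chunks = []
    for start in range(0, max(len(text), 1), step):
        piece = text[start:start + size]
        chunks.append({"text": f"{title}: {piece}", "source": title})
        if start + size >= len(text):
            break
    return chunks
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both sides; the real preprocessing Marvin's loaders do is, per the comment above, more involved than this.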

So AI functions are definitely the on-ramp to the library, but the real power is in utilizing it to extract insight from data. Since AI functions are "just" bots under the hood, they can use plugins (pass `plugins=[...]` to the `@ai_fn` decorator) and will benefit from this as well. This is supported right now, but we are rapidly improving loader integration more broadly.
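To make the "AI functions are just bots with plugins" idea concrete, here is a toy sketch of that pattern. It is not Marvin's real implementation: only the `ai_fn` name and the `plugins=[...]` argument come from the comment above; the `search_docs` plugin and everything inside the decorator are invented for illustration.

```python
from functools import wraps


def ai_fn(plugins=None):
    """Toy decorator: attach plugins and route calls through a stub bot."""
    plugins = list(plugins or [])

    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            # A real bot would hand fn's signature and docstring to an LLM,
            # which could invoke any plugin to pull in extra context.
            context = {p.__name__: p() for p in plugins}
            return fn(*args, _context=context, **kwargs)

        wrapper.plugins = plugins  # inspectable, like `plugins=[...]`
        return wrapper

    return decorator


def search_docs():
    # Stand-in for a loader-backed plugin returning preprocessed chunks.
    return "relevant document chunks"


@ai_fn(plugins=[search_docs])
def summarize(text, _context=None):
    # Stand-in for the LLM call: combine the input with plugin context.
    return f"summary of {text!r} given {_context['search_docs']!r}"
```

Calling `summarize("report")` here runs the plugin first and passes its output to the function body, mirroring the "grab extra context prior to execution" flow asked about above.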

