Hi! I built this little stream-processing library so I could write things like:

  let count = 0
  const stats = await pipe(
    Nodes.scan({ fields: true }),
    map(generateStats),
    tap(() => count++),
    reduce(mergeGraphStats, {})
  )

and have generateStats and mergeGraphStats be async functions, without having to worry about error handling or about pushing more than one object at a time into a read stream. We use it to process billions of events and objects a day. It makes Node.js streams fun to use.
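
For a sense of what those steps look like, here's a minimal sketch. The bodies and the fetchNeighbors helper are made up for illustration, but the shapes match what map and reduce expect:

  // Hypothetical transform step: any promise-returning function works
  async function generateStats (node) {
    const neighbors = await fetchNeighbors(node) // made-up helper
    return { id: node.id, degree: neighbors.length }
  }

  // Hypothetical reducer step: folds each stats object into the accumulator
  async function mergeGraphStats (acc, stats) {
    acc[stats.id] = stats
    return acc
  }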

Hope you find it as useful as we do.




Pretty cool. Could the following be used as a baseline to create the same functionality?

    async function * nextStreamEntry() {
      while (true) {
        // Receive the next entry pushed in via next()
        const entry = yield;
        await processEntry(entry);
      }
    }

    // Initialize the iterator and run it up to the first yield
    const iter = nextStreamEntry();
    await iter.next();

    // For every entry of the stream
    for (const monster of results) {
      // Await so each entry finishes processing before the next is sent
      await iter.next(monster);
    }


yes!

I would love to, but not all Node versions and not all streams support async iteration yet! Also, there's a lot of utility in some of the stream types in bluestream, concurrency for one thing. I've been working on an async iterator version of this but haven't started using it in production yet.

https://github.com/reconbot/streaming-iterables
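
As a rough sketch of that iterator-based approach, assuming streaming-iterables' pipeline, parallelMap, and consume helpers (sourceStream and processEntry here are stand-ins):

  const { pipeline, parallelMap, consume } = require('streaming-iterables')

  // sourceStream can be any async iterable (modern Node streams qualify)
  await pipeline(
    () => sourceStream,
    parallelMap(4, processEntry), // up to 4 entries in flight at once
    consume                       // drain the pipeline to completion
  )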



