Show HN: Batch-AI – A TypeScript SDK for Batch AI Calls Across Providers (github.com/grantsingleton)
2 points by grantsingleton on Feb 18, 2025 | 2 comments
Hey HN,

I built batch-ai, a TypeScript SDK that simplifies batch processing across AI model providers (OpenAI, Anthropic, and soon Google Gemini & xAI Grok).

Why?

If you've ever tried running AI workloads in batch mode, you know how frustrating it is:

OpenAI requires file uploads for batch processing.

Anthropic doesn't, but has different request/response formats.

Cost savings are significant (~50% vs. real-time API calls), but every provider handles batching differently.
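To make the mismatch concrete, here is a rough sketch of the two submission shapes, based on the public OpenAI Batch and Anthropic Message Batches APIs. Field names are illustrative of the general shape, not copied from batch-ai, so treat the exact details as approximate:

```typescript
// Illustrative request shapes only -- not code from batch-ai itself.

// OpenAI: each request becomes one line of a JSONL file that you upload,
// then reference by file ID when creating the batch.
const openaiJsonlLine = {
  custom_id: "req-1",
  method: "POST",
  url: "/v1/chat/completions",
  body: {
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Classify: 'great product!'" }],
  },
};

// Anthropic: requests are passed inline as an array -- no file upload step.
const anthropicBatchRequest = {
  requests: [
    {
      custom_id: "req-1",
      params: {
        model: "claude-3-5-haiku-latest",
        max_tokens: 64,
        messages: [{ role: "user", content: "Classify: 'great product!'" }],
      },
    },
  ],
};

// Same logical request, two different envelopes -- this is the gap
// a unified SDK has to paper over.
```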

I wanted a simple way to process large volumes of AI requests across different providers without writing provider-specific code for each one. batch-ai provides a single interface to handle batch requests efficiently.

Features:

Unified batch processing API for OpenAI & Anthropic (more coming).

Define output schemas with Zod for structured responses.

Reduce costs by using batch APIs instead of real-time calls.

Easily switch providers without changing request logic.
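To give a feel for what a unified interface like this enables, here is a minimal, self-contained sketch. The names (`BatchProvider`, `createBatch`, `getResults`, the stub factory) are my own illustrative inventions, not batch-ai's actual API, and in-memory stubs stand in for the real provider adapters and Zod schemas so the example runs anywhere:

```typescript
// Hypothetical sketch of a provider-agnostic batch interface.
// None of these names come from batch-ai; they only illustrate the idea.

type BatchRequest = { customId: string; prompt: string };
type BatchResult = { customId: string; output: string };

interface BatchProvider {
  name: string;
  createBatch(requests: BatchRequest[]): Promise<string>; // returns a batch id
  getResults(batchId: string): Promise<BatchResult[]>;
}

// In-memory stand-in for a real provider adapter, so the sketch is runnable.
// A real adapter would handle file uploads (OpenAI) or inline requests
// (Anthropic) behind this same interface.
function makeStubProvider(name: string): BatchProvider {
  const batches = new Map<string, BatchRequest[]>();
  return {
    name,
    async createBatch(requests) {
      const id = `${name}-batch-${batches.size + 1}`;
      batches.set(id, requests);
      return id;
    },
    async getResults(batchId) {
      return (batches.get(batchId) ?? []).map((r) => ({
        customId: r.customId,
        output: `${name} response to: ${r.prompt}`,
      }));
    },
  };
}

// The calling code never changes when you swap providers.
async function classifyAll(provider: BatchProvider, prompts: string[]) {
  const requests = prompts.map((p, i) => ({ customId: `req-${i}`, prompt: p }));
  const id = await provider.createBatch(requests);
  return provider.getResults(id);
}
```

Swapping `makeStubProvider("openai")` for `makeStubProvider("anthropic")` leaves `classifyAll` untouched, which is the core point of a single batch interface.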

Who is this for?

AI moderation tools (like Filtyr, my AI content moderation SaaS). https://filtyr.com

Large-scale AI processing (e.g., sentiment analysis, classification).

Researchers & enterprises handling structured AI output at scale.

What’s next?

Support for Google Gemini & xAI Grok.

More batch API options (e.g., generateTextBatch).

Smarter retries & error handling.

Repo: GitHub – grantsingleton/batch-ai

Would love feedback from anyone working with batch AI APIs. What’s your experience? What pain points have you run into?



Nice! This was one of the missing parts of the Vercel AI SDK.


Yeah, this is what I wanted the AI SDK to do, but it didn't have it, and surprisingly no one had built anything for it yet.



