Hey HN,
I built batch-ai, a TypeScript SDK that simplifies batch processing across AI model providers (OpenAI, Anthropic, and soon Google Gemini & xAI Grok).
Why?
If you've ever tried running AI workloads in batch mode, you know how frustrating it is:
OpenAI requires file uploads for batch processing.
Anthropic doesn't, but has different request/response formats.
Cost savings are significant (~50% vs. real-time API calls), but every provider handles batching differently.
I wanted a simple way to process large volumes of AI requests across different providers without writing provider-specific code for each one. batch-ai provides a single interface for submitting and retrieving batch requests.
Features:
Unified batch processing API for OpenAI & Anthropic (more coming).
Define output schemas with Zod for structured responses.
Reduce costs by using batch APIs instead of real-time calls.
Easily switch providers without changing request logic.
Who is this for?
AI moderation tools (like Filtyr, my AI content-moderation SaaS: https://filtyr.com).
Large-scale AI processing (e.g., sentiment analysis, classification).
Researchers & enterprises handling structured AI output at scale.
What’s next?
Support for Google Gemini & xAI Grok.
More batch API options (e.g., generateTextBatch).
Smarter retries & error handling.
Repo: GitHub – grantsingleton/batch-ai
Would love feedback from anyone working with batch AI APIs. What’s your experience? What pain points have you run into?