I do the same, but a co-founder usually has a different view or opinion. LLMs (in my experience) are way too agreeable and tend to just go along with you rather than push back much.
But that feels weird combined with this. You can buy OpenAI API access that is served off of AWS infrastructure, but you can't bill for it through AWS? (I mean, lots of companies work like that. But Microsoft is betting that a lot of people move regular workloads to Azure so they can have centralized billing for inference and their other stuff?)
I would imagine they couldn't offer models through Bedrock. I think this deal covers training and traditional compute workloads for their products (such as the workspaces for Codex cloud).
Is it purely chat-based (ad hoc), or can users make dashboard-like “cards” for common queries they have? (Imagine the Stripe dashboard: MRR, today’s revenue, failed charges, etc.)