They're not open-sourcing it because it's just GPT. Both of the new models are GPT-4o (or GPT-4o-mini?) with presumably different fine-tuning. They're obviously not going to open-source their flagship GPT models.
I guess you are aware of this, but just in case: some of us rely on dictation in our daily computer usage (think people with disabilities or pain problems). A MacBook Pro with M4 Max and 64GB of RAM could easily run something much larger than Whisper Large (around 3GB).
I would love a larger, better Whisper for use in the MacWhisper dictation app.
With devices having unified memory, we are no longer limited to what can fit inside a 3090. Consumer hardware can have hundreds of gigabytes of memory now; is a larger model really unable to fit in that?
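For a rough sense of whether a model fits, a back-of-the-envelope sketch: memory footprint is roughly parameter count times bytes per parameter (2 bytes for fp16, 1 for int8), plus some overhead for activations. The ~1.55B-parameter figure for Whisper large is from OpenAI's model card; the helper name here is just illustrative.

```python
def model_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Rough weight-memory estimate in GB (fp16 = 2 bytes/param by default)."""
    return params_billions * bytes_per_param

# Whisper large (~1.55B params) in fp16: about 3.1 GB of weights,
# consistent with the ~3GB figure mentioned above.
print(round(model_memory_gb(1.55), 1))

# A hypothetical 70B-parameter model in fp16 would need ~140 GB --
# too big for a 24GB 3090, but plausible on a large unified-memory machine.
print(round(model_memory_gb(70.0), 1))
```

This ignores KV caches and activation memory, so treat it as a lower bound on what the hardware needs.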
What’s the minimum hardware for running them?
Would they run on a raspberry pi?
Or a smartphone?