
Probably not. A 40GB Nvidia A100 is arguably reasonable for a workstation at $6,000. Depending on your definition, an 80GB A100 for $16,000 is still reasonable. I don't see this being cheaper than an 80GB A100. Probably a good bit more expensive, seeing as it has more RAM, compares itself favorably to the H100, and has enough compelling features that it probably doesn't have to (strongly) compete on price.


Surely Nvidia's pricing reflects what the market will bear rather than the intrinsic cost to build. Intel, being the underdog, should be willing to offer a discount just to get their foot in the door.


Nvidia is charging $35K, so a discount relative to that is still very expensive.


Pricing is normally what the market will bear. If that's below your cost as a supplier, you exit the market.


But if your competitor's price is dramatically above your cost, you can provide a huge discount as an incentive for customers to pay the transition cost to your system while still turning a tidy profit.


Isn't it much better to get a Mac Studio with an M2 Ultra, 192GB of RAM, and 31 teraflops for $6,599 and run llama.cpp?


Macs don't support CUDA, which means all that wonderful hardware will be useless for AI work for at least a few years. There's Metal, but it has its own set of problems, the biggest being that it isn't a drop-in CUDA replacement.


You can do LLM inference without CUDA just fine. Download Ollama and see for yourself.
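
A minimal sketch of what that looks like, assuming Ollama is installed and running locally on its default port (11434) and you've already pulled a model ("llama3" here is just an example name):

    # Query a locally running Ollama server; no CUDA anywhere in the stack.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",     # example model; use whatever you pulled
        "prompt": "Why is the sky blue?",
        "stream": False,       # return one complete response, not a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

On a Mac the model runs on Apple silicon via Metal under the hood, which is the point.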


I'm assuming this won't support CUDA either?


For LLM inference, yes, absolutely.


I think you're right on the price, but just to give some false hope: I think newish HBM (and this is HBM2e, which is a little older) is around $15/GB, so for 128GB that's $1,920. There are some other COGS, but in theory they could sell this for like $3-4k and make some gross profit while picking up some hobbyist mindshare and research code written for it. I doubt they will, though; it might eat too much into profits from the non-PCIe variants.
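
The arithmetic, as a quick sketch (the $/GB figures are this thread's guesses, not actual BOM data):

    # Back-of-the-envelope HBM cost; prices are guesses from this thread.
    def hbm_cost(capacity_gb: int, price_per_gb: float) -> float:
        """Rough memory cost for a given capacity at an assumed $/GB."""
        return capacity_gb * price_per_gb

    for price_per_gb in (15.0, 20.0):  # $15/GB above, $20/GB per a reply below
        print(f"128 GB at ${price_per_gb:.0f}/GB -> ${hbm_cost(128, price_per_gb):,.0f}")
    # 128 GB at $15/GB -> $1,920
    # 128 GB at $20/GB -> $2,560

Even at the higher figure there's room under a $3-4k price, before the other COGS.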


> in theory they could sell this for like $3-4k

You’re joking, right? They will price it to match current H100 pricing. Multiply your estimate by 10x.


They could. I know they won't, but they wouldn't lose money on the parts.


> is around $15/GB

This figure is old, and I don't think $15/GB cuts it anymore. My guess would be $20/GB, if not more.


Interestingly, they are using HBM2e memory, which is a few years old at this point. The price might end up being surprisingly good because of this.



