
OpenAI releases GPT-4.5 with ridiculous prices for a mediocre model

29 comments
  • Ed Zitron:

    Sam Altman is talking about bringing online "tens of thousands" and then "hundreds of thousands" of GPUs. 10,000 GPUs cost them $113 million a year, and 100,000 about $1.13 billion, so this is Sam Altman committing to billions of dollars of compute for an expensive model that lacks any real new use cases. Suicide.

    Also, $1.30 per hour per GPU is the Microsoft discount rate for OpenAI. It's safe to assume there are other costs, but the raw compute for GPT-4.5 is massive, and committing such resources at this time is truly fatalistic and suggests Altman has no other cards to play. (The back-of-envelope math is sketched after this thread.)

    • Former OpenAI researcher Andrej Karpathy wrote on X that GPT-4.5 is better than GPT-4o but in ways that are subtle and difficult to express. "Everything is a little bit better and it's awesome," he wrote, "but also not exactly in ways that are trivial to point to."

      plebeian, you don't understand, you're sniffing our farts wrong

    • And GPT-4.5 is terrible for coding, relatively speaking, with an October 2023 knowledge cutoff that may leave it unaware of recent updates to development frameworks.

      This is in no way specific to GPT-4.5, but it remains a weirdly under-mentioned albatross around the neck of the entire LLM code-guessing field. That's probably because the less you know about what you told it to generate, the likelier you are to think it's doing a good job, and the enthusiastically satisfied customer reviews on social media that I've interacted with certainly seemed to skew toward the less-you-know types.

      Even when the current version was released before the cutoff, you're probably out of luck: the newer version is likely far underrepresented in the training data compared to the previous versions that people may have been using for years by that point.
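
For what it's worth, the figures in the first comment do check out under its own stated assumptions. Here is a minimal back-of-envelope sketch in Python, assuming the quoted $1.30 per GPU-hour Microsoft discount rate and round-the-clock utilization; both the rate and the utilization assumption come from the comment above and are not independently verified.

```python
# Back-of-envelope check of the GPU cost figures quoted above.
# Assumptions (taken from the comment, not independently verified):
# a flat $1.30/GPU-hour rate and 24/7 utilization for a full year.

HOURLY_RATE_USD = 1.30      # quoted per-GPU hourly rate
HOURS_PER_YEAR = 24 * 365   # 8,760 hours, assuming continuous use

def annual_gpu_cost(gpu_count: int) -> float:
    """Annual raw compute cost in USD for a fleet of gpu_count GPUs."""
    return gpu_count * HOURLY_RATE_USD * HOURS_PER_YEAR

for fleet in (10_000, 100_000):
    print(f"{fleet:>7,} GPUs -> ${annual_gpu_cost(fleet) / 1e6:,.0f}M per year")

# Prints roughly:
#  10,000 GPUs -> $114M per year    (the ~$113 million figure above)
# 100,000 GPUs -> $1,139M per year  (the ~$1.13 billion figure above)
```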
