
Can this be interpreted as anything other than a scheme to charge you for hidden token fees? It sounds like they're asking users to just hand over a blank check to OpenAI to let it use as many tokens as it sees fit?

"ChatGPT can now do asynchronous research on your behalf. Each night, it synthesizes information from your memory, chat history, and direct feedback to learn what’s most relevant to you, then delivers personalized, focused updates the next day."

In what world is this not a huge cry for help from OpenAI? It sounds like they haven't found a monetization strategy that actually covers their costs and now they're just basically asking for the keys to your bank account.



No, it isn’t. It makes no sense and I can’t believe you would think this is a strategy they’re pursuing. This is a Pro/Plus account feature, so the users don’t pay anything extra, and they’re planning to make this free for everyone. I very much doubt this feature would generate a lot of traffic anyway - it’s basically one more message to process per day.

OpenAI has clearly been focusing on model cost effectiveness recently, with the intention of making inference nearly free.

What do you think the weekly limit is on GPT-5-Thinking usage on the $20 plan? Write down a number before looking it up.


If you think that inference at OpenAI is nearly free, then I've got a bridge to sell you. Seriously though, this is not speculation: in a recent interview, Altman pretty explicitly stated that they underestimated how much inference costs would dwarf training costs. He also said that the one thing that could bring this house of cards down is users deciding they don't actually want to pay for these services - and so far, they certainly have not covered costs.

I admit that I didn't understand the Pro plan feature (I mostly use the API and assumed a similar model), but if you assume that this feature will remain free, or that its costs won't be incurred elsewhere, you're likely ignoring the massive data center buildouts happening across the US right now to support inference.


We don't charge per token in ChatGPT.



