I’ve noticed that LLMs are fast and cheap for simple Q&A: they generate answers directly, without needing to crawl or index anything.
But I wonder whether this low cost will last. As the LLM race continues, everyone will need to keep retraining and fine-tuning their models, and that will push costs up.