# AI Sunday Rant
AI products will have to move closer to the datasets (not the other way around). This is another way of saying that models will have to keep shrinking and moving toward the "edge." Even if that does not happen, the implication is the same: the power is in the hands of the data owners.
The total cost of switching from one model (family) to another is racing toward zero, and even today that cost is extremely low. When building an AI application, abstracting the model layer matters as much as abstraction ever has for avoiding vendor lock-in.
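One way to keep that switching cost near zero is to code the application against a thin interface rather than any vendor's SDK. The sketch below is a minimal illustration of that idea; the class and method names (`ChatModel`, `complete`, the adapter classes) are hypothetical, not any real provider's API.

```python
from typing import Protocol


class ChatModel(Protocol):
    """The only surface the application depends on."""

    def complete(self, prompt: str) -> str: ...


class HostedModel:
    """Hypothetical adapter; a real one would wrap a vendor SDK call."""

    def complete(self, prompt: str) -> str:
        return f"[hosted] {prompt}"


class LocalModel:
    """Hypothetical adapter for a small model running on the edge."""

    def complete(self, prompt: str) -> str:
        return f"[local] {prompt}"


def answer(model: ChatModel, question: str) -> str:
    # Application logic sees only the ChatModel interface, so swapping
    # model families is a one-line change at the call site.
    return model.complete(question)
```

Because `answer` never imports a vendor library, moving from `HostedModel` to `LocalModel` (or any new family) touches one constructor call, not the application code.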
The value of foundation models is a race to the bottom, especially considering that some of the biggest players are betting on open-sourcing them indefinitely. The continued growth of model context windows will also bring diminishing returns for fine-tuned models: if you can build your prompts dynamically and programmatically at a very granular level (and at scale), there is little or no remaining need for fine-tuning.
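The "build prompts dynamically" alternative to fine-tuning can be sketched as a simple prompt assembler that packs task instructions, examples, and retrieved context into the window per request, instead of baking that knowledge into model weights. This is a minimal illustration under assumed inputs; the function name and the character-based budget are placeholders (a real system would budget in tokens and rank context by relevance).

```python
def build_prompt(
    task: str,
    examples: list[str],
    context: list[str],
    budget_chars: int = 4000,
) -> str:
    """Assemble a prompt per request from task, examples, and context.

    A larger context window raises budget_chars, leaving more room for
    granular, request-specific material -- the substitute for fine-tuning.
    """
    parts = [f"Task: {task}"]
    parts += [f"Example: {ex}" for ex in examples]
    parts += [f"Context: {doc}" for doc in context]
    prompt = "\n".join(parts)
    # Crude truncation to stay inside the window budget; real code would
    # drop the least relevant pieces rather than cut mid-document.
    return prompt[:budget_chars]
```

Each call produces a prompt tailored to that request, so knowledge lives in the data pipeline (again favoring the data owners) rather than in any one model's weights.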