LLM access is relatively cheap now because the LLM vendors are discounting their price at a massive loss, subsidized by VC, in order to get you addicted and to drive as much skilled human labor as possible out of the workforce permanently.
The goal is monopolization, and if they’re successful, you’ll see monopolistic pricing in the future.
@lapcatsoftware this seems likely enough.
-
@lapcatsoftware I feel compelled to mention that there are models you can self-host. There are even models whose weights and architecture are available under a permissive license, so you can tweak, fine-tune, retrain, or distill them, going well beyond mere prompting.
I don't recommend or defend that approach; I think there are still problems with it, ethical and otherwise.
But it could be a way to prevent vendor lock-in with your LLM usage.