Wow, I had not heard of that project. I did hear about a browser that doesn't compile costing a lot as well, though.
amy@ice.ch3.st
-
GenAI reaches another unexpected corner of #FreeSoftware: the #Hurd https://lists.gnu.org/archive/html/bug-hurd/2026-02/msg00133.html -
@civodul@toot.aquilenet.fr from Baccula's paper:
When Brent started this project on February 16, he purchased a Claude Max subscription for $100/month. This subscription provides a fixed allocation of usage—not per-token billing—for both interactive sessions and the claude --print API calls that the task runner uses. The actual cost of this project is $100/month, not the per-token amounts shown in the task runner’s cost tracking.
The per-token costs reported by the API represent what the usage would cost at retail API rates: approximately $297 across 169 task runs with billing data (plus ~$111 estimated for 31 runs without billing), and ~$338 for 11 interactive sessions—roughly $746 total at retail rates. They are useful for understanding relative expense between tasks, but they are not what was actually paid. At retail API rates, the project would have cost over seven times the subscription price—the subscription is a much better deal for heavy usage. -
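As a quick sanity check, the per-category retail figures quoted from the paper do add up to the roughly $746 total, and the "over seven times" comparison follows (a sketch; all dollar amounts are the paper's, including its ~$111 estimate for the unbilled runs):

```python
# Retail-rate figures quoted from the paper (the $111 is the paper's own
# estimate for the 31 runs without billing data).
task_runs_billed = 297       # 169 task runs with billing data
task_runs_estimated = 111    # ~31 runs without billing data (estimated)
interactive_sessions = 338   # 11 interactive sessions

retail_total = task_runs_billed + task_runs_estimated + interactive_sessions
subscription_price = 100     # Claude Max, per month

print(retail_total)                       # 746
print(retail_total / subscription_price)  # 7.46, i.e. "over seven times"
```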
@civodul@toot.aquilenet.fr $746 in less than a week!!!!! I get that they only paid $100 because of a temporary subscription deal, but holy shit... That's a lot of compute. How many guix substitutes do you think could be built with $746 of compute?
-
@cda@social.sporiff.dev you mentioned that the data might be time-dependent, if I understand what you were saying, and in my opinion, databases ideally shouldn't know where or when they are. Like, if a row should be marked as stale if it was updated more than a week ago, most relational dbs offer a way to schedule a task every night that could set the 'stale' column to true if necessary. But that's not how I would approach it. I would pass the current time in to a query against the last_updated column, so that the stale state of the row is a computed property rather than a time-dependent column. I think a space/time/env-independent schema is always possible to design. Like, I would like to be able to physically pick up the database and move it to a different timezone, send it 10 years in the past or future with my portal, whatever, and the stateless backend should be able to use it still.
Beyond that, I don't have much high level advice lol. If I'm being honest, I think I'm sensing a little impostor syndrome. I think you have more experience than you think you do, and you're asking the right questions, just try not to get too stuck in the research phase. -
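The "pass the current time into the query" idea above can be sketched with sqlite3; the table and column names here are illustrative, not from the thread:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Sketch: 'stale' is computed from last_updated at query time, with the
# current time passed in as a parameter, instead of being a column that a
# nightly job has to keep in sync. The database never consults its own clock.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, last_updated TEXT)")

now = datetime(2026, 3, 1, tzinfo=timezone.utc)
fresh = (now - timedelta(days=2)).isoformat()  # updated 2 days ago
old = (now - timedelta(days=9)).isoformat()    # updated 9 days ago
conn.executemany("INSERT INTO items (last_updated) VALUES (?)",
                 [(fresh,), (old,)])

# The backend supplies the cutoff; ISO-8601 strings in the same UTC offset
# compare correctly as text.
cutoff = (now - timedelta(weeks=1)).isoformat()
rows = conn.execute(
    "SELECT id, last_updated < ? AS stale FROM items", (cutoff,)
).fetchall()
print(rows)  # [(1, 0), (2, 1)] — only the 9-day-old row is stale
```

Because staleness is derived per query, moving the database to another timezone (or replaying it years later) changes nothing: the caller's notion of "now" is the only clock involved.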
@cda@social.sporiff.dev I'm curious what prompted that? I think the risk of an improper db schema is not like, potential data loss or the db becoming harder to maintain and expand, but rather that the backend codebase will continually accrue technical debt in order to work around poorly defined entities and relationships. If the schema causes friction with the backend API, it should be addressed at the database level as quickly as possible. But also, like, perfectly aligned db schemas don't exist in the wild, like they do in textbooks. We are all imperfect beings with deadlines. Are you designing something from scratch?