Arbitraging Down LLM Inference to the Cost of Electricity (inference.net)
6 points by ycombyourhair 13 hours ago
srbhr 13 hours ago
> Someone with cheap electricity can serve inference profitably at prices that would bankrupt the current small set of centralized providers.
Strong claim. There's a similar article on HN about how China is eating the world. Maybe, then, they're the ones who will take over the inference cloud?
skeptrune 13 hours ago
Why is there nowhere on the site where I can figure out how much I would get paid by adding my GPU to the network?

srbhr 13 hours ago
That's exactly what I was thinking.
nick779 13 hours ago
Frontier labs are still making crazy margins. OSS models are impossible to host serverlessly at a profit, even if you're Together.