There are local LLMs; they're just less powerful. Sometimes, they do useful things.
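For the curious: running one locally really is only a few lines. A minimal sketch, assuming the llama-cpp-python bindings are installed and a quantized GGUF model file is already on disk (the path and filename here are placeholders, not a specific recommendation):

```python
# Minimal local-LLM sketch using llama-cpp-python
# (assumed installed via `pip install llama-cpp-python`).
from llama_cpp import Llama

# Load a quantized GGUF model from disk; inference runs entirely on local hardware.
llm = Llama(model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

# Ask something small; local models handle simple tasks reasonably well.
out = llm(
    "Summarize in one sentence: why might someone run an LLM locally?",
    max_tokens=64,
    stop=["\n\n"],
)
print(out["choices"][0]["text"].strip())
```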
The human brain uses around 20W of power. Current models are obviously using orders of magnitude more than that to get substantially worse results. I don’t think power usage and results are going to converge enough before the money people decide AI isn’t going to be profitable.
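To put rough numbers on "orders of magnitude" (every figure below is a ballpark assumption, not a measurement): a single datacenter-class GPU draws hundreds of watts, and serving a large model typically spans several of them.

```python
# Back-of-envelope comparison; all numbers are rough assumptions.
BRAIN_WATTS = 20        # commonly cited estimate for the human brain
GPU_WATTS = 700         # ballpark TDP of one datacenter-class GPU
GPUS_PER_MODEL = 8      # assumed GPUs needed to serve one large model

model_watts = GPU_WATTS * GPUS_PER_MODEL
print(f"Model draw: ~{model_watts} W, brain: ~{BRAIN_WATTS} W")
print(f"Ratio: ~{model_watts / BRAIN_WATTS:.0f}x")  # a couple of orders of magnitude
```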
The power consumption of the brain doesn't really indicate anything about what we can expend on LLMs… our brains are not just a biological implementation of what LLMs do.
Running an LLM in 30 years seems really optimistic.
How so? They can't retroactively make locally run LLMs shit, and I assume hardware isn't going to get any worse.