I tried to run an LLM locally.
-
I tried to run an LLM locally. The thing is, my computer is a potato. It gets super hot and is super slooow. I still think it's the least harmful way to use AI, because everything stays local and the data isn't used for training. But it's like running Prime95 all the time. Maybe good in winter when you need heating. I found it quite eye-opening how much energy it takes to generate an answer.
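If you want to try the same experiment, this is roughly how it looks with llama-cpp-python, one common option (the model file and settings below are placeholders for illustration, not what I actually ran):

    from llama_cpp import Llama

    # Load a quantized GGUF model from disk; file name and context size
    # are placeholder assumptions.
    llm = Llama(model_path="models/llama-3.2-1b-instruct-q4_k_m.gguf", n_ctx=2048)

    # Every token is generated on your own CPU/GPU, which is exactly why
    # the fans spin up: the whole model is run through for each token.
    out = llm("Explain why local inference is private.", max_tokens=64)
    print(out["choices"][0]["text"])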
-
@lea same here, it just isn't feasible ... if it's about privacy, I found https://duck.ai and https://huggingface.co/chat/ helpful; afaik they also let you play with LLMs without your data being used for training
-
Some people replied with "then your hardware might not be efficient". Yes, that is exactly the point I was trying to make: if you want to add AI to your daily workflow but use it purely locally, so you can work offline and be sure your data isn't used for training, the hardware requirements rise from an office laptop (anything that runs the text editor of your choice) to an up-to-date high-end gaming PC.
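To put rough numbers on that (a back-of-the-envelope rule of thumb, not anything measured in this thread): the model weights alone need about parameter count times bytes per parameter of RAM or VRAM, before you even count the context cache.

    def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
        # Weights only: parameters * bits per parameter, converted to GB.
        # Ignores KV cache and runtime overhead, so real usage is higher.
        return params_billion * bits_per_param / 8

    print(weight_memory_gb(7, 16))  # ~14 GB: a 7B model at fp16, beyond most office laptops
    print(weight_memory_gb(7, 4))   # ~3.5 GB: the same model 4-bit quantized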