Abspeckgeflüster – a forum for people with weight(ing)

Free. Ad-free. Human. Your weight-loss forum.


I tried to run an LLM locally.

Uncategorized · 3 posts · 2 commenters · 0 views
#1 · lea@lea.lgbt (last edited)

I tried to run an LLM locally. The thing is, my computer is a potato. It gets super hot and is super slooow. I still think it's the least harmful way to use AI, because everything stays local and the data isn't used for training. But it's like running Prime95 all the time. Maybe good in winter when you need heating. I found it quite eye-opening how much energy it takes to generate an answer.
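To put a rough number on that "eye-opening" energy cost, here is a back-of-envelope sketch in Python. The wattage, token rate, and answer length below are illustrative assumptions for an older CPU-only machine, not measurements:

```python
# Back-of-envelope energy estimate for one locally generated LLM answer.
# All figures are assumptions for illustration, not measurements.

def energy_per_answer_wh(power_watts, tokens_per_second, answer_tokens):
    """Energy in watt-hours to generate one answer at full load."""
    seconds = answer_tokens / tokens_per_second
    return power_watts * seconds / 3600.0  # W * s -> Wh

# Assumed: ~150 W at full load, ~3 tokens/s, a ~300-token answer.
wh = energy_per_answer_wh(power_watts=150, tokens_per_second=3, answer_tokens=300)
print(f"~{wh:.1f} Wh per answer")  # prints "~4.2 Wh per answer"
```

Under these assumptions a few hundred answers a day lands in the kWh-per-week range, which is indeed like leaving a stress test running.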

#2 · loewe@metalhead.club (last edited)

@lea same here, it just isn't feasible ... if it's about privacy, I found https://duck.ai or https://huggingface.co/chat/ helpful; afaik they also let you play with LLMs without your data being used for training

#3 · lea@lea.lgbt (last edited)

Some people replied with "then your hardware might not be efficient". Yes, that is exactly the point I was trying to make: if you want to add AI to your daily workflow but use it purely locally, so you can work offline and be sure your data isn't used for training, the hardware requirements jump from an office laptop (anything that runs the text editor of your choice) to an up-to-date high-end gaming PC.
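The jump in requirements is mostly about memory: the whole model has to fit in RAM or VRAM. A rough sizing sketch (the 7B parameter count and the bit widths are illustrative assumptions; real memory use is higher once you add the KV cache and activations):

```python
# Rough memory footprint of LLM weights at different quantization widths.
# Weights only; KV cache and activations add more on top.

def model_size_gb(n_params_billion, bits_per_weight):
    """Approximate memory needed for the weights alone, in GiB."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# An assumed 7B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: ~{model_size_gb(7, bits):.1f} GiB")
# prints ~13.0 GiB (16-bit), ~6.5 GiB (8-bit), ~3.3 GiB (4-bit)
```

This is why aggressive 4-bit quantization is what makes these models borderline runnable on ordinary machines at all, and why anything bigger pushes you toward gaming-class GPUs.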

        Copyright (c) 2025 abSpecktrum (@abspecklog@fedimonster.de)

Built with insomnia, coffee, broccoli & ♥

Legal notice | Privacy policy | Terms of use
